Learn bits
Science & Tech.
Mahesh

17/11/23 11:10 AM IST

As elections approach, how do we deal with audio deepfakes?

In News
  • In the arena of electoral politics, voice cloning can be put to dangerous use, spreading misinformation in a newly effective way.
  • Just clone the voice of any political leader, superimpose the audio onto an existing video clip, and share.
Deepfakes Audio
  • AI voice clones or deepfake audios refer to the use of artificial intelligence (AI) technology, particularly deep learning algorithms, to generate synthetic or manipulated voice recordings that mimic the voice of a specific individual.
  • The technology has advanced considerably in recent years and now it is possible to create highly realistic and convincing audio forgeries.
AI Cloning
  • Creating a clone of anyone's voice is very easy. All you need is a laptop with a good internet connection and an audio clip of the person whose voice you wish to clone.
  • Using the website covers.ai, one can simply upload an audio clip and select the desired voice; the audio will be ready within five minutes.
  • Anyone can also create their own voice clone on that website by paying only Rs 399.
  • They need to upload a good-quality audio clip of the voice that is at least three minutes long, and then wait for a week. The website will create their AI voice clone, which they can then use for a lifetime to create any song or audio.
  • There are other online tools like play.ht and Eleven Labs that can be used to create AI voice clones easily.
  • There are also several tutorials available on YouTube on making AI voice clones.
Regulation of Deepfake Videos
  • Earlier, audio deepfakes were fairly robotic and unrealistic, making them easy to detect. However, technology has progressed significantly since then.
  • With the help of advanced AI, deepfake videos and images are being increasingly created by taking advantage of content posted on public social media profiles.
  • One way to deal with deepfakes is for the authorities to crack down on social media platforms that host such content.
  • The Indian Ministry of Electronics and Information Technology (MeitY) sent an advisory to social media companies urging them to tackle deepfake content.
  • The government also warned social media intermediaries that failing to remove deepfake information from their platforms might result in penalties such as losing safe harbour rights, among other things.
  • Such stringent advisories from the government can help curb the exploitation of data to create deepfake content.
  • As a protective measure, digitally signed videos can be a way to verify that content can be trusted.
  • Much like how certificates are used to validate website security and email communications, the same could be used for validating digital media.
  • As technology evolves and deepfake production times shrink and quality vastly improves, a point may come where it’s impossible to distinguish a deepfake from real recorded content; therefore, validating content as true using a signing or verification process is needed.
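The signing-and-verification idea above can be sketched in a few lines of Python. This is a minimal illustration, not any real media-provenance standard: for simplicity it uses the standard library's `hmac` (a shared-secret code) as a stand-in for the public-key signatures a real system would use, and the key and media bytes are hypothetical placeholders.

```python
import hashlib
import hmac

# Hypothetical shared secret. A real system would use public-key
# signatures (e.g. Ed25519) so anyone can verify without a secret.
SECRET_KEY = b"publisher-signing-key"

def sign_media(media_bytes: bytes) -> str:
    """Produce a tag binding the publisher to this exact media file."""
    return hmac.new(SECRET_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify_media(media_bytes: bytes, tag: str) -> bool:
    """Check that the media has not been altered since it was signed."""
    expected = hmac.new(SECRET_KEY, media_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

# A clip (stand-in bytes) is signed when published...
original_clip = b"...raw video bytes..."
tag = sign_media(original_clip)

# ...later, a viewer verifies it. A deepfake edit changes the bytes,
# so verification fails.
print(verify_media(original_clip, tag))                    # True
print(verify_media(original_clip + b"deepfake edit", tag)) # False
```

Because any change to the bytes changes the tag, a superimposed audio track or spliced frame would break verification, which is the property the bullet above relies on.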
Identifying Deepfakes

  • Stay informed: Keep yourself updated about the latest political developments and the statements made by key political leaders so that you don’t fall into the trap of a widely shared audio or video clip of a politician making a controversial statement.
  • Verify before you share: Verify the source of the audio or video clip. If the source is not reliable, avoid sharing.
  • Use AI detection tools if possible: Some AI voice detectors are available online; unlike AI voice cloning tools, however, these are generally not free. Examples include aivoicedetector.com, and play.ht can also be used to detect AI voices.
Source: Indian Express
