
Cyber Talk: Growing concern around deepfakes

This tech can be used to spread misinformation, propaganda, blackmail people

By Telangana Today
Published Date - 31 January 2023, 12:45 AM

Deepfakes are a type of AI-based technology that allows for the creation of highly realistic digital forgeries, in which an individual’s face and/or voice is replaced with someone else’s.

This technology can be used to create convincing videos of real people doing or saying things they never did. Deepfakes are a growing concern because they can be used to spread misinformation and propaganda, and harass or blackmail individuals.


Social media, govt’s role

Social media users could use a face-generating AI system to hide their identity in other people’s photos. Some social media sites, such as Facebook and Instagram, already let users decide whether they are tagged in photos. Other areas of focus should be: (a) commercialising fact-checking as a service; (b) creating a regulatory framework; (c) creating structured fact-checking data through ClaimReview; and (d) collaborating regionally.

Social engineering crimes using Deepfakes

Deepfakes can be used as a tool in social engineering crimes, in which attackers use manipulation and deception to trick individuals into divulging sensitive information or performing certain actions.

–One example of how deepfakes can be used in social engineering is by creating a video of a high-level executive or official, such as a CEO or government official, asking for sensitive information or requesting a transfer of funds. This video can then be sent to an employee who may not be able to tell that the video is fake, and who may therefore comply with the request.

–Another example is creating a fake video of a friend or family member asking for money or other financial help.

–Deepfakes can also be used in impersonation attacks, in which an attacker creates a fake video or audio of an individual, such as a celebrity or public figure, and uses it to impersonate that person and gain access to sensitive information or resources.

Fact-checking deepfakes

There are several steps that can be taken to help verify the authenticity of a video:
–Check the source: Look for information about the video’s origin and try to verify that the source is reputable and credible.

–Check other sources: Look for other sources that report on the same video or story, and compare the information to see if it matches up.

–Look for inconsistencies: Check for any inconsistencies or errors in the video, such as unnatural movements or expressions, mismatched audio or inconsistencies in the lighting or shadows.

–Use specialised tools: Specialised deepfake-detection software can analyse a video and flag telltale signs of manipulation.

–Verify with the person: If the video features a specific person, try to reach out to that person and verify the authenticity of the video.

–Visit factly.in/ or www.boomlive.in/ or www.altnews.in/ or any other member site of the International Fact-Checking Network to validate suspected deepfakes.

Spotting deepfakes

–Look for unnatural movements or expressions: Deepfakes often struggle to perfectly replicate the subtle movements and expressions of real people. Look for unnatural or exaggerated movements in the video.

–Check the audio: Deepfakes may not perfectly match the audio with the video, so listen for any inconsistencies or errors in the audio.

–Check the lighting and shadows: Deepfakes may not accurately replicate the lighting and shadows in a scene; look for inconsistencies or errors in these elements.

–Check the background: In a deepfake video, the background may be out of focus, or not match the movement of the subject.

–Check the blink rate: Pay attention to whether the person blinks unnaturally rarely or far too often.
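The blink-rate check above can be sketched in code. Assuming per-frame eye-aspect-ratio (EAR) values have already been extracted by a facial-landmark detector (a hypothetical upstream step, not shown), a simple heuristic flags clips whose blink frequency falls outside the typical human range of roughly 15–20 blinks per minute; this is an illustrative sketch, not a production detector:

```python
# Sketch: flag videos whose blink rate looks non-human.
# Assumes a list of eye-aspect-ratio (EAR) values, one per frame,
# produced by an upstream facial-landmark detector (hypothetical).

def count_blinks(ear_values, threshold=0.2, min_frames=2):
    """Count blinks: runs of at least min_frames consecutive frames
    where the eye aspect ratio drops below the threshold (a closed eye)."""
    blinks = 0
    run = 0
    for ear in ear_values:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # blink still in progress at end of clip
        blinks += 1
    return blinks

def blink_rate_suspicious(ear_values, fps=30, low=8, high=30):
    """Return True if blinks-per-minute fall outside a loose human
    range (healthy adults blink roughly 15-20 times per minute)."""
    minutes = len(ear_values) / fps / 60
    if minutes == 0:
        return True
    rate = count_blinks(ear_values) / minutes
    return not (low <= rate <= high)
```

For example, a one-minute clip at 30 fps in which the subject never blinks (all EAR values around 0.3) would be flagged, while one with around 18 short blinks would pass. In practice this is only one weak signal, and would be combined with the audio, lighting and background checks listed above.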

Conclusion

Deepfake technology is constantly evolving, making deepfakes increasingly difficult to spot. It is therefore essential to remain sceptical and verify the authenticity of any video before accepting it as genuine.
