
Cyber Talk: Growing concern around deepfakes

This tech can be used to spread misinformation, propaganda, blackmail people

By Telangana Today
Published Date - 31 January 2023, 12:45 AM

Deepfakes are a type of AI-based technology that creates highly realistic digital forgeries in which an individual’s face or voice is replaced with someone else’s.

This technology can be used to create convincing videos of real people doing or saying things they never did. Deepfakes are a growing concern because they can be used to spread misinformation and propaganda, and harass or blackmail individuals.

Social media, govt’s role

Social media users could use a face-generating AI system to hide their identity in other people’s photos. A few social media sites, such as Facebook and Instagram, let users decide whether they are tagged in photos. Other areas of focus should be (a) commercialising fact-checking as a service, (b) creating a regulatory framework, (c) creating structured fact-checking data through ClaimReview and (d) collaborating regionally.

Social engineering crimes using deepfakes

Deepfakes can be used as a tool in social engineering crimes, in which attackers use manipulation and deception to trick individuals into divulging sensitive information or performing certain actions.

–One example of how deepfakes can be used in social engineering is by creating a video of a high-level executive or official, such as a CEO or government official, asking for sensitive information or requesting a transfer of funds. This video can then be sent to an employee or individual, who may not be able to tell that the video is fake, and may comply with the request.

–Another example is creating a fake video of a friend or family member asking for money or other financial help.

–Deepfakes can also be used in impersonation attacks, in which an attacker creates a fake video or audio of an individual, such as a celebrity or public figure, and uses it to impersonate that person and gain access to sensitive information or resources.

Fact-checking deepfakes

There are several steps that can be taken to help verify the authenticity of a video:

–Check the source: Look for information about the video’s origin and try to verify that the source is reputable and credible.

–Check other sources: Look for other sources that report on the same video or story, and compare the information to see if it matches up.

–Look for inconsistencies: Check for any inconsistencies or errors in the video, such as unnatural movements or expressions, mismatched audio or inconsistencies in the lighting or shadows.

–Use specialised tools: There are software tools that can analyse videos and detect signs of manipulation, such as deepfake detection models trained to flag forged footage.

–Verify with the person: If the video features a specific person, try to reach out to that person and verify the authenticity of the video.

–Log on to factly.in, www.boomlive.in, www.altnews.in or any other official International Fact-Checking Network member site to validate suspected deepfakes.
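The manual checks above can be organised into a simple triage checklist before escalating a video to a fact-checking site. The sketch below is purely illustrative — the check names, thresholds and verdicts are hypothetical examples, not a real detection tool:

```python
# Illustrative triage checklist for a suspicious video. Check names,
# thresholds and verdict wording are hypothetical, not a real tool.

CHECKS = {
    "reputable_source": "Is the original uploader reputable and traceable?",
    "corroborated": "Do independent outlets report the same footage?",
    "consistent_motion": "Are movements and expressions natural?",
    "audio_in_sync": "Does the audio match the lip movement?",
    "consistent_lighting": "Are lighting and shadows consistent?",
    "subject_confirmed": "Has the person shown confirmed the video?",
}

def triage(answers: dict) -> str:
    """Return a rough verdict from yes/no answers to the checklist."""
    failed = [name for name in CHECKS if not answers.get(name, False)]
    if not failed:
        return "likely authentic"
    if len(failed) <= 2:
        return "inconclusive - verify with a fact-checking site"
    return "suspicious - treat as a possible deepfake"

# Example: a video that fails most of the checks.
answers = {"reputable_source": False, "corroborated": False,
           "consistent_motion": False, "audio_in_sync": True,
           "consistent_lighting": False, "subject_confirmed": False}
print(triage(answers))  # suspicious - treat as a possible deepfake
```

The point of the sketch is that no single failed check proves forgery; it is the accumulation of failed checks that should push a viewer toward professional fact-checkers.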

Spotting deepfakes

–Look for unnatural movements or expressions: Deepfakes often struggle to perfectly replicate the subtle movements and expressions of real people. Look for unnatural or exaggerated movements in the video.

–Check the audio: Deepfakes may not perfectly match the audio with the video, so listen for any inconsistencies or errors in the audio.

–Check the lighting and shadows: Deepfakes may not accurately replicate the lighting and shadows in a scene; look for inconsistencies or errors in these elements.

–Check the background: In a deepfake video, the background may be out of focus, or not match the movement of the subject.

–Watch the eyes: Check whether the person blinks naturally; deepfake subjects often blink too little or too much.
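The blink cue above can even be checked semi-automatically. The minimal sketch below assumes some face-landmark tool has already produced a per-frame eye-openness signal (0.0 = closed, 1.0 = open); the blink-rate bounds and openness threshold are illustrative assumptions, not established constants:

```python
# Sketch of the blink-rate cue: count blinks in a per-frame eye-openness
# signal and flag clips whose blinks-per-minute falls outside a plausible
# human range. Thresholds and bounds are illustrative assumptions.

def count_blinks(openness, threshold=0.3):
    """Count open-to-closed transitions (one per blink)."""
    blinks, closed = 0, False
    for value in openness:
        if value < threshold and not closed:
            blinks += 1
            closed = True
        elif value >= threshold:
            closed = False
    return blinks

def blink_rate_suspicious(openness, fps, low=4.0, high=40.0):
    """Flag a clip whose blink rate (per minute) is implausibly low or high."""
    minutes = len(openness) / fps / 60.0
    if minutes <= 0:
        return True
    rate = count_blinks(openness) / minutes
    return not (low <= rate <= high)

# Example: a 10-second clip sampled at 1 fps with no blinks at all.
signal = [1.0] * 10
print(blink_rate_suspicious(signal, fps=1.0))  # True (0 blinks per minute)
```

In practice the openness signal would come from facial-landmark tracking, and a single abnormal clip is only a hint, not proof of forgery.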

Conclusion

Deepfake technology is constantly evolving, making deepfakes increasingly difficult to spot. Therefore, it is essential to remain sceptical and verify the authenticity of any video before accepting it as true.
