FBI Warns Deepfakes Could Be Used in Remote Job Interviews

If you’re interviewing someone online for a job, pay attention to whether the person’s words match their lip movements. According to the FBI, people have reported suspicious online interviews, believing they may be deepfakes.

The FBI’s Internet Crime Complaint Center (IC3) released a public notice on Tuesday, warning of “an increase in complaints reporting the use of deepfakes and stolen personally identifiable information (PII) to apply for a variety of remote and home-based jobs.”

Deepfakes are algorithmically generated videos or images that can be used to fake a person’s likeness, making it appear as if they are saying or doing something they have never done.

According to the FBI, the complaints concern remote positions in information technology and computer programming, as well as database and software-related job functions. What these positions have in common is that they provide access to customer information, financial data, corporate IT databases, and proprietary information.

The FBI has not shared statistics with the public, such as how many of these complaints actually involved deepfakes, how many applicants allegedly using them were successfully hired, or whether any information was compromised. It did report, however, that complainants claimed stolen personal information was used to create fake applicant identities and pass pre-employment background checks.

Many open-source software frameworks for creating deepfakes are available online, including DeepFaceLab, DeepFaceLive, and FaceSwap. Making deepfakes has become increasingly accessible to the general public since Motherboard reported on the first deepfakes created by casual consumers in 2019, raising the potential for misleading information to spread as truth.

This is not the first time deepfakes have been used for malicious or deceptive purposes. There have been several incidents in which deepfakes were used to create non-consensual pornography. A 2019 report entitled The State of Deepfakes found that “[a] key trend we identified is the prominence of non-consensual deepfake pornography, which accounts for 96% of all deepfake videos online.”

Deepfakes have also been used to commit fraud and influence political outcomes. According to security firm Symantec, senior financial controllers at several companies were tricked into sending millions of pounds to cybercriminals who used deepfake audio of their executives. In 2018, a Belgian political party created a deepfake video of Donald Trump calling on Belgians to withdraw from the Paris climate agreement. The video quickly went viral, angering many Belgians who did not realize it was fake.

These examples underscore the dangers of artificial intelligence, which can be used not only to harm people with misleading information but also to perpetuate existing discriminatory and biased systems, particularly ones that target women.

The FBI shared some of the signs that complainants said gave the deepfakes away, such as the interviewee’s actions and lip movements not matching the audio of their speech. Sounds like coughing and sneezing were also not aligned with what appeared on screen.
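To make that signal concrete, here is a minimal sketch of one way such a consistency check could be automated; this is not the FBI’s method, and the file names, the use of OpenCV’s stock Haar face detector, and the lower-third-of-face mouth proxy are all assumptions for illustration. It correlates mouth-region motion in the video with audio loudness over time.

```python
# Crude lip-sync consistency check (a heuristic sketch, not a detector).
# Assumes the audio track has been extracted first, e.g.:
#   ffmpeg -i interview.mp4 -ac 1 interview.wav
import cv2
import numpy as np
import librosa

VIDEO_PATH = "interview.mp4"  # hypothetical input files
AUDIO_PATH = "interview.wav"

# Per-frame mouth motion: mean absolute pixel change in the lower third
# of the detected face between consecutive frames.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(VIDEO_PATH)
fps = cap.get(cv2.CAP_PROP_FPS)
motion, prev = [], None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        prev = None
        continue
    x, y, w, h = faces[0]
    # Use the lower third of the face box as a rough mouth region.
    mouth = cv2.resize(gray[y + 2 * h // 3:y + h, x:x + w], (64, 32))
    mouth = mouth.astype(float)
    if prev is not None:
        motion.append(np.mean(np.abs(mouth - prev)))
    prev = mouth
cap.release()

# Per-frame audio loudness (RMS), sampled at the video frame rate.
samples, sr = librosa.load(AUDIO_PATH, sr=None, mono=True)
hop = int(sr / fps)
rms = librosa.feature.rms(y=samples, frame_length=2 * hop, hop_length=hop)[0]

# Correlate the two signals: when someone is speaking, louder audio
# should broadly coincide with more mouth movement.
n = min(len(motion), len(rms))
r = np.corrcoef(motion[:n], rms[:n])[0, 1]
print(f"motion/loudness correlation: {r:.2f}")
```

A near-zero or negative correlation proves nothing by itself, but it flags exactly the kind of audio-video mismatch the FBI’s complainants noticed.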

As it becomes easier and easier to create deepfakes, online resources have popped up to help people spot them. The Massachusetts Institute of Technology (MIT) has launched a research project and website called Detect Fakes that helps people spot deepfakes. In the project description, MIT lists eight questions people can use to determine whether a video is a deepfake. The questions are largely based on how the subject looks in the video, including “Does the skin look too smooth or too wrinkled?” and “Do shadows appear in the places you would expect?”

Chances are good that you would spot a deepfake in an interview: while creating a deepfake is easy, producing a flawless, believable one is hard. If you do spot one, the FBI asks that you report it to IC3.