Quick Hits

  • Some scammers are using AI to fake their voice and image during video interviews.
  • This trend raises the risk of poor performance by unqualified workers, cyberattacks, theft of sensitive data, and embezzlement.
  • Careful hiring strategies can help employers prevent these schemes.

The last thing employers want is to hire a person with a fake identity—whether that person’s goal is to obtain a job for which they are not qualified, steal data or money, or install spyware or ransomware on company devices. When the hiring process is rushed or inconsistent, companies can easily fall victim to this kind of scheme. In January 2025, the Federal Bureau of Investigation (FBI) warned employers about the growing threat from North Korean IT workers infiltrating U.S. companies to steal sensitive data and extort money.

Online job postings have made it easier for employers to reach a wide pool of candidates across the United States, but they have also created an environment where a single job posting can draw thousands of applications, making it more difficult for hiring managers to sort through the pool and find the best talent. The rise of remote work since 2020 has further complicated matters, as it can make it harder to detect that a new hire faked his or her voice or image during the interview process.

Risk Reduction Strategies

To reduce the risk of hiring someone with a fake identity, employers may wish to consider these strategies:

  • relying on in-person interviews whenever possible; otherwise, using live video with cameras and applying simple, neutral authenticity checks, such as turning the head, waving a hand, or reading a randomly selected sentence to detect overlay artifacts;
  • conducting multiple interview rounds with role-specific questions designed to elicit concrete details;
  • asking interview questions designed to elicit specific details about the applicant’s location and personal background (while, of course, avoiding questions prohibited by employment discrimination laws);
  • scrutinizing resumes and applications for typos, unusual terminology, and inconsistencies with public profiles;
  • verifying identity, work authorizations, education, and employment history through legally compliant methods, and making job offers contingent upon successful verification;
  • contacting and verifying the applicant’s professional references; and
  • training hiring managers to spot red flags in video interviews (e.g., lip-sync issues, abnormal lighting, or lagging inconsistent with audio).

Ironically, AI tools can also help employers spot fake job applicants; employers may want to use those tools cautiously, with vendor due diligence and human review of any flagged results.

Employers may want to ensure that any screening, background checks, and AI-assisted tools are used in compliance with applicable federal, state, and local laws. This includes “ban-the-box” rules on criminal history inquiries and timing; background check disclosures, authorizations, and pre-adverse/adverse action procedures; automated decision-making regulations; and biometric identifier rules. In addition, employers may wish to coordinate recruitment policies and practices with IT security and privacy professionals.

Ogletree Deakins will continue to monitor developments and will provide updates on the Background Checks, Cybersecurity and Privacy, Employee Engagement, and Technology blogs as new information becomes available.

Rebecca J. Bennett is a shareholder in Ogletree Deakins’ Cleveland office.

This article was co-authored by Leah J. Shepherd, who is a writer in Ogletree Deakins’ Washington, D.C., office.
