
Is that really your boss on the phone telling you to wire money to a supplier's account? Is that really your mom on the phone telling you she's stuck in Paris and needs money for a flight home? Is that really your client asking you to transfer money from one of her accounts to another? The answer could be no.

Criminals have successfully used AI technology to fake an executive's voice, then used the fake to direct a subordinate to send them money. Catherine Stupp reports:

Criminals used artificial intelligence-based software to impersonate a chief executive’s voice and demand a fraudulent transfer of €220,000 ($243,000) in March in what cybercrime experts described as an unusual case of artificial intelligence being used in hacking.

The CEO of a U.K.-based energy firm thought he was speaking on the phone with his boss, the chief executive of the firm’s German parent company, who asked him to send the funds to a Hungarian supplier. The caller said the request was urgent, directing the executive to pay within an hour, according to the company’s insurance firm, Euler Hermes Group SA.

Euler Hermes declined to name the victim companies.

Law enforcement authorities and AI experts have predicted that criminals would use AI to automate cyberattacks. Whoever was behind this incident appears to have used AI-based software to successfully mimic the German executive's voice by phone. The U.K. CEO recognized his boss's slight German accent and the melody of his voice on the phone, said Rüdiger Kirsch, a fraud expert at Euler Hermes, a subsidiary of Munich-based financial services company Allianz SE.

Several officials said the voice-spoofing attack in Europe is the first cybercrime they have heard of in which criminals clearly drew on AI. Euler Hermes, which covered the entire amount of the victim company’s claim, hasn’t dealt with other claims seeking to recover losses from crimes involving AI, according to Mr. Kirsch.

Scams using AI are a new challenge for companies, Mr. Kirsch said. Traditional cybersecurity tools designed to keep hackers off corporate networks can’t spot spoofed voices. Cybersecurity companies have recently developed products to detect so-called deepfake recordings.

Read more here.