Counter-terrorism in the UK: Terrorists could try to exploit artificial intelligence

Oct 19, 2023 | Studies & Reports

European Centre for Counterterrorism and Intelligence Studies, Germany & Netherlands – ECCI

Terrorists could try to exploit artificial intelligence, MI5 and FBI chiefs warn

theguardian – Artificial intelligence could be harnessed by terrorists or hostile states seeking to build bombs, spread propaganda or disrupt elections, according to the heads of MI5 and the FBI.

Ken McCallum, the director general of MI5, and Christopher Wray, the director of the FBI, said their organisations were monitoring developments and needed to cooperate with experts in the private sector to tackle emerging threats.

The MI5 chief said that while AI developers put in safeguards to prevent people using software to ask how to build a bomb, there was a risk that it was possible to “jailbreak” those controls.

“If you are experienced in security, you would be unwise to rely on these controls remaining impregnable,” McCallum said. “So there is clear risk that some of these systems can be used, put to uses that their makers do not intend.”

Wray said terror groups had sought to use “AI to circumvent safeguards built into some of the AI infrastructure” to “do searches for, you know, how to build a bomb … or ways to obfuscate their searches for how to build the bomb”.

It was not the only example of hostile actors seeking to use artificial intelligence, Wray added. “We’ve seen AI used to essentially amplify the distribution or dissemination of terrorist propaganda,” he said, using translation tools to make it “more coherent and more credible to potential supporters”.

Wray and McCallum were speaking on Tuesday at a Five Eyes intelligence summit with the heads of the domestic intelligence agencies of Australia, Canada and New Zealand at Stanford University, California.

McCallum said he believed that some of the security risks relating to artificial intelligence would be discussed at Rishi Sunak’s global AI summit at the beginning of November, and that the industry was sensitive to the topic.

“It’s one of those issues where no one has a monopoly of wisdom and trying to have a different form of public-private partnership and, crucially, international partnerships,” the MI5 chief added.

Both agency chiefs said they were monitoring for sophisticated, AI-generated efforts at political interference by hostile states such as Russia in the run-up to forthcoming elections in the US and UK respectively.

“The use of AI in a way that, if it’s sophisticated enough to create potential deepfakes, is something that adds a level of threat that we haven’t previously encountered,” Wray said. It was a threat “we’re on the lookout for”, he added, given “an existing strategy by hostile nations could become more dangerous”.

Faked images of Donald Trump being arrested were generated this year by Eliot Higgins, of the investigative journalism site Bellingcat, using AI photo generation software, illustrating the potential of the technology.

This month, faked audio purporting to be the Labour leader, Keir Starmer, bullying his staff was posted online as the party’s conference in Liverpool began. Though false, it was widely circulated, and experts said it may have been generated by AI software based on audio of his speeches.

McCallum said monitoring disinformation was not MI5’s main job, but it was alert to hostile states trying to manipulate British opinion, and indicated that analysts were maintaining a watching brief.

“So I wouldn’t want to make some sort of strong prediction that that will feature in the forthcoming election, but we would be not doing our jobs properly if we didn’t really think through the possibility,” the head of MI5 said.

The event in California earlier heard warnings that terror threats could rise as a result of the war in Israel and Gaza, and warnings about the scale of Chinese industrial espionage.

 
