Experts urge Ofcom to probe AI's role in fake news after major incidents
"Artificial intelligence (AI) software used to propagate fake news following significant events, often to generate income for social media users, should be scrutinised in an upcoming investigation into fraudulent advertising, experts have urged. Researchers at the Alan Turing Institute's Centre for Emerging Technology and Security discovered that AI played a role in driving some of the fake news disseminated online after the Southport murders, primarily for financial gain."
"They have recommended that Ofcom, the communications regulator, address this issue during its consultation on fraudulent advertising, scheduled for this summer. A report published on Wednesday revealed that Channel3Now, a website that initially published a false name for the suspect, was established using a service provider that "markets itself as using AI to generate content for users seeking passive income". The report also found that AI was employed to repackage articles, making them appear more credible."
Researchers at the Alan Turing Institute's Centre for Emerging Technology and Security found that AI played a role in generating and spreading fake news after the Southport murders. AI tools were often used to create content that could be monetised by social media users, and AI was also used to repackage articles to make them appear more credible. A site called Channel3Now initially published a false suspect name and was set up using a service that markets AI-generated content for passive income. Experts recommended that Ofcom address AI-enabled misinformation during its fraudulent advertising consultation this summer.
Read at www.independent.co.uk