
Misinformation Exploration: Defining It, Understanding Its Prevalence, and Potential Regulation Proposals

Disinformation experts Renee DiResta and Guillaume Chaslot discussed the spread and amplification of misinformation, as well as efforts to combat it, at the CADE Tech Policy Workshop.


In the digital age, the spread of misinformation and conspiracy theories on platforms like YouTube has become a significant concern, with major repercussions for society. Guillaume Chaslot, a former Google engineer who worked on YouTube's recommendation system, and Renee DiResta, the technical research manager at Stanford Internet Observatory, have been at the forefront of addressing this issue.

Chaslot's work has garnered attention from prestigious publications such as The Washington Post, The Guardian, and the Wall Street Journal. He now runs the non-profit AlgoTransparency, where he tracks YouTube's recommendation of conspiracy theories. Meanwhile, DiResta has advised Congress, the State Department, and other organizations on the topic of disinformation, having led one of the research teams that produced comprehensive assessments of influence operations targeting the U.S. for the Senate Select Committee on Intelligence.

One of DiResta's recent talks, titled "(Dis)Information & Regulation", was delivered at the University of San Francisco Center for Applied Data Ethics Tech Policy Workshop. In her address, DiResta proposed several regulatory approaches to address disinformation campaigns, with a focus on balancing privacy, security, and free expression.

Among the key proposals is the establishment of a UN-backed AI Influence Observatory, which would combine early-warning detection with public certification of disinformation incidents. This body would gather anonymized data from platforms and model providers, merged with crowdsourced civil-society reports to rank incidents by confidence and potential civic harm. Its responsibilities would include maintaining an open database for research and public transparency.
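To make the triage step concrete, here is a minimal sketch of how such an observatory might rank incoming reports. The `Incident` fields and the confidence-times-harm scoring rule are illustrative assumptions, not part of DiResta's proposal.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    source: str        # hypothetical field: platform report or civil-society tip
    confidence: float  # 0..1, how certain the attribution is
    civic_harm: float  # 0..1, estimated harm to civic processes

def triage(incidents: list[Incident]) -> list[Incident]:
    """Rank incidents for review: high-confidence, high-harm cases first."""
    return sorted(incidents, key=lambda i: i.confidence * i.civic_harm, reverse=True)

reports = [
    Incident("crowdsourced tip", confidence=0.4, civic_harm=0.9),
    Incident("platform report", confidence=0.9, civic_harm=0.8),
    Incident("crowdsourced tip", confidence=0.2, civic_harm=0.1),
]
ranked = triage(reports)
```

A real system would of course weight these signals with far more nuance, but the basic pattern of merging heterogeneous reports and sorting by a composite score is what the proposal describes.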

Another proposal emphasizes the need for stronger provenance and verified identity without chilling free speech. DiResta advocates for the promotion of technologies like passkeys, cryptographic attestations, and federated reputation standards to harden identity verification systems against spoofing. However, she acknowledges that "proof-of-human" systems alone are insufficient due to privacy concerns, the lack of formal IDs for many, and risks of account hijacking.
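As a rough illustration of the provenance idea, content can be bound to an issuer through a signed manifest. The sketch below uses Python's standard-library `hmac` as a stand-in for a real asymmetric signature scheme (the attestations DiResta mentions would use public-key cryptography); the key, issuer name, and manifest fields are all hypothetical.

```python
import hashlib
import hmac
import json

SECRET_KEY = b"demo-issuer-key"  # stand-in for an issuer's private key (hypothetical)

def attest(content: bytes, issuer: str) -> dict:
    """Create a provenance manifest binding content to an issuer."""
    manifest = {"issuer": issuer, "sha256": hashlib.sha256(content).hexdigest()}
    payload = json.dumps(manifest, sort_keys=True).encode()  # canonical serialization
    manifest["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify(content: bytes, manifest: dict) -> bool:
    """Check that the content matches the manifest and the signature is valid."""
    claimed = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (hashlib.sha256(content).hexdigest() == claimed["sha256"]
            and hmac.compare_digest(expected, manifest["signature"]))

m = attest(b"original video bytes", "verified-uploader")
assert verify(b"original video bytes", m)        # authentic content passes
assert not verify(b"tampered video bytes", m)    # altered content fails
```

The point of the sketch is the shape of the guarantee: anyone holding the verification key can confirm both who vouched for the content and that it has not been altered, without the platform needing to inspect the content itself.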

Regulation should also focus on the commercial side of the disinformation ecosystem, where private sellers offer coordinated influence operations and fake social metrics at low cost. By addressing this supply side, DiResta believes that large-scale abuse can be curbed effectively without imposing blanket restrictions on free expression.

These approaches aim to create oversight that is technologically informed, globally coordinated, and sensitive to the privacy-security-free expression balance. They emphasize transparency, accountability, innovation in identity technologies, and addressing the supply side of disinformation markets rather than just content moderation alone.

In her talk, DiResta advocates for a multi-stakeholder, global, technically savvy regulatory framework that integrates democratic legitimacy and prioritizes protecting free speech and privacy while mitigating the harms of influence operations and disinformation campaigns.

YouTube's recommendation system has been criticized for promoting conspiracy theories and, in extreme cases, even content that facilitated pedophile networks. Major concerns were raised in France about the platform promoting such content in 2016 and 2017, yet YouTube failed to take action until 2019. The system is driven by engagement metrics, the low cost of experimentation, and the potential rewards, which propagandists can exploit to game recommendations.

The situation is even worse for languages other than English, in which tech platforms tend to invest far fewer moderation resources. Chaslot and DiResta's work serves as a call to action for policymakers, tech companies, and civil society to address this pressing issue and ensure a safer, more informed digital environment for all.

Links to watch Chaslot's and DiResta's talks, as well as the videos from the University of San Francisco Center for Applied Data Ethics Tech Policy Workshop, are provided below for those interested in learning more.

[1] University of San Francisco Center for Applied Data Ethics, "(Dis)Information & Regulation" and "The Toxic Potential of YouTube's Feedback Loop" videos, retrieved from [URL]
[2] Guillaume Chaslot's talk, retrieved from [URL]
[3] Renee DiResta's talk, retrieved from [URL]

  1. The work of Guillaume Chaslot, a former Google engineer, has been featured in notable publications like The Washington Post, The Guardian, and the Wall Street Journal, with his focus on tracking YouTube's recommendation of conspiracy theories.
  2. Renee DiResta, the technical research manager at Stanford Internet Observatory, has advised Congress, the State Department, and other organizations on disinformation, having led research teams that produced assessments for the Senate Select Committee on Intelligence.
  3. DiResta delivered a talk titled "(Dis)Information & Regulation" at the University of San Francisco Center for Applied Data Ethics Tech Policy Workshop, proposing regulatory approaches to address disinformation campaigns.
  4. One of DiResta's proposals is the establishment of a UN-backed AI Influence Observatory, which combines early-warning detection with public certification of disinformation incidents, gathering data from platforms and model providers along with crowdsourced reports to rank incidents.
  5. Another proposal emphasizes the need for stronger provenance and verified identity without chilling free speech, advocating for technologies like passkeys, cryptographic attestations, and federated reputation standards.
  6. Regulation should also address the commercial side of the disinformation ecosystem, where private sellers offer coordinated influence operations and fake social metrics at low cost.
  7. DiResta advocates for a multi-stakeholder, global, technically savvy regulatory framework that integrates democratic legitimacy, prioritizes protecting free speech and privacy, and mitigates the harms of influence operations and disinformation campaigns.
