This Week in Misinformation: Synthetic Media in Politics, Violence in Court, Tired Rehashes in Congress
9 November 2023
Keeping up on misinformation is basically the best thing you can do for your brain. So glad you’re here!
Give this newsletter a share, won’t you? Posting a snippet to social media can give people in your network a good taste of it, and forwarding this email directly also tends to work well. 🙂
Reliability scores for the media outlets cited in the summary appear in parentheses, courtesy of the terrific folks at Ad Fontes Media.
Now, on to our top stories.
Synthetic media, meet our feeble attempts to push back on thee.
A flood of reporting (on the fake nudes ruining teenagers' lives, on the Chinese influencers using deepfakes of themselves to livestream 24/7, on the growing prevalence of chatbot hallucinations, and on hackers' use of AI for phishing attacks, for example) bodes ill for the impacts AI is having on our society.
More Americans are coming to realize that AI is going to add to election misinformation in this cycle (Associated Press, 46.23). So far, the best we’ve been able to come up with is platforms putting some labeling in place (The Verge, 39.27) and restricting political advertisers from using GenAI ad tools (Reuters, 46.24).
Yes, there are also efforts, by Google and others, to aid in detection of deepfakes. But this is getting harder by the day (Axios, 43.70), even for machines trained to do it (New York Times, 41.94).
Fallout from January 6th, including its legacy of political violence, rolleth on.
The Justice Department, pursuing its case against Donald Trump for defrauding the United States, has asserted that the former president’s belief that the 2020 election was stolen is immaterial to the question of his guilt (Washington Post, 37.97). Liar beware: pretending that you truly believe a lie is not a get-out-of-jail-free card.
In the Georgia case, meanwhile, a spokesperson says Trump is ‘confused’ as to why several of his former attorneys/co-defendants have pleaded guilty. Could it have something to do with their going broke and suffering professional disgrace from associating with him?
In a January 6th foot soldier case, U.S. Marshals had to clear a courtroom and rush the judge to safety after the accused started a brawl when security tried to handcuff him (Insider, 41.29). It’s reached the judiciary, folks.
Republicans in Congress are still running with ye olde Twitter Files narrative of Trump’s administration colluding with Big Tech to prevent Trump from winning reelection. (Really.)
I can’t in good conscience recommend you go read the interim findings put out by the House GOP, but even the Fox News (36.26) headline gives away how flimsy the allegations are likely to be: ‘Secret reports’ reveal how government worked to ‘censor Americans’ prior to 2020 election, Jim Jordan says.
As a reminder that no one is immune to misinformation, including Jim Jordan, just know that nearly one-fifth of Congress has been fooled by misinformation about Hamas and/or crypto (Forbes, 40.38).
At least we’ll always have grab bag: Halloween candy has never been laced with drugs, so maybe calm down already; there’s going to be a new unit at CBS to do the AI, deepfakes, and misinformation beat (welcome!); Uncle Elon says if you do a misinformation you cannot be paid… but he has also made it basically impossible to do disinformation research on Twitter, and furthermore he’s probably too late to salvage the business at this point; Twitter and Meta are still profiting from congressional candidates promoting QAnon; the guy who attacked Paul Pelosi with a hammer goes on trial; belief in health misinformation has the natural consequence of lowering confidence in (life-saving) vaccines; the mere likelihood of fakes and out-of-context videos about the war is casting doubt on real footage from Israel and Gaza; and the U.S. says Russia is funding a Latin America-wide drive to spread anti-Ukraine disinformation.
All that, and a lot more, below. This is This Week in Misinformation.
-- Kevin