Attacked by conservatives, UW misinformation researcher gears up for 2024
Dec 11, 2023 3:26:03 GMT -5
Post by Deleted on Dec 11, 2023 3:26:03 GMT -5
hope it's okay to post this considering the subject matter. mostly did since it's all about Kate Starbird.
www.seattletimes.com/seattle-news/politics/attacked-by-conservatives-uw-misinformation-researcher-gears-up-for-2024/
by Nina Shapiro, Seattle Times staff reporter
* * *
In a small meeting room on the University of Washington campus, researcher Kate Starbird stood at a glass whiteboard as she brainstormed with two doctoral students how to dissect information posted to the politicized, hyperbolic and often wildly speculative free-for-all that is social media.
To create research papers that show how misinformation spreads, they took an academic, even nerdy approach. They talked about devising coding systems for posts that represent “collective sense-making” and “deep stories” (narratives that feel true, even if they’re not).
Starbird, a former professional basketball player with an even-keeled demeanor, showed little hint she’s at the center of the kind of internet swirl she studies — or that her attempt to promote factual information and strengthen democracy has gotten her sued, blasted by congressional inquiries and subjected to a death threat.
Powerful conservative figures, some of whom she helped identify as rampant spreaders of misinformation, now accuse her and colleagues at UW and elsewhere of participating in a “mass-surveillance and mass-censorship program,” in the words of a federal Louisiana suit that names Starbird as a defendant.
The bruising attack centers on a claim that the academics labeled views they disagreed with as misinformation and worked with social media companies and the government to suppress conservative voices.
Starbird, co-founder and director at the UW’s Center for an Informed Public, vigorously denies the charge, as do her colleagues. The university says it stands behind the work despite “significant” legal costs.
Despite it all, the center is gearing up for the 2024 presidential election.
With former President Donald Trump running, and many of his supporters still believing false conspiracy theories about his 2020 defeat, the election promises to be another test of the country’s faith in democracy and adherence to facts.
“We understand that we’re operating in adversarial terrain,” Starbird, 48, said during an interview in her office. “And so that makes it challenging. It means we’ll be slower. We won’t be able to take on quite as much.”
The social media platform X has compounded the problem by pulling back on data formerly offered to researchers for free. Starbird and her colleagues are trying to figure out how they can operate in this new environment.
Digital volunteerism to digital vigilantism
Starbird launched her academic career, ironically, by looking at social media’s upside.
She had a Stanford undergraduate computer science degree under her belt, as well as several seasons playing for the Seattle Storm and other professional basketball teams in the U.S. and abroad. For a doctoral dissertation at the University of Colorado, she looked at “digital volunteerism,” or the ways people use online tools to help each other during crises.
Landing a job in the UW’s Human Centered Design and Engineering department in 2012, she began to notice “rumors and misinformation were a larger and larger part of the social media discourse in the aftermath of disasters.”
Starbird worked on a case study of the 2013 Boston Marathon bombing, when self-appointed online sleuths fingered the wrong culprit. “Digital volunteerism sort of turned into digital vigilantism,” she said.
She and research collaborators also picked up on a conspiracy theory that Navy SEALs were responsible for the terrorist attack, rather than two brothers found to be the real perpetrators.
As Starbird researched other case studies, she saw internet conspiracy theories growing in number and reach. Some believed the online community would self-correct, but Starbird found “the rumors spread much further than the corrections ever did.”
Then, researchers and law enforcement detected bad actors and foreign governments deliberately spreading falsehoods, taking misinformation to a new level. That disinformation, as it’s been coined, raised concerns about the integrity of elections, especially in the wake of an alleged Russian online campaign to sow discord and spread falsehoods in 2016.
Three years later, the John S. and James L. Knight Foundation awarded the UW $5 million to create the Center for an Informed Public, bringing together faculty and students from the fields of computer science, sociology and law.
The center, now comprising about 40 people, has researched the way false, unverified and misleading information plays out in a variety of arenas. Its analyses of elections and the pandemic, in particular, have unleashed conservative furor. (The center also partners with The Seattle Times on voter polls and contributes monthly columns that appear on its Opinion pages.)
One subject of enormous scrutiny is the center’s role in a collaboration of researchers and analysts called the Election Integrity Partnership.
As the 2020 presidential election approached, Stanford University’s Internet Observatory pulled together the collaboration to rapidly identify and dispel election-related misinformation, ideally before it went viral.
The researchers from Stanford, the UW and two other groups logged suspect social media posts and shared their findings through their own posts and media briefings. They also alerted social media platforms and concerned parties like local election officials.
The researchers identified some false narratives on the left. “Spreading among Democrats were these claims that the U.S. Postal Service was intentionally sabotaging mail-in votes,” Starbird said.
But critics, like Jonathan Turley, a George Washington University law professor, have called the collaboration a partisan effort, noting it called out far more posts from the right than the left.
Starbird said that just reflects reality: “A lot more content that was false or misleading was spreading among supporters of Donald Trump — in part because he was repeatedly saying the election was going to be rigged. And so a lot of his supporters were going to the polls and misinterpreting what they were seeing.”
For example, some voters worried their ballots would be invalidated because of bleeding ink from Sharpie pens given out at the polls. In fact, Starbird said, Sharpies helped the voting process because their fast-drying ink doesn’t smear when going through counting machines.
The UW center analyzed how such rumors spread, sometimes through graphed timelines showing inflection points among tweets as heavy-hitting influencers weighed in. Those included Trump, his sons Donald Jr. and Eric, and conservative media outlets, including a site called The Gateway Pundit, according to the collaboration’s final report.
By the time the report came out in 2021, Trump supporters had stormed the Capitol, leading the authors to draw a straight line between the misinformation they had charted and the “big lie” of a stolen election that led to the Jan. 6 insurrection.
A wave of negative attention
Starbird started seeing herself pop up in negative social media commentary around the summer of 2022. Some called her a communist or criticized her appearance.
“I felt like I was back in junior high,” Starbird said.
Then, Trump posted a video on his platform, Truth Social, calling for the breakup of “the entire toxic censorship industry that has arisen under the false guise of tackling so-called mis- and disinformation.” He continued: “The federal government should immediately stop funding all nonprofits and academic programs that support this authoritarian project.”
That’s when Starbird realized high-placed figures were making her and her colleagues out to be villains.
A torrent of negative attention followed. She received threatening notes and, through the center, dozens of public record requests seemingly looking for evidence of censorship.
Jim Hoft, owner of the Gateway Pundit, filed the Louisiana suit this past May, citing the Election Integrity Partnership report’s characterization of him as a spreader of misinformation.
Meanwhile, a U.S. House subcommittee chaired by Republican Rep. Jim Jordan of Ohio, an ardent Trump supporter who appeared at a “Stop the Steal” rally, started looking into alleged censorship. Starbird said she spent more than four hours last summer voluntarily testifying before Jordan’s “weaponization of the federal government” subcommittee, and seven hours before a different subcommittee.
She’s forbidden from discussing what was said. But she plays a prominent role in two reports from Jordan’s panel. That’s largely because of her volunteer role on an advisory committee of a federal agency dedicated in part to protecting election infrastructure.
One panel report described the agency as a censorship “nerve center.” Right or wrong, Starbird points out she had no hands-on involvement and served after the fraught 2020 election.
The second report looked at the Election Integrity Partnership, saying the collaboration helped pressure social media companies to remove, critically label or downgrade the visibility of posts from conservatives, even if they contained true information or jokes. One example: A tweet from former Arkansas Gov. Mike Huckabee saying he completed mail-in ballots for his deceased parents and grandparents. “They vote just like me!”
The report called this tweet, which played into debunked conspiracy theories of widespread voter fraud, a “quip.”
Starbird said her center had little to do with identifying posts containing misinformation in 2020. That was mostly handled by others in the collaboration, who followed election-related social media and analyzed tips from local election officials and civic groups like the NAACP. They then sought to verify the claims by looking at information from media, election officials and fact-checking organizations.
They also flagged roughly 3,000 posts to what was then known as Twitter for ostensibly violating the platform’s policies against disseminating false information and undermining elections. Starbird said UW researchers, to her knowledge, didn’t directly communicate with social media companies.
The companies took action on 35% of the flagged posts; 13% were removed, 21% given a warning label, and 1% hidden unless viewers bypassed a warning.
This part of the operation is core to the censorship allegation. But is there anything wrong with encouraging platforms to crack down on allegedly false information?
Yes, argued Turley of George Washington, saying academics “helped target and in some ways blacklist” conservatives.
“We are a free people capable of evaluating information and ideas for ourselves to discern fact from fiction, and separate good ideas from bad,” Turley wrote in a statement to Congress.
But Rebekah Tromble, director of the Institute for Data, Democracy & Politics at the same university, said the damage wrought by misinformation can be profound, in some cases inciting people to violence. What’s more, she said, if researchers were told they couldn’t make requests to platforms, “that would actually be in violation of their own First Amendment rights.”
Further, she said, in conservatives’ voluminous reports and litigation she’s seen no evidence that social media platforms felt coerced — a key part of censorship.
Also, the U.S. Constitution specifically prohibits government censorship, said Alex Abdo, litigation director at Columbia University’s Knight First Amendment Institute.
Whether that happened during the 2020 election and the pandemic is the subject of another sprawling lawsuit, filed against the Biden administration by the Republican attorneys general of Missouri and Louisiana.
Even more fundamental questions: How do you define misinformation — or “fake news” in Trump’s parlance — and who decides what qualifies?
Jevin West, a co-founder of the Center for an Informed Public, said he discusses that with students all the time. “This is hard,” he said.
Some statements you can prove true or false, such as: It rained today in Seattle. Far squishier are statements that are true but misleading. West’s hypothetical: It only rained in Seattle five days out of the last couple of months, implying the city never gets much rain.
Is that misinformation? “We might need different language for it,” he said, becoming more convinced as he talked it through.
Starbird said she’s been moving toward using the word “rumor” more, explaining that rumors can have true and false aspects. Sharpie pens really were bleeding through ballots in 2020, for instance, but that wasn’t part of a conspiracy to disenfranchise conservative voters.
She’s also come to realize “slapping the word misinformation” on voters’ genuine concerns is “just not a helpful way to engage with them.”
The problem, she added, is that whatever term she and others settle on will likely turn toxic in a matter of months.
“Radical transparency” for 2024
By the 2022 midterms, the Election Integrity Partnership was already changing the way it operated. The UW center, then working just with Stanford’s Observatory, had gotten a grant empowering it to take on a bigger role in shaping the vision, Starbird said.
Researchers relied on their own analysis to identify misinformation, no longer taking tips from outside groups. They also stopped flagging content to platforms, with a few exceptions, concentrating instead on conveying their findings directly to the public.
Some things are still up in the air about 2024, including which groups will participate and the roles they will play, but Starbird said she and her colleagues intend again to communicate their work publicly.
“Radical transparency,” she called it, and “the right move now under these conditions.”
Starbird and her colleagues are also thinking about how to adapt to the loss of X social media data once freely available to them. Called API data, it allowed researchers to find all posts related to certain words — like “ballots,” for instance — and easily gather detailed information, such as who posted them, when, and how many followers they have.
Even with limited data, though, researchers will have one advantage in 2024: a huge store of posts and analyses collected from past elections. If a new conspiracy theory comes up, Starbird said they can point to similar theories in the past that were unsubstantiated or proven false.
Using information gleaned from fact-checking organizations or election officials, they might say: “This is what we know to be true. This is what we know to be false. This is what we know to be unsubstantiated. And this is what we know about similar rumors like this in the past.”