Confronting Disinformation Spreaders on Twitter Only Makes It Worse, MIT Scientists Say (vice.com) 79
According to a new study conducted by researchers at MIT, being corrected online just makes the original posters more toxic and obnoxious. From a report: Basically, the new thinking is that correcting fake news, disinformation, and horrible tweets at all is bad and makes everything worse. This is a "perverse downstream consequence of debunking," echoing the title of MIT research published in the '2021 CHI Conference on Human Factors in Computing Systems.' The core takeaway is that "being corrected by another user for posting false political news increases subsequent sharing of low quality, partisan, and toxic content."
The MIT researchers' work is actually a continuation of their study into the effects of social media. This recent experiment started because the team had previously discovered something interesting about how people behave online. "In a recent paper published in Nature, we found that a simple accuracy nudge -- asking people to judge the accuracy of a random headline -- improved the quality of the news they shared afterward (by shifting their attention towards the concept of accuracy)," David Rand, an MIT researcher and co-author of the paper, told Motherboard in an email. "In the current study, we wanted to see whether a similar effect would happen if people who shared false news were directly corrected," he said. "Direct correction could be an even more powerful accuracy prime -- or, it could backfire by making people feel defensive or focusing their attention on social factors (e.g., embarrassment) rather than accuracy."
Yes, we know (Score:5, Informative)
It's well known and well covered. It's called the Backfire Effect [jstor.org]. In short, when a person is presented with facts which contradict their beliefs, the person will dig in their heels and disregard the facts.
Clearly this will apply to the bubbling intellectuals on Twitter. And Facebook. And Gab. And Parler.
Re:Yes, we know (Score:5, Insightful)
The purpose of confronting disinformation spreaders is not to convert the spreader, but to persuade the audience.
That the spreader becomes angry and obnoxious is a good thing because it reduces their credibility.
Re: (Score:3)
The audience are the same as the spreader and will all dig in together.
If everyone already agrees with the liar, then you are in the wrong forum.
There are many forums, including Slashdot, where there is no consensus on any issue.
See Jan 6 insurrection
Really? You are using a partisan riot as an example of a typical debate?
Re: (Score:2, Interesting)
They all believed the lie the election was stolen. No matter how many people told them the facts.
They don't care about the facts. Telling them the facts makes them believe the lies more strongly.
It's a typical example. Anything partisan would be the same. Anti-vax, climate change, flat earth, moon landing etc. How can you not know this?
Re: (Score:2, Interesting)
It's a typical example.
No, it is an extreme outlier, the exact opposite of a typical example.
How can you not know this?
You are taking the most extreme 0.01%, declaring them to be "typical", and then wondering why everyone doesn't see your absurd generalization as a "fact".
Re: (Score:2, Insightful)
No, they are extremely common - so common they're meme-worthy...
How on Earth do you think otherwise...
Re: (Score:1)
> It's called the Backfire Effect [jstor.org]. In short, when a person is presented with facts which contradict their beliefs, the person will dig in their heels and disregard the facts.
So if I say the NORMAL annual turnover of people is ~0.86%, and it increased by ~0.14% in 2020 - this only causes people to dig in more and claim 0.14% is a huge number of extra deaths?
My anecdotal evidence seems to validate your theory.
Re: (Score:2)
If the former, that's in fact a fucking huge increase (16.2%)
Being the latter is *highly* unlikely, I'm betting it's the former, and you just suck at math. I would love to be corrected, though.
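A minimal check of the arithmetic, assuming "the former" means the ~0.14% is an absolute, percentage-point increase on top of the ~0.86% baseline quoted above:

\[
\frac{0.86\% + 0.14\%}{0.86\%} - 1 = \frac{0.14}{0.86} \approx 0.163 \approx 16.3\%
\]

i.e., roughly the 16.2% relative increase cited, with the small gap coming from rounding in the ~0.86% and ~0.14% inputs.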
Re: (Score:2)
Re: (Score:1)
> If the former, that's in fact a fucking huge increase (16.2%)
If your flaccid dick is 2 inches long AND with an erection it is 2.5 inches... I assure you, no one thinks you have a big dick.
Small changes always look exaggerated when applied to small numbers.
Re: (Score:2)
If your flaccid dick is 2 inches long AND with an erection it is 2.5 inches... I assure you, no one thinks you have a big dick.
Sure, that argument definitely holds water.
However, the discussion was whether or not it represented a marked increase.
16.2% is a marked increase. It's a huge increase.
Is it still a large number, in absolute terms? Of course not.
This isn't the Black Plague - it isn't wiping out a large portion of human civilization as we know it - but it still represents a huge increase over standard mortality.
Re: Yes, we know (Score:1)
And, as shown by the continued defence of Rebekah Jones on this website, Slashdot.
It is worse than that (Score:5, Insightful)
...when a person is presented with facts which contradict their beliefs, the person will dig in their heels and disregard the facts.
Arguing with people on social media doesn't only have that backfire effect, it drives up engagement metrics, which means Twitter is more likely to present it to other users!!!
Twitter wins when a bunch of people get angry and pile on something, so of course the algorithm's motivation is to show things that will make them angry.
Re: (Score:2)
Re: (Score:2)
> Arguing with people on social media doesn't only have that backfire effect, it drives up engagement metrics, which means Twitter is more likely to present it to other users!!!
No it doesn't. /s
Re: (Score:3, Insightful)
It's well known and well covered. It's called the Backfire Effect [jstor.org]. In short, when a person is presented with facts which contradict their beliefs, the person will dig in their heels and disregard the facts.
Clearly this will apply to the bubbling intellectuals on Twitter. And Facebook. And Gab. And Parler.
On the other hand, "Physics moves forward one funeral at a time". So it's not just non-scientists who are prone to being stuck in their ways and not responding well to new evidence. Part of this is the overreach of some people/scientists who believe they can predict things they just can't predict (often because they involve things they are not familiar with). Every physicist I have ever heard talk about energy production is informed and an expert, and about 90% of the time they are wrong. Why? Because there are
Re:Yes, we know (Score:5, Insightful)
Squelching debates (which is what this misinformation prevention is about) is a terrible, terrible idea and that should be very apparent to thinking people these days.
No.
One, some viewpoints are not worthy of serious debate by serious people. Do we need to "look at both sides" of the "debate" on whether the Holocaust happened? Do we take the views of people like David Irving [wikipedia.org] seriously? Absolutely not. Those people are best ridiculed, ignored for their own good and, in some particularly rare and egregious cases [wikipedia.org], criminally prosecuted.
Two, we're not talking about serious people trying to engage in a serious debate, but rather about deliberate, harmful disinformation promulgated by a small number of people acting in bad faith and with intent to cause harm [slashdot.org].
Re: (Score:3)
Very much this. There are, of course, areas where science is not well verified or results are tentative, but there are vast areas where it is well verified and extraordinary proof is required to even start a debate. Anti-vaxxers, flat-earthers, climate-change deniers, religious people, etc. all do not have a leg to stand on. They live in their personal fantasy and their views do _not_ merit consideration.
Re: (Score:2)
Three, social media, the outrage media, the entertainment media, and even the mainstream media are all making money off of this. They have zero reason to suppress it.
Four, evil but clever people have figured out that they can use these faulty viewpoints as a purity test for belonging to their group, effectively trapping people in believing or being outcast.
We can't squelch dishonest debate when there's money to be made from it, and it's tied to a sense of belonging. We'd need a societal shift where people f
Re:Yes, we know (Score:5, Insightful)
Squelching debates (which is what this misinformation prevention is about)
No, it is absolutely not about "squelching debates". A debate requires some ability and willingness to listen to arguments, reason about them and consider the other point of view. That's not what QAnon followers do at all. Disinformation isn't opposing information. It's literally the opposite of information. It strives to cloud the picture, to drown debate in nonsense. If you think you need to be allowed to convince others that Covid is in any way related to 5G and that the vaccines exist to genetically manipulate people or control them through computer chips, and that you have "evidence" of that, then you are the kind of blithering idiot who must be opposed not to convince you, but to warn others of your stupidity.
Re: Yes, we know (Score:1)
Re: (Score:2)
accountability.
There was a time when that was relevant.
But one need only consider the Spanish-American War to see what the yellow press can accomplish.
Re: (Score:2)
This is more of a case of the "epic rebuke" effect. The way Twitter works is that an epic rebuke, like for example this one: https://twitter.com/endek_pl/s... [twitter.com], ends up being reposted, promoted, and shared on other media, and so on.
As a result the parent post gets promoted too. Law of unintended consequences.
Re: (Score:2, Flamebait)
It's well known and well covered. It's called the Backfire Effect [jstor.org]. In short, when a person is presented with facts which contradict their beliefs, the person will dig in their heels and disregard the facts.
Clearly this will apply to the bubbling intellectuals on Twitter. And Facebook. And Gab. And Parler.
Without it, religion, flat-earthers, anti-vaxxers, climate-change deniers, trumpists, etc. would all not exist.
Re: (Score:1)
Pretty Much (Score:5, Insightful)
Re: (Score:1)
Re: (Score:2)
>People's behavior is rewarded for things that would make you a pariah in regular and common social interactions.
there's a reason they're called "twits" . . .
hawk
No it doesn't! (Score:5, Funny)
When people spread lies online, the absolute best and most effective thing to do is post responses in which you definitively prove them wrong and insult them a lot. The more profanity the better.
This always shuts them right up.
We have plenty of evidence right here on Slashdot to prove it.
Re: (Score:1)
Re: (Score:2, Flamebait)
I definitely don't feel like I am in a hall of mirrors right now.
Re:No it doesn't! (Score:4, Interesting)
Besides, what's the alternative? Outright censorship of anything that gets deemed to be false, the inability to respond to anyone and engage in discourse lest you disagree with them, banning people from posting online entirely?
Re:No it doesn't! (Score:4, Funny)
Don't feed the bears. (Score:4, Interesting)
" Come see the violence inherent in the system! Help! Help! Im being repressed! Did you see that? Did you see em repressin me? Thats what Im on about. "
Even monty python understood this over 4 decades ago!
Re: (Score:2)
"Dont feed the bears. "
So we need to shoot people on Twitter. Got it.
Re: Don't feed the bears. (Score:2)
And yet (Score:2, Insightful)
I've been saying this for years: Trump, QAnon, et al are not here to convince you. They are here to make noise. Their goal is to make you respond. Like a bully.
The best strategy is to ignore their bullshit, and use the appropriate legal mechanisms to report all behavior that is reportable.
Re: (Score:1, Insightful)
Re: (Score:2)
Re: (Score:1)
Obama, Biden, AOC, Kamala, Pelosi
People complain about Congress not doing anything.
I think it's
They ran a study?!? (Score:2)
This is one of those articles where my mind just boggles that someone felt they had to run a study on the subject.
In other news, water is wet at room temperature.
Re: (Score:2)
In other news, water is wet at room temperature.
Below a pressure of 17 mm of Hg (0.023 Atm) water is a gas at room temperature.
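A quick sanity check of those numbers, assuming "room temperature" here means roughly 20 °C, where water's equilibrium vapor pressure is about 17.5 mm Hg:

\[
\frac{17\ \text{mm Hg}}{760\ \text{mm Hg/atm}} \approx 0.022\ \text{atm}
\]

which is in line with the quoted 0.023 atm figure, so below that pressure liquid water at ~20 °C would indeed evaporate rather than remain liquid.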
Duty calls. (Score:2)
According to a new study conducted by researchers at MIT, being corrected online just makes the original posters more toxic and obnoxious.
Duty calls. [xkcd.com]
Post-rational. (Score:4, Insightful)
This isn't an everywhere, or forever, thing.
It's just the way the culture of the place is right at this time.
That is - it's considered fashionable to avoid conflict by dismissing the other person and moving on - and people that don't do that are considered prey for mockery.
It's a kind of post-rational culture on Twitter - which is why I never started with it myself, seeing how conversations went. You still can't escape folks posting links to threads in a large percentage of other media, though.
Lots of cultures hit points like that, where arguments become largely intentionally useless - especially ones with an assumption of mostly anonymity and relative impermanence.
So, confrontation there isn't a matter of winning some fight - but getting out with the least complications.
It's more a commentary on the uselessness of most emotions invested into Twitter than in being able to argue for rational skepticism or something.
Ryan Fenton
Re: (Score:2)
This isn't an everywhere, or forever, thing. It's just the way the culture of the place is right at this time. That is - it's considered fashionable to avoid conflict by dismissing the other person and moving on - and people that don't do that are considered prey for mockery.
From the Old Testament, Proverbs 26:4, "Do not answer a fool according to his folly, or you yourself will be just like him."
1878 quote, "Don’t argue with a fool, or the listener will say there is a pair of you." (ascribed to Mark Twain as "Never argue with a fool; onlookers may not be able to tell the difference.")
I disagree that it's a current-culture thing. We've always had the admonition not to argue with fools. This admonition has in the past been based on you looking foolish too. What the researc
Re: (Score:3)
Your hidden assumption is that the disinformation spreader is a fool. On the contrary, Sartre noted that they may know exactly what they are on about:
"Never believe that anti-Semites are completely unaware of the absurdity of their replies. They know that their remarks are frivolous, open to challenge. But they are amusing themselves, for it is their adversary who is obliged to use words responsibly, since he believes in words. The anti-Semites have the right to play. They even like to play with discourse f
This is the kind of thing that happens... (Score:2)
Keep asking questions (Score:5, Interesting)
Having met and "debated" 3 or 4 actual conspiracy theorists face to face I have found the best weapon is not studies or facts or data, they don't care about that. You just keep asking "who" and "why" after every claim and they will go deeper and deeper until eventually the engine sputters out from lack of fuel. All conspiracy theories will hit several points of contradiction at their core. Does this change their minds? Almost certainly never when you are a stranger to them. Only people close to them can ever really turn them around long term.
Also, most of them are not "wrong" about the core issues. There are evil "cabals" of "wealthy elites" who control politicians and the economy, but they are just straight-up capitalists and corporations, and they don't need some elaborate scheme; it's business as usual, right out in the open. The difference now is that while they used to be "X-Files lone gunmen" type libertarians or anarchists in their thinking, now they are just about all intertwined with right-wing politics, and the core contradiction is that the very politicians who lend their theories credence, and whom they end up supporting, are the ones perpetuating the actual problems in the world they see.
Re: (Score:2)
All true except for the implication that left-wing politics are somehow not embroiled with the "capitalists and corporations" that constitute the controlling wealthy elite.
Corruption is non-partisan. All political groups are equally guilty of it (more or less in proportion to the level of political clout they actually have). All political extremes are equally harmful, too.
Sounds like you may have been taken in by the pleasing words of a group that panders to your specific bias.
Re: (Score:3)
Sure, I never meant to imply there's no conspiracy thinking on the left end of the spectrum. The hard-socialist/communist wing tends to see "CIA everywhere" when dealing with any other left-wing country that struggles, which is not unfounded but also not to the extent they profess it is. You could also make a case for many people's Russia obsession; they think all opposition is Russian troll farms, which is not unfounded, but they sometimes exaggerate its reach too much. There's also a minor plurality but si
Re: (Score:2)
He got all flustered and went away. And stopped pestering me with his conspiracy theories.
He didn't stop believing what he believed, but at least he didn't waste my time anymore.
Don't feed the trolls
But, to be fair, this is a good practice in any disagreement. It's less confrontational
spread of disinformation is the result of decades (Score:3)
Re:spread of disinformation is the result of decad (Score:4, Interesting)
of bad education that fosters data retention and repetition over thinking for oneself. Promote better education and you'll see "disinformation" fizzle away; it'll also be harder to manipulate the masses
Sorry, I wish it was that simple, but it isn't true.
I was horrified to see that the assistant professor I was assigned to 20 years ago for my master's thesis is now one of the worst climate deniers I have ever come across. He is highly educated and a very well-respected authority in his field, but when it comes to climate change he is completely ignorant, fighting its existence tooth and nail, with arguments that make me nauseous.
I also have examples of people with a very low educational background but heaps of common sense who laugh at disinformation and identify it right away.
So even though I think education is a good thing (I am an educator myself), it simply isn't a silver bullet in this case.
Comment removed (Score:3)
Re: (Score:2)
It's not about those people (Score:4, Interesting)
Trying to get everyone to agree is the wrong approach. It's impossible. The QAnon folks are a lost cause. A quarter to a third of people are too dumb to understand a moderately complex explanation. They can only be persuaded, but this never works if being persuaded means doing something they don't want to do (stop smoking, mask wearing, whatever). Policy needs to take this into account and make following the wrong path individually painful. The damage they do needs to be turned on them. There is a reason the law isn't just a book explaining why you shouldn't do some things.
Re:Blame the media and their hyperpartisanship (Score:4, Insightful)
link to an "unbiased" source such as the NY Times or Washington Post...
Lost me right there. The WaPo and the NYT are legitimate journalistic organisations that actively make an effort to be truthful and objective. Fox News, OANN, Newsmax, and Sidney Powell, OTOH, are trying to avoid being put out of business by Dominion, et al., by claiming in court documents that they are purveyors of entertainment that no reasonable person would take seriously. There is simply no comparison between the two. So yeah, take your false equivalence and stick it where the sun don't shine.
Re:Blame the media and their hyperpartisanship (Score:4, Insightful)
WaPo and NYT are media organizations, yes - but you are going to need to define "legitimate" and "journalism" in a purely objective way
I'm not sure how to define legitimate journalism in a purely objective way, but it's pretty clear that anyone who argues in court that they are pure entertainment which no one could possibly take seriously is not a legitimate journalist.
Not always true (Score:2)
Priming Studies (Score:2)
we found that a simple accuracy nudge -- asking people to judge the accuracy of a random headline -- improved the quality of the news they shared afterward
This kind of study -- a "simple... nudge" in some regard that has long-term effects afterward -- is called a "priming effect" and generally they're all bullshit. Bullshit that's cheap, from which it's easy and fast to make a study/paper, and hence get a PhD/psychology faculty position. From Wikipedia [wikipedia.org]:
Although semantic, associative, and form priming are well established, some longer-term priming effects were not replicated in further studies, casting doubt on their effectiveness or even existence. Nobel laureate and psychologist Daniel Kahneman has called on priming researchers to check the robustness of their findings in an open letter to the community, claiming that priming has become a "poster child for doubts about the integrity of psychological research." Other critics have asserted that priming studies suffer from major publication bias, experimenter effect and that criticism of the field is not dealt with constructively.
True (Score:2)
That's because they're on Moscow's payroll (Score:1)
Sounds farfetched? It isn't. Former UK Labour leader and would-be Prime Minister Michael Foot (a/k/a Agent Boot) took money from the KGB to publish Soviet propaganda* [thesun.co.uk].
*Yes, I know The Sun is a tabloid and thus not normally a reputable source, but this article's reporting on the content of Ben MacIntyre's book is accurate [amazon.com]. IOW, a stopped clock is right twice a day.
Here's another article [theconversation.com] about Foot taking money from the KGB.
Missing the point (Score:2)
Wrong! (Score:2)
"You're wrong!" doesn't work so well. This and a squirrel that chews bread into the face of Jesus at 11!
I hope they're wrong (Score:1)
They looked only at short-term effects. There's still hope that the long-term effects of challenging lies directly are salutary.
Otherwise free speech doesn't work and we're all screwed.
Trolls want you to engage them (Score:4)
Fact: trolls want you to engage them, that's how they get their jollies.
Fact: since the above are both true, society is presented with a quandary: people who actually believe the bullshit they're posting want to silence everyone who disagrees with them (which means their damage is allowed to spread unchecked), but if they're actually trolls, not engaging them is the only way to actually get them to knock it off (they'll get bored and go away if no one responds to them).
This is why so-called 'social media' is such a failed experiment: it gives a wide-reaching platform for both of the above, but does precisely nothing to discourage them. Moderation on these platforms is virtually impossible due to how much human labor is required to manage them.
'Social media' should be, in my opinion, abolished outright. We humans are clearly too irresponsible to have such a powerful tool at our fingertips, and the proof of that is right before everyone's eyes. Yes, that's right, I'm advocating 'throwing the baby out with the bathwater', but I don't see any way to fix the problem and keep 'social media' platforms around.