Shifting attention to accuracy can reduce misinformation online
Post number #746326, ID: 511809
|
In recent years, there has been a great deal of concern about the proliferation of false and misleading news on social media. Academics and practitioners alike have asked why people share such misinformation, and sought solutions to reduce the sharing of misinformation. Here, we attempt to address both of these questions. First, we find that the veracity of headlines has little effect on sharing intentions, despite having a large effect on judgments of accuracy.
Post number #746327, ID: 511809
|
This dissociation suggests that sharing does not necessarily indicate belief. Nonetheless, most participants say it is important to share only accurate news. To shed light on this apparent contradiction, we carried out four survey experiments and a field experiment; the results show that subtly shifting attention to accuracy increases the quality of news that people subsequently share.
Post number #746328, ID: 511809
|
Together with additional computational analyses, these findings indicate that people often share misinformation because their attention is focused on factors other than accuracy -- and therefore they fail to implement a strongly held preference for accurate sharing. Our results challenge the popular claim that people value partisanship over accuracy, and provide evidence for scalable attention-based interventions.
Post number #746329, ID: 511809
|
It's a long but very interesting read.
Post number #746456, ID: 906196
|
Good thread tbh, thanks OP.
Post number #746476, ID: 6dc821
|
>People are more likely to share misinformation if they'll get uplikes for it
Finally, science has caught up to common knowledge!
Jokes aside, I hope this gets some visibility and results. Twitter has actually been doing a decent job, by which I mean better than nothing, at improving this part of their platform recently, so I hope they take this into consideration.
Post number #746480, ID: 787da9
|
sounds dangerously like opinion manipulation and censorship tendencies...
Post number #746490, ID: 6dc821
|
>>746480 If asking posters "is this link/article true?" counts as censorship, it's undeniably the most benign form of it. It still falls to the poster to decide whether or not to post. In fact, it even encourages thought about the decision. That being said, this would bias the platform towards posts valued for their accuracy over posts valued for their comedy. So you aren't entirely wrong. Our shitposting culture might suffer from the change...
Post number #746491, ID: 6dc821
|
which would definitely be a loss that potentially outweighs the benefits.
Total number of posts: 9,
last modified on:
Thu Mar 18 18:34:35 2021 (Unix timestamp 1616092475)