Researchers want the public to test themselves: https://yourmist.streamlit.app/. Rating 20 headlines as true or false gives the user a set of scores and a “resilience” ranking that compares them to the wider U.S. population. It takes less than two minutes to complete.
The paper
Edit: the article might be misrepresenting the study and its findings, so it’s worth checking the paper itself. (See @realChem's comment in the thread.)
I’m not sure this is a good study. I mean, I scored 85%, so woohoo, but you only get headlines to go off of. The art of noticing disinformation is in reading articles and making inferences about them. Headlines like “vaccines contain harmful chemicals” are obvious red flags, but some sound reasonable while the article itself would fall apart on a first reading. I know half the problem is that people don’t read articles, but this is a very simplistic survey.
It is, and I feel the questions are quite obvious.
That being said… I’m related to conspiracy theorists. I got a front-row seat to their dumbassery on Facebook before I deleted my account. And a significant issue was paywalled articles with clickbait titles, especially during Covid. The title was a doubt-inducing question, such as “Do vaccines make you magnetic?”, and the reasoning disproving it was locked behind the paywall. My relatives used those as confirmation that their views were true, because the headlines introduced doubt and the content wasn’t readable. That, and satire articles.
Not only is it not good, I’d dare to say it’s awful. Never mind that the headlines themselves are terribly crafted: the entire point is that one has to be critical of sources, and not take everything at face value just because it sounds somewhat convincing. It’s not about blatantly discrediting things at face value because they don’t fit what you believed to be true.
By the standards of this test, headlines such as “The CIA Subjected African-Americans to LSD for 77 Consecutive Days in Experiment” would clearly belong in the fake news category. And if it’s supposed to test whether the (presumably American) respondent has decent insight into the realities of contemporary politics, why in the world would it include something as obscure as “Morocco’s King Appoints Committee Chief to Fight Poverty and Inequality”? There’s literally no way of knowing without context whether the associated article would be propaganda or just an obscure piece of foreign correspondence. Many of the “true” headlines are still things one shouldn’t take for granted without checking sources, and many of the “fake” ones are cartoonish.
It’s just bad research.
Edit: Rather than bad research, it seems it might be badly misrepresented. The paper itself appears to say something quite different from what the linked article reports. I’m still, however, not entirely convinced by the approach of using AI-generated headlines.
A common tactic I’ve seen in news headlines is referencing substances that can harm a human without noting that, in the quantities actually present, they are not a concern. Given that, I’m not sure what the right answer to the vaccines question would be. If that’s the case here, it may be true but misleading.
Somehow I got 100%, but it was mainly luck. I really have no clue what % support marijuana is in the US, how young Americans feel about global warming, or how globally respected they feel. I’m not from there, so I don’t follow it at all. I think it would’ve been better if they had an “I don’t know / Irrelevant to me” option.
It’s exactly those “reasonable”-sounding headlines that serve as misinformation and/or dog whistles (and in some cases the ideas and opinions that back them up in the body of the article, but the body has to be provided for that to be relevant, and as you point out it isn’t, which is a big problem). So “vaccines contain harmful chemicals” could be aimed at antivaxxers (and those susceptible to being pushed there), but it’s also technically correct: apples and bananas contain “harmful chemicals” too.
The article could be either fear mongering and disinformation - false, or science based and educational - true, but we can’t know which just from the headline.
A headline like “small group of people control the global media and economy” could be a dog whistle for antisemitism - false, or an observation of life on earth right now - true.
My point is there are headlines that would seem like conspiracy theory to some, but irrefutable fact to others, and probably the opposite of each to each respective group, and without more than a headline (and often even with, of course), it’s entirely down to the readers’ existing opinions and biases.
Not a great way to test this.
Maybe they targeted redditors specifically.
Just took a look here, and yeah. One of the headlines they ask you to rate is “Hyatt Will Remove Small Bottles from Hotel Bathrooms”. It’s the kind of thing that’s basically a coin flip. Without having any context into the story, I have no opinion on whether it’s fake or not. I don’t think guessing incorrectly on this one would indicate somebody is any more or less susceptible to miscategorizing stories as real/fake.
I assume the idea is to include some pointless headlines (such as this) to provide some sort of baseline. The researchers probably extract several dimensions from the variables, and I assume this headline would feed into a “general scepticism” variable that measures the likelihood that the respondent leans towards calling things fake rather than real.
Still, I’m not at all convinced about this research design.
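To illustrate what I mean, here’s a rough sketch (in Python, with made-up names and scoring — this is my guess at the general approach, not the paper’s actual method) of how accuracy on real and fake headlines could be separated from a response-bias-style “scepticism” measure, so that coin-flip items mostly move the bias number rather than the discernment number:

```python
def score_responses(judgments, truth):
    """Hypothetical scoring sketch. judgments/truth are lists of
    booleans, one per headline; True means 'real'."""
    assert len(judgments) == len(truth) and len(truth) > 0
    real_idx = [i for i, t in enumerate(truth) if t]
    fake_idx = [i for i, t in enumerate(truth) if not t]

    # Accuracy on each headline type separately.
    real_detection = sum(judgments[i] for i in real_idx) / len(real_idx)
    fake_detection = sum(not judgments[i] for i in fake_idx) / len(fake_idx)

    # Response bias: how often the respondent says "fake" overall.
    # Someone who calls everything fake scores 1.0 here even though
    # they can't actually tell real from fake.
    scepticism = sum(not j for j in judgments) / len(judgments)

    # Discernment: real-vs-fake accuracy corrected for bias
    # (0.0 = chance-level, 1.0 = perfect).
    discernment = real_detection + fake_detection - 1.0

    return {"real": real_detection, "fake": fake_detection,
            "scepticism": scepticism, "discernment": discernment}
```

For example, a respondent who marks every headline fake gets `scepticism = 1.0` but `discernment = 0.0`, while someone who answers everything correctly gets `discernment = 1.0` regardless of their scepticism score — which is presumably why throwaway items like the Hyatt one don’t have to be individually diagnostic.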
I suspect that where you place yourself on the extremely-liberal-to-extremely-conservative spectrum might correlate with which fake news titles you fall for. What sounds like obvious propaganda to you may sound, to someone used to a more sensationalist, less reliable news source, like any other news article, especially if they’re predisposed to conspiracy theories.