Researchers want the public to test themselves: https://yourmist.streamlit.app/. Marking each of 20 headlines as true or false gives the user a set of scores and a “resilience” ranking that compares them to the wider U.S. population. It takes less than two minutes to complete.
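The tallying behind a quiz like that is simple in principle. Here is a minimal sketch of how a 20-headline true/false test could be scored and banded; this is an illustrative guess, not the researchers' actual method, and the ground-truth labels and "resilience" bands below are made up:

```python
# Hypothetical sketch of scoring a 20-headline true/false quiz.
# Labels and bands are invented for illustration; they are not taken
# from the actual MIST instrument or the Streamlit app.

def score_quiz(answers, truth):
    """answers[i]: user's call, True = 'real'; truth[i]: actual label, True = real headline."""
    assert len(answers) == len(truth) == 20
    real_correct = sum(a == t for a, t in zip(answers, truth) if t)      # real headlines judged real
    fake_correct = sum(a == t for a, t in zip(answers, truth) if not t)  # fake headlines judged fake
    return {
        "real_news_detection": real_correct,
        "fake_news_detection": fake_correct,
        "total": real_correct + fake_correct,
    }

def resilience_band(total):
    """Toy stand-in for comparing a total score against population norms."""
    if total >= 18:
        return "top of the range"
    if total >= 14:
        return "above the middle"
    return "below the middle"

if __name__ == "__main__":
    truth = [True, False] * 10    # placeholder ground truth: 10 real, 10 fake headlines
    answers = [True] * 20         # someone who marks every headline as real
    scores = score_quiz(answers, truth)
    print(scores, "->", resilience_band(scores["total"]))
```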

The paper

Edit: the article might be misrepresenting the study and its findings, so it’s worth checking the paper itself. (See @realChem’s comment in the thread.)

  • somefool@beehaw.org (OP) · 1 year ago

    As a terminally online millennial, I was scared for a second, but I did okay on the test. Then again, I’m 40 and barely even qualify as ‘millennial’, and not at all as ‘young’.

    I found the language of the questions was glaringly obvious. What do you think?

    • Lvxferre@lemmy.ml · 1 year ago

      I found the language of the questions was glaringly obvious. What do you think?

      It’s potentially on purpose, to exploit the fact that fake news often has a certain “discursive pattern”.

    • ryven@lemmy.dbzer0.com · 1 year ago

      I thought that without any article text it was very difficult. Is the government trying to increase acceptance of genetic modification? Well, if the goal is to improve agricultural outputs, they probably should be; I would expect that there is a pamphlet or website somewhere that helpfully explains that genetically modified crops are safe to consume and can be made more productive, or more resistant to pests or drought, or enhanced in other beneficial ways. That should count, right?

      But I marked it as false because based on the tone I assumed the associated article would actually be about modifying babies or something. I scored 20/20, which means I was rewarded for making assumptions about the article without reading it, which is not a good method for determining the truth of an article in real life!

      Basically, in order to achieve a good score, I stopped thinking about the information itself and started thinking about why the specific headlines were chosen for the test, which means they probably aren’t measuring what they believe they’re measuring.

      Edit: Thinking again, maybe the skill of guessing why researchers chose specific headlines is related to the skill of guessing which actual headlines are intentionally misleading. Guessing if the original writer/editor of a headline is trying to trick me is only one step removed from guessing why a researcher would include it in their survey. Apparently this test produces similar results to versions that have ledes, images, bylines, etc., which is interesting. Also in IRL scenarios, when I’m uncertain about an article I go looking for more info, which I didn’t do for this test because their suggestion that “it only takes 2 minutes” seemed to imply a rule against using other sites. I would be interested to see how much accuracy improves when participants are encouraged to take their time and try to find the correct answers.

    • Phanatik@kbin.social · 1 year ago

      I’m on the tail end of the millennial generation and I scored 19/20. I think I realised what the tells were and went off that. I ended up classifying a real story as fake, so I leaned slightly more to the sceptical side.