November 18, 2019

New Report from Stanford on the Trouble Students (K-12 and Higher Ed) Have Judging The Credibility Of Information Online

A comment by Gary Price, Founder/Editor of infoDOCKET.

Most of what the study finds is not very surprising. For a number of years I've been hearing reports and anecdotes from members of the K-12 and academic library communities that are similar to what's discussed in the report. It's likely you've heard these reports too.

What this report gives the library community is yet ANOTHER opportunity (digital privacy is one of many other examples) to demonstrate, with a popular and important topic, the relevancy of librarians, library resources, and libraries by informing and educating students, and the general public for that matter, on how to judge the credibility of information, online or in print. This is a topic that is central to what we do.

Finally, it might be a good idea to go back and ask what the library community could have done better to this point on this and other topics. Why? So the same thing doesn't happen with other important issues and topics moving forward.

Now, to the report.

From the Stanford Graduate School of Education:

When it comes to evaluating information that flows across social channels or pops up in a Google search, young and otherwise digital-savvy students can easily be duped, finds a new report from researchers at Stanford Graduate School of Education.

The report, released this week by the Stanford History Education Group (SHEG), shows a dismaying inability by students to reason about information they see on the Internet, the authors said. Students, for example, had a hard time distinguishing advertisements from news articles or identifying where information came from.

“Many people assume that because young people are fluent in social media they are equally perceptive about what they find there,” said Professor Sam Wineburg, the lead author of the report and founder of SHEG. “Our work shows the opposite to be true.”

[Clip]

The new report covered news literacy, as well as students’ ability to judge Facebook and Twitter feeds, comments left in readers’ forums on news sites, blog posts, photographs and other digital messages that shape public opinion.

The assessments reflected key understandings the students should possess such as being able to find out who wrote a story and whether that source is credible. The authors drew on the expertise of teachers, university researchers, librarians and news experts to come up with 15 age-appropriate tests — five each for middle school, high school and college levels.

[Clip]

“In every case and at every level, we were taken aback by students’ lack of preparation,” the authors wrote.

In middle school, the researchers tested basic skills, such as judging the trustworthiness of different tweets or articles.

One assessment required middle schoolers to explain why they might not trust an article on financial planning that was written by a bank executive and sponsored by a bank. The researchers found that many students did not cite authorship or article sponsorship as key reasons for not believing the article.

Another assessment had middle school students look at the homepage of Slate. They were asked to identify certain bits of content as either news stories or advertisements. The students could fairly easily distinguish a traditional ad (one with a coupon code) from a news story. But of the 203 students surveyed, more than 80 percent believed that a native ad, identified with the words “sponsored content,” was a real news story.

The assessments at the college level focused on more complex reasoning. Researchers required students to evaluate information they received from Google searches, contending that open Internet searches turn up contradictory results that routinely mix fact with falsehood.

[Clip]

In another assessment, college students had to evaluate website credibility. The researchers found that high production values, links to reputable news organizations, and a polished “About” page could sway students into accepting a site's contents with little skepticism.

The assessments were administered to students across 12 states. In total, the researchers collected and analyzed 7,804 student responses. Field-testing included under-resourced schools in Los Angeles and well-resourced schools in the Minneapolis suburbs. College assessments were administered at six different universities.

Additional Findings in the News Release

Direct to Complete Executive Summary (29 pages; PDF)
Includes examples of assessments used.

About Gary Price

Gary Price (gprice@mediasourceinc.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington, D.C. metro area. Before launching INFOdocket, Price and Shirl Kennedy were the founders and senior editors at ResourceShelf and DocuTicker for 10 years. From 2006 to 2009 he was Director of Online Information Services at Ask.com, and he is currently a contributing editor at Search Engine Land.
