Nov 29 2016

Civic Online Reasoning

A recent study adds some empirical data to the current discussions regarding online information. This Stanford University study looked at 7,804 student responses across 12 states, divided among middle school, high school, and college students. The goal of the study was to see if these students could distinguish reliable sources of information from fake or unreliable sources.

Their conclusion?

Overall, young people’s ability to reason about the information on the Internet can be summed up in one word: bleak.

Although students grew up in the internet and social media age, and are very skilled at using online resources, they apparently have not developed the skills to critically evaluate the information they are finding online.

The authors echo what I and many others have pointed out, that while the internet is a great source of information, it is largely a source without editorial filters. As I recently discussed, this has led to a range of outlets including high quality journalism, low quality journalism, advocacy sites, biased sites, advertising, opinion, and fake sites that exist only to drive clicks. Since you no longer need a large infrastructure, or years to build up a reputation and circulation, in order to publish articles that then get shared on social media as news, every kind of information is jumbled together and it is up to the reader to discriminate.

The authors looked at five tasks for each school level that they felt were appropriate for that level. Here are some example results:

For the middle school students, they showed the front page of a news website and asked whether specific articles were advertisements or not. The students were able to recognize a real news article as news, and a straightforward ad as an ad. However, more than 80% of students thought that a “sponsored content” article (one that was clearly labeled as such) was real news and not an advertisement. Many students noted that it was sponsored content but still thought it was news, indicating they don’t know what sponsored content is.

One of the tasks for the high school students was to evaluate a photo uploaded to Imgur. The picture of flowers shown above was offered as evidence of the effect of radiation at the Fukushima nuclear plants. Less than 20% of the students appropriately questioned the utility of the photo as evidence or the source of the photo. Obviously the photo is only a close-up of flowers, and there is nothing in the photo itself to tie it to Fukushima.

As an example of the college-level tasks, they showed the students a tweet linking to an article about a poll on gun regulations conducted by the Center for American Progress, and asked how reliable this information is. The question was whether or not the students would recognize that the two sources of information are both liberal advocacy groups with a clear political agenda regarding gun regulations.

Let me give their full explanation of the results since it is a little involved:

We piloted this task with 44 undergraduate students at three universities. Results indicated that students struggled to evaluate tweets. Only a few students noted that the tweet was based on a poll conducted by a professional polling firm and explained why this would make the tweet a stronger source of information. Similarly, less than a third of students fully explained how the political agendas of and the Center for American Progress might influence the content of the tweet. Many students made broad statements about the limitations of polling or the dangers of social media content instead of investigating the particulars of the organizations involved in this tweet.

An interesting trend that emerged from our think-aloud interviews was that more than half of students failed to click on the link provided within the tweet. Some of these students did not click on any links and simply scrolled up and down within the tweet. Other students tried to do outside web searches. However, searches for “CAP” (the Center for American Progress’s acronym, which is included in the tweet’s graphic) did not produce useful information. Together these results suggest that students need further instruction in how best to navigate social media content, particularly when that content comes from a source with a clear political agenda.

The students have some elements of critical analysis, but most did not fully recognize the role of political bias, and less than half the students clicked the link to go to the original source (remember, they were specifically asked to evaluate the credibility of the source).

Overall I agree with the authors, the results are “bleak.”

When reading the actual student responses, I was often struck by how similar their reasoning was to what I witnessed in my daughters as they progressed through school. I could see that the student was reverting to responses that they were probably taught in school – doing the task they thought they were supposed to do, without truly understanding the task or thinking clearly about it.

This suggests to me that students often learn the format of scholarship in school, but don’t truly understand it, and therefore cannot apply it to the real world. They were reaching for some task they were taught in school and then applying it even when it was not appropriate.

This is an area where education can have a clear impact. Overall I think the educational system needs to do a better job of getting students to truly own key intellectual skills and concepts, rather than just go through the motions. Specifically, however, I think that students should be directly taught how to evaluate the credibility of information online and elsewhere. This is now a critical life skill, and obviously students are not just absorbing it or extrapolating these skills from other contexts.

Imagine a dedicated course that taught civic scientific and critical thinking literacy. This should be given in middle school, and again in high school. It could easily be a fun and engaging course that focuses on applying knowledge to the real world, self-learning, evaluating information, and thinking critically. This one course could arguably be more valuable to the average student than the rest of their curriculum combined.

Clearly something like this is needed. We now have some objective data to support this conclusion.
