Teaching

Some Surprises in Report on Youth Information Evaluation?

By Ian Quillen — February 29, 2012 2 min read

So I just finished browsing through the nine-page executive summary (yes, executive summary) of a literature review exploring how Internet users age 18 and under search and evaluate information on the expansive and occasionally overwhelming World Wide Web.

Its findings are, well, expansive and occasionally overwhelming.

All kidding aside, like the Internet, there’s a lot of good stuff in the report out of the Youth and Digital Media project, an endeavor from Harvard University’s Berkman Center for Internet and Society. The analysis examines literature “at the intersection of research areas concerning digital media, youth, and information quality,” asking how young Web users define information quality, how those definitions differ from adults’, how they influence the online information youths create, and how youths learn their information-gathering practices. It highlights some surprising conclusions.

For example:

• Youth Web-searching practices may vary by gender, socioeconomic status, friend networks, age, and ethnicity;
• Students are more likely to trust sites with a .org or .gov suffix than sites with a .com suffix;
• Male students may be more likely to view a website as credible than female students;
• Norms in youths’ personal content creation may clash with norms of a classroom, making the transfer of applicable skills to academic work more difficult;
• There is little research on how parental involvement affects how students evaluate information; and
• Digital literacy education for most students is based largely on the search action, whereas a lot of research focuses on student evaluation outside of the search process.

The report also argues for more research on how creative interaction with information shapes opinions on quality, how socioeconomic, racial, and gender differences affect information-gathering practices, how to transform personal and social content creation effectively into an academic context, and how to test whether outreach on digital literacy is working. Its authors see the analysis as a starting point, with findings that shouldn’t necessarily be generalized.

“We wrote this paper to initiate the policy discussion by distilling important findings in the literature and highlighting areas of future research,” reads the last paragraph of the executive summary. “We have defined an information quality framework that emphasizes the imperative of involving in this policy discussion the participation of all stakeholders, including policymakers, technology developers, educators, parents, and youth.”

Hopefully, those stakeholders will have the common sense to see this blog post as a credible source of information. The .org in edweek.org should help.

A version of this news article first appeared in the Digital Education blog.