This article in the Chronicle of Higher Education is the latest discussion of the woeful information literacy skills of contemporary undergraduates (a topic about which I have blogged already). No fewer than six of my friends and colleagues sent me the link to this article, because it's also an excellent presentation of the ERIAL library ethnography project. My colleague Andrew Asher was one of the anthropologists participating in this large-scale study, which aimed to ground what we know about students and their interactions with libraries and library resources in their actual behavior, using open-ended interviews, participant observation, and other research instruments like photo diaries (all methods I use as part of the Atkins Ethnography Project here at UNC Charlotte).
Particularly highlighted in the Chronicle's coverage of the ERIAL project (and its upcoming publications via ALA) is the surprise that so-called "digital natives" could be so terribly unskilled at evaluating information. I don't think we should be surprised that students who can google don't know how to pick which source to use, any more than we would be surprised that children kept away from books all their lives can't figure out what to do with all of those large blocks-full-of-paper things in the library. Digital literacy has never been the same as information literacy, and all of the digital toys in the world will not render our students (or anyone else's students, for that matter) capable of distinguishing a reliable source from an unreliable one.
Persistent, consistent instruction in information literacy is what will give our students that skill set. And it cannot begin at university; this skill should be taught and exercised throughout K-12 education (and beyond). The testing culture of our current educational system values critical thinking far less than the retention and regurgitation of facts, and we are paying for that emphasis with the lack of preparation we see in our undergraduates. The idea that an undergraduate degree exists "to get a job," rather than as a basis for becoming a thinking and contributing (not just in economic terms) member of society, also gets in the way of educators advocating for critical thinking in the classroom. Some students get frustrated when asked to think critically about class content, society, and life in general, because they are not used to being asked to do it, and professors are frustrated by the students' frustration: why did they come to college if not to think?
I am collaborating on a project now that involves interviewing and observing high school seniors and college freshmen as they look for information, academic and otherwise. My research partners and I are beginning to analyze the interview data now, and among the many striking things is the standard by which students judge information to be "reliable": repetition. Several students say things like, "If I find it more than once on the web, I know that it's reliable information." Why do they think this? Where are they getting this standard of reliability? Is it possible that they're not being told about any other standards? Or are they simply assuming that the most popular Google link is popular for a fact-based reason?
I think about how students evaluate information when I see their interest in having the library website provide reviews of books, articles, and other materials in our collections. They want an Amazon.com-style service where they can see what previous users of the materials have said about them, so that they can make an informed decision about the materials' utility for their purposes. If you look at Amazon-style reviews (see, for instance, the reviews of this Economics textbook), you see that the reviewers writing the "most useful" reviews are explicit about what they wanted out of the book and how the book met their needs (or didn't), and they allow the reader of the reviews to evaluate the extent to which the reviewer's standards are the reader's own. Something is given stars based on whether or not it met a particular user's needs; context is therefore necessary in a review if other users are to effectively evaluate an item's potential.
What is "good," therefore, is a subjective, shifting thing. Students writing five-page essays might review books as "too long" for what they need to do, and articles as "just the thing." Graduate students working on dissertations might review books according to their theoretical perspectives. Reviews on a library website might give students, in virtual form, the kind of feedback they already ask their peers for in person (or on Facebook, via text, or via email) about the materials they need for papers, exams, and other coursework.
Students already evaluate information in non-academic settings. They read (and act on) reviews of movies, cars, live music shows, and restaurants. They take into account who is doing the reviewing, and whether that reviewer’s perspective is relevant and informed (or not). It is not that they are utterly incapable of critical thinking. It is that they are not doing it in academic settings. They have not been trained to do it. Neither have they been told by our educational institutions (writ large) that critical thinking is terribly important.
Beefing up information literacy programs at the university level, and in K-12, would be an important first step toward remedying the problem. But the problem also has deep structural causes, and those require fixes that come from outside the educational system.