Monthly Archives: October 2014

How I learned to Stop Worrying about Digital Natives and love V&R

[Image: Dr. Strangelove (Wikipedia)]

Those of you familiar with me on Twitter know that I’ve got frequent rants against Digital Natives in my feed, and I’ve been indulging in such rants more often lately, as there’s been a rash of people on the internet and in person invoking that particular trope.
I’m not going to spend time here debunking the Digital Natives thing; I’ve done actual work that helps deconstruct it and that offers an alternative.

I think (it should be clear) that it’s important to stop thinking in terms of Digital Natives, and to stop giving an eye-rolley pass to people who do.  I’m writing this at least in part to have something to link to when I don’t want to write any new rants about this.  It’s not just that the Digital Natives thing is wrong; it’s about what is at stake if we continue to allow it to ease its way into conversations about pedagogy and technology.  Alternatives, whether Visitors and Residents or otherwise (though I’m fond of the former), are just a more ethical way to go.

First of all, the Digital Natives construction assumes that you already know 1) that people of a certain age are engaged with technology, and 2) why they are. All that is left to do, under that model, is count how many people have what kind of tech. Framings such as V&R insist on engagement with the qualitative data, with the complex behaviors of people, so that we can understand what is going on rather than sidestep understanding via quantification.

Second, the cliche, lazy generational generalizing that Digital Natives indulges in is an imagining of a seamless present, wherein the mere presence of technology results in expertise that is untaught, in fact fundamentally unteachable, and therefore pre-existing, and something to expect from students of a Certain Age. This is more dangerous than the seamless future that so many of us imagine (of education, of libraries, of ubiquitous computing–I’ve been reading Dourish and Bell’s Divining a Digital Future and am clearly influenced by it here). Future thinking is unfortunate in part because it encourages a neglect of the complicated and messy (and interesting!) present. It’s easier to think and talk about a future where the current problems with which we wrestle are fixed (jet packs!).

It’s far more challenging to confront the present.

And the present being confronted should be a finely and accurately observed and described one, not an imagined present. Digital Natives hands us an imagined present wherein Kids These Days Can Just Do Technology.  It is a tailor-made justification to neglect a responsibility to the people we need to teach. The cliche suggests that we can ignore the messy complicated present where the ubiquity of computers still does not automatically (automagically, to steal from @audreywatters) produce any sort of literacy or critical thinking. We have a situation right here in front of us where engagement with technology is not pre-determined by age, but by a complex interaction of identity, economic class, privilege, power, and a host of other factors that enable or restrict people. Let’s talk about that, not about how young people Get It and older people Don’t.

I think that, as a construction, Digital Natives is thoroughly reprehensible, given all we know about the whys and hows of people’s motivations to engage (or not) with technology and information. It’s not borne out in the research, and it actively encourages practitioners not to teach things, because “they should already know this stuff.”

So we should dispense with it, stop referring to it, bury it deep. Move on.
Please please please.

[Image: Dr. Strangelove riding the bomb (Wikipedia)]

References:

Dourish, Paul, and Genevieve Bell. 2011. Divining a Digital Future: Mess and Mythology in Ubiquitous Computing. Cambridge, MA: MIT Press.

Images:  Wikipedia

Guest Blog: Beth Martin and Heather McCullough, Assessment Beyond Counting

Recently my colleagues Beth Martin and Heather McCullough presented at Educause on our assessment agenda in Atkins Library.  I love this work, and I am delighted to work with them here at UNC Charlotte.  I’m sharing the poster here because I think it’s terrifically important to get as many voices as possible into the conversation about what assessment is, could be, and should be for, in Academic Libraries, and in Higher Education generally.  I have blogged about the Mixed-Method, Interdisciplinary library before, and I think that this poster is a nice example of how we are going to operationalize that at Atkins, with partnerships within and outside of the library.

I’m including a summary in this post of their main points, and a link to their poster.

Making Big Data Small

[Poster image: click through for the .pdf]

Educational Analytics and Libraries

Educational Analytics in this context encompasses both Learning Analytics and Academic Analytics.

Learning Analytics focuses on data about the learning process, while Academic Analytics focuses on data about the institution [1].

 UNC Charlotte Atkins Library is putting together an analytics initiative that will explore both learning and academic analytics through a library lens.

 

The Initiative

 

What we do now

  • Count items and people
  • Qualitative space assessment

What we want to do

  • How are patrons using the library?
  • Explore impact of services across gender, race, ethnicity, major, grade level, etc.
  • Impact of library on retention
  • Impact of library on GPA
  • Compare to peer and aspirational institutions

How we will move forward

  • Discover what data we have using the Data Asset Framework methodology [2]
  • Statistical training
    • Move beyond counting
  • Qualitative research training
    • Make the big data relevant to our institution
    • Context
  • Work across the institution using data to inform policies and practices
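
To make the move beyond counting concrete, here is a minimal, hypothetical sketch of the kind of analysis the initiative is working toward: joining anonymized library interaction counts with institutional retention and GPA data, then breaking the comparison down by major and grade level for context. The file names and column names below are placeholders, not actual Atkins Library systems or data.

```python
# Hypothetical sketch only: relating library use to retention and GPA.
# File names and columns are placeholders, not actual Atkins Library data.
import pandas as pd

# Anonymized per-student library interaction counts (ILS checkouts, LMS logins, gate visits)
library_use = pd.read_csv("library_interactions.csv")   # student_id, checkouts, logins, visits

# Outcome data supplied by institutional research (hypothetical extract)
outcomes = pd.read_csv("student_outcomes.csv")          # student_id, gpa, retained, major, grade_level

merged = library_use.merge(outcomes, on="student_id", how="inner")

# First pass beyond counting: compare outcomes for library users vs. non-users
merged["library_user"] = merged[["checkouts", "logins", "visits"]].sum(axis=1) > 0
print(merged.groupby("library_user")[["gpa", "retained"]].mean())

# Context: break the same comparison down by major and grade level
print(merged.groupby(["library_user", "major", "grade_level"])["gpa"].mean())
```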

 

 

The Tools

  • Measuring Information Service Outcomes (MISO)
    • National survey of libraries, used to compare across institutions as well as to inform internal practices
  • Integrated Library System
    • Main library system that stores usage data
  • Association of Research Libraries data
    • National library data
  • UNC Charlotte Institutional Research
    • Data analysis of all institutional research
  • Google Analytics
  • Learning Management System
    • Data on student usage
  • Library Database statistics
    • How scholarly articles and books are being used
  •  Altmetrics
    • Social and collaborative nature of institutional research
  • Integrated Postsecondary Education Data System (IPEDS)

 

  • Qualitative studies
    • Space use
    • Classroom use
    • Student Learning Outcomes analysis for library instruction

 

The project is in the first stage of the Data Asset Framework methodology, which is scheduled to finish this December.  Once we have a clear picture of the available data, we will look at our initial research questions to determine what we can explore with the available data and what data we may need in the future.
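
As a small illustration of what that first, data-audit stage can produce, here is a hypothetical sketch of a data-asset inventory entry. The fields and the example record are assumptions for illustration only, loosely modeled on the Data Asset Framework’s audit step, not the library’s actual instrument.

```python
# Hypothetical sketch of a data-asset inventory record for the audit stage.
# Field names and the example entry are illustrative, not actual Atkins Library holdings.
from dataclasses import dataclass, asdict

@dataclass
class DataAsset:
    name: str               # e.g. "ILS circulation transactions"
    owner: str              # unit responsible for the data
    source_system: str      # where the data lives (ILS, LMS, Google Analytics, ...)
    update_frequency: str   # how often new data arrives
    contains_pii: bool      # flags the need for anonymization before analysis
    related_questions: list # research questions this asset could help answer

inventory = [
    DataAsset(
        name="ILS circulation transactions",
        owner="Access Services",
        source_system="Integrated Library System",
        update_frequency="daily",
        contains_pii=True,
        related_questions=[
            "How are patrons using the library?",
            "Impact of library on retention",
        ],
    ),
]

for asset in inventory:
    print(asdict(asset))
```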

 

Contact Information

Beth Martin, Head of Access Services and Assessment

Atkins Library, UNC Charlotte

sarmarti@uncc.edu

 

Heather McCullough, Associate Director

Center for Teaching and Learning, UNC Charlotte

hamccull@uncc.edu

 

Citations

  1. Penetrating the Fog: Analytics in Learning and Education. (n.d.). Retrieved September 29, 2014, from http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education
  2.  Data Asset Framework | Digital Curation Centre. (n.d.). Retrieved September 29, 2014, from http://www.dcc.ac.uk/resources/repository-audit-and-assessment/data-asset-framework