Ars Technica: "Study: Why Bother to Remember When You Can Just Use Google?"
UPDATE: We’ve added links to the full text report discussed in the Ars Technica article. Scroll to the bottom of this post.
From Ars Technica:
In the age of Google and Wikipedia, an almost unlimited amount of information is available at our fingertips, and with the rise of smartphones, many of us have nonstop access. The potential to find almost any piece of information in seconds is beneficial, but is this ability actually negatively impacting our memory? The authors of a paper that is being released by Science Express describe four experiments testing this. Based on their results, people are recalling information less, and instead can remember where to find the information they have forgotten.
Read the Complete Ars Technica Article
A very interesting report. We hope to add some additional comments after we read the complete research article discussed (linked below).
Some questions the Ars Technica report has us asking. Some are directly related, while others are tangential.
However, we were surprised not to see any mention in the article of information retrieval skills, or of the role information quality plays in deciding which external resources to use and which resources they, in turn, provide access to. In other words, what makes one starting point better than another? Would a starting point become even better if searchers knew a few extra skills (tips?) to save them time and overall effort?
Do users even consider assessing the quality, authority, and currency of what they find when determining a good answer? Do searchers consider the information need before they decide to begin at point A or point B? Are they looking for a specific fact or an overview? Do they take what they learn from a search session (for example, finding a quality source) and begin there the next time?
Do researchers ever consider using remotely accessible databases via a library, or contacting a librarian, as a starting point?
What role does brand recognition play in determining starting points? Our guess: a LOT. For many, Google is a synonym for search and online research. How can other companies deal with this (can they?)? Do they need to?
Finally, if users expect computerized information to always be available (which makes sense to us), where do quality, accuracy, etc. fit in? Do they make a difference anymore? What might this mean long term?
See Also: “Google Searches May Influence What People Forget, Test Finds” (via Bloomberg)
UPDATE: Here’s a Link to the Full Text Report Discussed in the Ars Technica Article
The advent of the Internet, with sophisticated algorithmic search engines, has made accessing information as easy as lifting a finger. No longer do we have to make costly efforts to find the things we want. We can “Google” the old classmate, find articles online, or look up the actor who was on the tip of our tongue. The results of four studies suggest that when faced with difficult questions, people are primed to think about computers and that when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it. The Internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves.
About Gary Price
Gary Price (email@example.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington, D.C. metro area. He earned his MLIS degree from Wayne State University in Detroit. Price has won several awards, including the SLA Innovations in Technology Award and Alumnus of the Year from the Wayne State University Library and Information Science Program. From 2006 to 2009 he was Director of Online Information Services at Ask.com.