Discovering Your Discovery System

Twenty-first-century students have the world at their fingertips when it comes to finding material for research papers. Librarians know very well that the challenge lies in finding the right resources—the most useful and the most credible—but convincing students raised on Google that better search tools and strategies exist for academic work is not always easy. Two librarians—Bonnie Imler, library director at Penn State Altoona, and Michelle Eichelberger, systems and electronic services librarian at Genesee Community College, Batavia, New York—collaborate regularly on research studies examining how students behave when searching for articles on a computer. Their latest results can be found in their new book, Optimizing Discovery Systems to Improve User Experience: The Innovative Librarian’s Guide (ABC-CLIO, 2017).

In librarian lingo, search engines are “discovery systems.” Type in a keyword and the system will search its databases for that word. Of course, anyone who has ever searched like this knows that we don’t always find what we want on the first try, and it can take a little patience, and a little honing of our searches, to get what we need. Librarians’ mission is to make sure we get what we need, a job they take very seriously. As Imler says, “We don’t want to lose the patron.”

Therefore, it’s very important to make sure students succeed in their searches. But without sitting next to students and watching every move they make, how can librarians learn what they are doing? Imler and Eichelberger thought creatively: they used software called Studio Code, created for tracking purposes, to capture students’ screens and watch their behavior as they searched for research material on Penn State’s own discovery system, LionSearch. The students were tasked with finding a number of articles on a certain subject and knew they were being anonymously observed.

What the researchers found is that students get frustrated and abandon searches when they aren’t successful after a few clicks. How can someone tell a user is frustrated just through a screen capture? “The mouse sometimes slips to the bottom of the page. Or goes in circles,” Imler says. The trick is getting students to stick with the search, so it’s important to make things easier and faster to find. “My user experience is every time you have another link, you lose another user.”

With more than 700 databases and the complete Penn State catalog, searches are conducted across a massive amount of material, which has its good points and its bad. Once a student locates a suitable article, the next hurdle is getting access to it. “Most of our databases offer full data,” Imler says, but “if an article doesn’t show up in full text in one database, you need a ‘link resolver’ so you can get it in another database.” That’s another click. When a user clicks on a title in a search, a blue GET IT button appears. But will the user click on that button to get the article? Not always, Imler admits. Students are not always clear on the difference between an abstract and a full article. The researchers know this because students in the study were asked to collect full articles, and some collected only abstracts instead.

Libraries may be acquiring fewer physical books, but collections still grow larger, and researchers have access to what can be an overwhelming amount of material. Librarians are always trying to make sure researchers get what they need with as few clicks as possible. Imler is encouraged, saying, “We have come so far.” She also notes the changing landscape and the problems it brings when, for example, “fewer and fewer students know what a print version of a periodical is.” But on the plus side, when searching for credible articles as opposed to pop-culture or sponsored pieces, “one of the best [search] limits is ‘peer review.’” Imler and Eichelberger hope their experiences and research will help other librarians keep their own patrons informed.

Therese Boyd, ’79

