Duke Libraries regularly engage in assessment in order to gauge the value of the services we offer, determine future needs, and improve the overall library user experience. Many posts on the Instruction & Outreach Blog have focused on library assessment as it relates to student learning outcomes, but what do librarians do after the instruction session to report on their own processes, and how does that reporting process align with (or impede) their day-to-day responsibilities and workflow?
After each instruction or outreach-related activity, librarians are asked to record their involvement by completing a short survey. At the end of the semester, the results are compiled in a report that is viewed by librarians, library and university administrators, and the wider Duke community. Below is a selection of questions from the report:
1. During the session did you assess whether students had met learning outcomes or session objectives?
2. What teaching methods did you employ: lecture, demonstration, class discussion, learning activity, clickers?
3. What resources did you use: digital collections, special collections, GIS/Maps, data sets, government documents?
4. Approximate total preparation time for the class: none – more than two hours?
Along with the university at large, Instruction & Outreach recently migrated from ViewsFlash to Qualtrics for online surveys. As I considered the survey life cycle, I began to reflect on principles of good design: How will this data be reported, and how will users interact with the survey? Good (and bad) design permeates our everyday life, from the interface on our mobile devices to library signage. Much as good signage eases your navigation, good survey design leads to better data input methods and, in turn, cleaner data that is easier to access, evaluate, synthesize, and share.
The principle of Garbage In, Garbage Out (GIGO) speaks to a major concern in report and survey design: "the quality of system output is dependent on system input" (Lidwell, Holden, & Butler, p. 112). When constructing your survey or assessment, in addition to considering the audience you are designing for and the amount of control (p. 64) you allow them (free text vs. multiple choice), it is important to minimize error by utilizing constraints (p. 60), that is, by limiting user action.
In the case of the Instruction & Outreach survey, users most frequently entered inconsistent data when the survey asked them to reference the specific course and section for which they had provided instruction, which pointed to room for improvement in the design and delivery of that question. Here, limiting user action means reducing the number of clicks required to select the course the librarian taught. Errors become more likely when content is poorly organized: "the organization of content directly affects our ability to receive a message. If the information appears jumbled or overwhelming, many viewers 'disconnect' before a transmission is complete" (Visocky O'Grady & Visocky O'Grady, p. 80).
Qualtrics can be customized to the point that data from a MySQL database can be embedded in the survey, allowing users to select from a range of options presented in a familiar interface. Course listings would always be up to date, courses would be easier to recall quickly, and survey-reported data that is inconsistent with registrar listings would likely decrease. Although we haven't yet implemented the MySQL-Qualtrics integration, being aware of the option is an important part of the iterative design process.
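To make the idea concrete, here is a minimal sketch (not something we have built) of how current course listings might be pulled from a registrar MySQL table and written out as a choice list for a survey question. It uses the pymysql library; the connection details, the courses table, and its columns are all hypothetical.

```python
# Sketch: export current course listings from a (hypothetical) registrar
# MySQL table as a choice list for a survey question.
# Assumes the pymysql package and a `courses` table with
# `subject`, `number`, and `section` columns -- all illustrative.
import csv

import pymysql


def fetch_course_choices(term):
    """Return formatted course labels (e.g. 'WRITING 101.05') for one term."""
    connection = pymysql.connect(
        host="localhost",
        user="survey_reader",      # hypothetical read-only account
        password="********",
        database="registrar",
    )
    try:
        with connection.cursor() as cursor:
            cursor.execute(
                "SELECT subject, number, section FROM courses "
                "WHERE term = %s ORDER BY subject, number, section",
                (term,),
            )
            return [f"{subj} {num}.{sec}" for subj, num, sec in cursor.fetchall()]
    finally:
        connection.close()


def write_choice_file(choices, path="course_choices.csv"):
    """Write one choice per row, ready to load into the survey question."""
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle)
        for choice in choices:
            writer.writerow([choice])


if __name__ == "__main__":
    write_choice_file(fetch_course_choices("Fall 2012"))
```

Scripting the export this way means the choice list could be refreshed each semester with a single command instead of being retyped by hand, exactly the kind of constraint on user input that GIGO argues for.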
Surveys informed by design principles are easier to navigate, feel intuitive to the user, and challenge you, as the designer, to think in new and creative ways about not only the content but also the context, order, and layout of your survey. These principles are especially helpful for survey and instructional designers working with, and designing surveys for, unique populations (see the SAP Design Guild for more on inclusive design). By using the full technical functionality of the survey system alongside an informed design process, you can reduce the time from survey start to completion and increase the likelihood of consistent reporting and data return.
Lidwell, W., Holden, K., Butler, J., & Elam, K. (2010). Universal principles of design: 125 ways to enhance usability, influence perception, increase appeal, make better design decisions, and teach through design. Beverly, MA: Rockport Publishers.
Visocky O'Grady, J., & Visocky O'Grady, K. (2008). The information design handbook. Cincinnati, OH: HOW Books.