The H. Lee Waters Film Collection we published earlier this month has generated quite a buzz. In the last few weeks, we’ve seen a tremendous uptick in visits to Duke Digital Collections and received comments, mail, and phone calls from Waters fans, film buffs, and from residents of the small towns he visited and filmed over 70 years ago. It’s clear that Waters’ “Movies of Local People” have wide appeal.
The 92 films in the collection are clearly the highlight, but as an archivist and metadata librarian I’m just as fascinated by the logbooks Waters kept as he toured across the Carolinas, Virginia, and Tennessee screening his films in small town theaters between 1936 and 1942. In the logbooks, Waters typically recorded the theater name and location where he screened each film, what movie-goers were charged, his percentage of the profits, his revenue from advertising, and sometimes the amount and type of footage shown.
As images in the digital collection, the logbooks aren’t that interesting (at least visually), but the data they contain tell a compelling story. To bring the logbooks to life, I decided to give structure to some of the data (yes, a spreadsheet) and used a new visualization tool I recently discovered called TimeMapper to plot Waters’ itinerary on a synchronized timeline and map–call it a timemap! You can interact with the embedded timemap below, or see a full-screen version here. Currently, the Waters timemap only includes data from the first 15 pages of the logbook (more to come!). Already, though, we can start to visualize Waters’ route and the frequency of film screenings. We can also interact with the digital collection in new ways:
Click on a town in the map view to see when Waters visited, then view the logbook entry or any available films for that town.
Slide the timeline and click through the entries to trace Waters' route.
Toggle forward or backward through the logbook entries to travel along with Waters.
For me, the Waters timemap demonstrates the potential for making use of the data in our collections, not just the digitized images or artifacts. With so many simple and freely available tools like TimeMapper and Google Fusion Tables (see my previous post), it has never been so easy to create interactive visualizations quickly and with limited technical skills.
I’d love to see someone explore the financial data in Waters’ logbooks to see what we might learn about his accounting practices or even about the economic conditions in each town. The logbook data has the potential to support any number of research questions. So start your own spreadsheet and have at it!
2015 has been a banner year for Duke Digital Collections, and it's only January! We have already published a new collection, broken records, and expanded our audience. Truth be told, we have been on quite a roll for the last several months, and with the holidays we haven't had a chance to share every new digital collection with you. Today on Bitstreams, we highlight digital collection news that didn't quite make the headlines in the past few months.
H. Lee Watersmania
Before touching on news you haven't yet heard about, we must continue the H. Lee Waters PR Blitz. Last week, we launched the H. Lee Waters digital collection. We and the Rubenstein Library knew there was a fair amount of pent-up demand for this collection; even so, we have been amazed by the reaction of the public. Within a few days of launch, site visits hit what we believe (though cannot say with 100% certainty) to be an all-time high of 17,000 visits and 37,000 pageviews on Jan 19. We even suspect that the intensity of the traffic has contributed to some recent server performance issues (apologies if you have had trouble viewing the films – we and campus IT are working on it).
We have also seen more than 20 new user comments left on Waters' film pages, 6 comments left on the launch blog post, and 40+ new likes on the Duke Digital Collections Facebook page since last week. The Rubenstein Library has also received a surge of inquiries about the collection. These may not be "official" stats, but we have never seen this much direct public reaction to one of our new digital collections, and we could not be more excited about it.
Early Greek Manuscripts
In November we quietly made 38 early Greek manuscripts available online, one of which is the digital copy of a manuscript since returned to the Greek government. These beautiful volumes are part of the Rubenstein Library and date from the 9th – 17th centuries. We are still digitizing volumes from this collection, and hope to publish more in the late Spring. At that time we will make some changes to the look and feel of the digital collection. Our goal will be to further expose the general public to the beauty of these volumes while also increasing discoverability to multiple scholarly communities.
Curious about bone saws, bloodletting, or other historic medical instruments? Look no further than the Rubenstein Library's History of Medicine Artifacts Collection Guide. In December we published over 300 images of historic medical artifacts embedded in the collection guide. It's an incredible and sometimes frightening treasure trove of images.
These are legacy images taken by the History of Medicine. While we didn't shoot these items in the Digital Production Center, the digital collections team still took a hands-on approach to normalizing the filenames and overall structure of the image set so we could publish them. This project was part of our larger efforts to make more media types embeddable in Rubenstein collection guides, a deceptively difficult process that will likely be covered in more depth in a future Bitstreams post.
Digitization to Support the Student Nonviolent Coordinating Committee (SNCC) Legacy Project Partnership
This one is hot off the digital presses. Digital Collections partnered with University Archives to publish Coach K’s very first win at Duke just this week in anticipation of victory # 1000.
What’s Next for Duke Digital Collections?
The short answer is, a lot! We have very ambitious plans for 2015. We will be developing the next version of our digital collections platform, hiring an intern (thank you University Archives), restarting digitization of the Gedney collection, and of course publishing more of your favorite digital collections. Stay tuned!
This week, in conjunction with our H. Lee Waters Film Collection unveiling, we rolled out a handy new Embed feature for digital collections items. The idea is to make it as easy as possible for someone to share their discoveries from our collections, with proper attribution, on other websites or blogs.
It’s simple, really, and mimics the experience you’re likely to encounter getting embed code from other popular sites with videos, images, and the like. We modeled our approach loosely on the Internet Archive's video embed service (e.g., visit this video and click the Share icon, but only if you are unafraid of clowns).
Click the “Embed” link under an item from Duke Digital Collections, and copy the snippet of code that pops up. Paste it in your website, and you’re done!
I’ll paste a few examples below using different kinds of items. The embed code is short and nearly identical for all of these:
A Single Image
Document with Document Viewer
Building this feature required a little bit of math, some trial & error, and a few tricks. The steps were to:
Set up a service to return customized item pages at the path http://library.duke.edu/digitalcollections/embed/<itemid>/
Use CSS & JS to make the media as fluid as possible to fill whatever space it ends up in
Use a fixed height and overflow: auto on the attribution box so longer content will scroll
Use link rel="canonical" to ensure the item's embed page is associated with the real item page (especially to improve links / ranking signals for search engines).
Present the user a copyable HTML <iframe> element in the regular item page that has the correct height & width attributes to accommodate the item(s) to be embedded
This last point is where the math comes in. Take a single image item, for example. With a landscape-orientation image we need to give the user a different <iframe> height to copy than we would for a portrait. It gets even more complicated when we have to account for multiple tracks of audio or video, or combinations of the two.
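The sizing math above can be sketched in a few lines. This is a hypothetical illustration (the helper name, the 600px default width, and the attribution-box height are all made up for the example, not taken from the actual Duke Digital Collections codebase): scale the media to the iframe width, then reserve a fixed strip for the scrolling attribution box.

```ruby
# Hypothetical sketch of the embed iframe-height calculation.
# Names and constants below are illustrative, not from the real codebase.
ATTRIBUTION_HEIGHT = 80 # px reserved for the fixed-height attribution box

# Given an item's pixel dimensions and the iframe width we offer,
# return an iframe height that preserves the media's aspect ratio.
def embed_iframe_height(item_width, item_height, iframe_width: 600)
  media_height = (iframe_width * item_height.to_f / item_width).round
  media_height + ATTRIBUTION_HEIGHT
end

embed_iframe_height(3000, 2000)  # landscape image => 480
embed_iframe_height(2000, 3000)  # portrait image  => 980
```

A portrait image yields a taller iframe than a landscape one at the same width, which is exactly why the copyable snippet can't use one fixed height for every item.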
We’ll refine this feature a bit in the coming weeks, and work out any embed-bugs we discover. We’ll also be developing a similar feature for embedding digitized content found in our archival collection guides.
The motion picture films in the H. Lee Waters Collection play out a history of North Carolina (and Virginia, and South Carolina) in the late 1930s and early 1940s unparalleled in scope and vision. But what would eventually become such a grand gift to the citizens and scholars and artists of the region did not begin with that in mind. Like fellow commercial photographer and North Carolinian Hugh Mangum, Waters might be considered an accidental documentarian, taking to the road in the depths of the Depression as a resourceful businessman, filling theatre seats with audiences who paid to see themselves in the movies. And yet, a natural behind the camera, Waters knew composition and how to frame a shot; more importantly, he knew people, loved to be around them, and could draw from his subjects positive reactions to this unexpected man with a camera, outside the mill, on main street, in front of the school, in the shop. As Waters biographer and documentarian Tom Whiteside has noted, Waters’ quick-cut aesthetic managed the immediate goal of getting as many townsfolk into the movie as possible while achieving, in the long-term, an archive of still image frames that is vast in its scope and ripe for investigation. From this perspective, the vernacular of his art puts him in the company of the prominent documentary photographers of his day.
Waters used reversal film, and the film he projected was the same film he shot in the camera, edited for length and his beloved special effects. He worked quickly, didn’t make copies, and after coming off the road in 1942 shelved the films until, later in life, he started selling them to their respective communities. Duke’s collection of H. Lee Waters films therefore owes a debt to the towns, libraries, and historical societies who over the years have sent, and continue to send, Waters’ legacy to Duke, recognizing that centralizing these resources works in favor of the region’s cultural heritage. It also means that over the years Duke has accrued film in all conditions and states of preservation. There is film in the collection that is literally turning to dust; there is also beautiful Kodachrome that could have been shot yesterday. Since 1988, too, audiovisual preservation has changed dramatically. Thankfully, and with the help of the National Film Preservation Foundation, a substantial number of the films have received full film-to-film preservation; nevertheless, earlier, heroic attempts at saving some films to videotape, some formulations of which are now severely degrading, have left us in a few cases with only a blurred shadow of what must have been on that original film. So our digital project reflects the films and their creator, but also the history of the collection at Duke.
Many at Duke Libraries have made the Waters collection what it is today, and those of us working on bringing the films online build on the efforts of librarians, archivists, and technical staff who were as passionate about these movies as we are. Ever in transition, the collection is marked by growth, an element that we see as integral to the website. In fact we are already adding to it. In addition to the films and (for some of them) shotlists, there are oral history interviews with the children of H. Lee Waters. Tom Waters and Mary Waters Spaulding have not only been essential in bringing their father’s films online, they have a unique perspective on a talented man whose contribution to the history of North Carolina was only beginning to be appreciated when he died in 1997. Waters’ home movies will be added to the site soon, and we anticipate presenting select work inspired by the Waters films, because, in addition to their own sublime artistry, the movies remain a magnet for artists and documentarians mining archival sources. One such work will debut March 20 for Duke Performances, as Jenny Scheinman premieres her work “Kannapolis: A Moving Portrait,” based around film from the collection.
Of course, we also hope the site might draw other Movies of Local People out of hiding, because while Duke and the State Archives hold a good number of the films, we still don’t know the whereabouts of some of them. So when you visit the site, take advantage of the embed and share functions accompanying each of the videos, use them on your blog or Facebook page, guide people to H. Lee Waters at Duke, and who knows? It may lead them to investigate further, to liberate that can of film that’s been sitting in the closet or biding its time at the local library.
Post Contributed by Craig Breaden, Audiovisual Archivist, David M. Rubenstein Rare Book & Manuscript Library.
The Digital Production Center engages with various departments within the Libraries and across campus to preserve endangered media and create unique digital collections. We work especially closely with The Rubenstein Rare Book, Manuscript, & Special Collections Library, as they hold many of the materials that we digitize and archive on a daily basis. This collaboration requires a shared understanding of numerous media types and their special characteristics; awareness of potential conservation and preservation issues; and a working knowledge of digitization processes, logistics, and limitations.
In order to facilitate this ongoing collaboration, we recently did a semester-long cross-training course with The Rubenstein’s Reproductions Manager, Megan O’Connell. Megan is one of our main points of contact for weekly patron requests, and we felt that this training would strengthen our ability to navigate tricky and time-sensitive digitization jobs heading into the future. The plan was for Megan to work with all three of our digitization specialists (audio, video, & still image) to get a combination of hands-on and observational learning opportunities.
Still image comprises the bulk of our workload, so we decided to spend most of the training on these materials. “Still image” includes anything that we digitize via photographic or scanning technology, e.g. manuscripts, maps, bound periodicals, posters, photographs, slides, etc. We identified a group of uniquely challenging materials of this type and digitized one of each for hands-on training, including:
Bound manuscript – Most of these items cannot be opened more than 90 degrees. We stabilize them in a custom-built book cradle, capture the recto sides of the pages, then flip the book and capture the verso sides. The resulting files then have to be interleaved into the correct sequence.
Map, or other oversize item – These types of materials are often too large to capture in one single camera shot. Our setup allows us to take multiple shots (with the help of the camera being mounted on a sliding track) which we then stitch together into a seamless whole.
Item with texture or different item depths, e.g. a folded map, tipped into a book – It is often challenging to properly support these items and level the map so that it is all in focus within the camera’s depth of field.
ANR volume – These are large, heavy volumes that typically contain older newspapers and periodicals. The paper can be very fragile and they have to be handled and supported carefully so as not to damage or tear the material.
Item with a tight binding with text that goes into the gutter – We do our best to capture all of the text, but it will sometimes appear to curve or disappear into the gutter in the resulting digital image.
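The interleaving step from the bound-manuscript workflow above can be sketched in code. This is an illustrative example, not the DPC's actual tooling, and it assumes the verso run is captured back-to-front after the flip (a plausible but hypothetical convention):

```ruby
# Illustrative sketch of interleaving two capture runs into reading order.
# Assumption: rectos are shot front-to-back, then the volume is flipped
# and versos are shot back-to-front, so the verso list must be reversed.
def interleave_captures(rectos, versos)
  rectos.zip(versos.reverse).flatten.compact
end

rectos = ["001r.tif", "002r.tif", "003r.tif"]
versos = ["003v.tif", "002v.tif", "001v.tif"] # shot in reverse after the flip
interleave_captures(rectos, versos)
# => ["001r.tif", "001v.tif", "002r.tif", "002v.tif", "003r.tif", "003v.tif"]
```

However the capture order is arranged in practice, the point is the same: two physically convenient shooting passes have to be merged back into the book's logical page sequence before publication.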
Working through this list with Megan, I was struck by the diversity of materials that we collect and digitize. The training process also highlighted the variety of tricks, techniques, and hacks that we employ to get the best possible digital transfers, given the limitations of the available technology and the materials’ condition. I came out of the experience with a renewed appreciation of the complexity of the digitization work we do in the DPC, the significance of the rare materials in the collection, and the excellent service that we are able to provide to researchers through the Rubenstein Library.
The following is a series of loosely linked stories, loosely based on our digital collections, and loosely related to the holidays, where even the word “loosely” is applied with some looseness.
An American looking forward to baking delicious treats for the holidays in 1942 would have been intimately familiar with War Ration Book One. The Office of Price Administration issued Ration Order No. 3 in April of that year, and distributed the ration books via elementary schools in the first week of May. Holders could purchase one pound of sugar every two weeks between May 5 and June 27. By the end of the year, butter, coffee, and other foods joined the list of regulated goods.
As the holidays approached, the newspapers ran articles advising homemakers how to cope with the unavailability of key ingredients. Vegetable shortening could help stretch butter, molasses made cookies prone to burning, and fruit juice was a natural sweetener. The New Orleans Times-Picayune’s “Round Table Talk About Food” exhorted homemakers to make the best of it:
There is something stimulating this approaching holiday time in planning Christmas meals and gift packages or baskets with those substitute items we are permitted to use, rather than with the usual abundance of foods to suit every whim of the appetite.
I wrote before about how YMCA missionaries took basketball overseas after its invention, including to Japan. Did they also take Santa beards to China?
The Office of Price Administration provided Duke Law alum Richard Nixon with his first job in Washington, beginning in January of 1942. Rubber was his area of focus. He was industrious and diligent in his work, and by March, had been promoted to “acting chief of interpretations in the Rubber Branch.”
But the life of a government regulator was not to be for Dick Nixon. He joined the Navy in August, and by year’s end found himself serving at an airfield in Ottumwa, Iowa.
True story. Terry Sanford spent December 21 and 22 of 1944 riding in a convoy that took the 1st Battalion of the 517th Parachute Regimental Combat Team from Soissons, France to the town of Soy in Belgium. His unit fought Germans for the next few days, losing more than a hundred men, in the conflagration that became known as the Battle of the Bulge.
They were able to sleep on Christmas Eve. On Christmas morning, there was roasted turkey, but at noon orders came to take a hill, which they did. The next day, they held it, repelling a German counterattack.
In the action, Sanford tackled a German officer, disarmed him, and drove him off for interrogation. Years later, he speculated that the man was probably shot before being processed as a POW, as retaliation against a recent massacre of American troops.
Sanford would write home that “things are going well in this country,” and they had “[m]ore food than elsewhere,” without explaining why there was more to go around.
To doctor’s for sinus treatment, then down on Ginza shopping for Christmas presents: perfume for Umeko, a dog purse for Eiko, cookies for Mrs. Natsuzoe, toys for Mineko san and Masao chan. Lunch of fried oysters and fresh strawberries in Olympic Grille.
Brought Eiko home for her first Christmas. Tried to tell her the Christmas Story, but my limited knowledge of Japanese and her excitement made direct teaching impossible. Was up wrapping presents till almost mid-night.
Variant spellings of Hanukkah occur three times in the OCR text of our 1960s Duke Chronicle collection. (Given OCR's imprecision, the actual number may be higher.)
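Searching OCR text for a word with this many romanizations is easier with a single pattern than with repeated exact-match searches. A small illustrative sketch (the pattern is my own, covering common variants like "Hanukkah", "Hanukah", "Chanukah", and "Hannukah"):

```ruby
# Illustrative regex for counting variant spellings in noisy OCR text.
# Matches Hanukkah / Hanukah / Chanukah / Hannukah and plurals.
HANUKKAH_PATTERN = /\b[CcHh]h?ann?uk+ahs?\b/

text = "It is now Hanukah, 'the festival of lights.' Chanukah begins at sundown."
text.scan(HANUKKAH_PATTERN).length  # => 2
```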
A photograph on the front page of the December 17, 1968, issue depicts a Star of David hanging on the side of a building. The caption reads, “It is now Hanukah, ‘the festival of lights.’”
At the top of that page, the lead story of the issue is headlined, “X-Mas amnesty asked for draft dodgers.” It reports that the “cabinet of the YMCA” at Duke had resolved to write to President Johnson on the matter. Of course, Johnson was a lame duck by then. That same day, the New York Times reported that he had spent an hour conferring with John Mitchell, Nixon’s incoming Attorney General.
WIND SONG USES HANDSOME “SANTA” TO BOOST PRE-CHRISTMAS SALES
BOB HALDEMAN NAMED MANAGER OF THE LOS ANGELES OFFICE
Eight years later, the newsletter noted the appointments of Haldeman and his subordinate, Ron Ziegler, to the White House staff. Haldeman would serve as Nixon’s Chief of Staff, and later did 18 months in prison for his role in the Watergate coverup. Ziegler became White House Press Secretary.
On November 17, 1967 – the Friday before Thanksgiving – the Chronicle ran a story about Terry Sanford and his newly published book, Storm over the States. He started writing Storm soon after leaving the NC governor’s office in 1965. Supported by the Ford Foundation and the Carnegie Corporation, he holed up in an office at Duke, hired a staff, and wrote about a model for state government that is federalist but proactive and constructive. Sanford’s point of view stood in stark, if unmentioned, contrast to the doctrines of “nullification” and states’ rights that segregationists like George Wallace wielded in their opposition to Civil Rights.
That Friday ended a tumultuous week at Duke. The lead story that same day was headlined “Knight bans use of segregated facilities by student groups.” The school’s president, Douglas Knight, had “re-interpreted” a university policy statement prohibiting the use of off-campus facilities that discriminated on the basis of race. Knight extended the policy, purported to apply to staff and faculty organizations, to include student organizations as well.
The previous week, the student body had defeated a referendum that would have had the same effect. Black students reacted by staging a sit-in at Knight’s office (one holding a sign that read “Students Await An Overdue Talk With Our WHITE KNIGHT”), demanding that he take action. Knight acceded, complaining in his statement that “the application of this practice would have been made in the normal course of events,” but “we were confronted with an ultimatum, which carried with it a threat of disruption of the ordinary processes of the University.”
Confrontations between the administration and black students continued to escalate, leading to the Allen Building takeover in 1969, Knight’s resignation, and his succession by Terry Sanford.
Mary McMillan’s journals stretch from 1939 to 1991. Her “1939” journal actually contains entries from the 1940s, though there are significant gaps. In October of 1942 she wrote of traveling to Delta, Utah. She didn’t mention her purpose, and no other entries appear from that period, but she was heading to Utah to teach in the Topaz Relocation Center, an internment camp for Japanese Americans. En route, she wrote:
Those nice Marine recruits who got on our train in St. Louis shouldn’t be required to go so far from home to fight for objectives that seem to me not to be in keeping with United Nations Aims, as given in the Atlantic Charter. Why should Japan “be crushed”? The military mind there – and elsewhere – must be forced from power; but are we on the right track towards achieving that objective? I fear most of us have become too material-minded. By following methods resulting from our materialistic thinking, we only create atmospheres for other hostile “spiritual” forces – like Naziism.
Then, the week of Christmas in 1947 she traveled to Seattle, and embarked on a return to Japan. She arrived in Hiroshima in January, the first Christian missionary to return after the war. She lived there for more than thirty years before retiring.
On page 4 of the December 17, 1967 issue of the Chronicle – just below the article about Terry Sanford’s Storm Over the States – this ad ran. I have to believe at least a few students and faculty got the book or the record as stocking stuffers that year.
Writing software for the web has come a long way in the past decade. Tools such as Ruby on Rails, Twitter Bootstrap, and jQuery provide standard methods for solving common problems and providing common features of web applications. These tools help developers use more of their time and energy to solve problems unique to the application they are building.
One of the costs of relying on libraries and frameworks to build software is that your project not only depends on the frameworks and libraries you’ve chosen to use, but these frameworks and libraries rely on still more components. Compounding this problem, software is never really finished. There are always bugs being fixed and features being changed and added. These attributes of software, that it changes and has dependencies, complicate software projects in a few different ways:
You must carefully manage the versions of libraries, frameworks, and other dependencies so that you ensure all the pieces work together.
Developers working on multiple projects may have to use different versions of the same libraries and packages for each of their projects.
If you’re working with a team of developers on a project all members of the team have to make sure their computers are set up with the correct versions of the software and dependencies of the project.
Thankfully, there are still more tools available to help manage these problems. For instance, Ruby Version Manager (RVM) is a popular tool used by Ruby on Rails developers. It lets the software developer install and switch between different versions of Ruby. Other tools, such as Bundler, make it possible to define exactly what version of which Ruby Gems (Ruby software packages that add functionality to your own project) you need to install for a particular project. Combined, RVM and Bundler simplify the management of complex project dependencies. There are similar tools available for other programming languages, such as Composer, which is a dependency manager for PHP.
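For example, a minimal Gemfile pins the versions Bundler should install for a project (the gems and version numbers below are illustrative):

```ruby
# Gemfile -- Bundler reads this file so that every developer's machine
# ends up with the same dependency versions. (Versions are illustrative.)
source "https://rubygems.org"

gem "rails", "4.2.0"       # pin an exact version
gem "nokogiri", "~> 1.6.5" # pessimistic operator: allows 1.6.x patch updates only
```

Running `bundle install` then resolves and installs exactly these versions, and records the full resolved set in Gemfile.lock so teammates get identical results.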
While many of us already use dependency managers in our work, one tool we haven’t been using that we’re evaluating for use on a new project is Vagrant. Vagrant is a tool for creating virtual machines: self-contained software implementations of a computer system that run within a host operating system. For instance, using a virtual machine I could run Windows on my Mac hardware.
Vagrant does a few things that may make it even easier for developers to manage project dependencies.
With Vagrant you can write a script that contains a set of instructions about what operating system and other software you want to install in a virtual machine. Creating a virtual machine with all the software you need for a given project is then as simple as typing a single command.
Vagrant provides a shared directory between your host operating system and the virtual machine. You can use the operating system you use everyday as you work while the software project runs in a virtual machine. This is significant because it means each developer can continue to use the operating system and tools they prefer while the software they’re building is all running in copies of the exact same system.
You can add the script for creating the virtual machine to the project itself making it very easy for new developers to get the project running. They don’t have to go through the sometimes painful process of installing a project’s dependencies by hand because the Vagrant script does it for them.
A developer working on multiple projects can have a virtual machine set up for each of their projects so they never interfere with each other and each has the correct dependencies installed.
Here’s how to use Vagrant in the most minimal way:
In a terminal window type: vagrant init hashicorp/precise32
After running the following command, you will have downloaded, set up, and started a fully functional virtual machine running Ubuntu: vagrant up
You can then connect to and start using the running virtual machine by connecting to it via SSH: vagrant ssh
In a more complex setup you’d probably add a provisioning script with instructions for downloading and installing additional software as part of the “vagrant up” process. See the Vagrant documentation for more details about provisioning options.
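As an illustration of that more complex setup, a Vagrantfile with an inline provisioning step might look like this (the synced-folder path is Vagrant's default, and the packages installed are a hypothetical example, not from a real project):

```ruby
# Vagrantfile -- checked into the project so "vagrant up" gives every
# developer an identical Ubuntu VM. Provisioning contents are hypothetical.
Vagrant.configure("2") do |config|
  config.vm.box = "hashicorp/precise32" # Ubuntu 12.04 base box

  # Share the project directory with the VM (this is Vagrant's default mapping).
  config.vm.synced_folder ".", "/vagrant"

  # Inline shell provisioner: runs on the first "vagrant up".
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y git build-essential
  SHELL
end
```

Because the Vagrantfile lives in the repository, provisioning changes are versioned alongside the code they support.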
We’re considering using Vagrant on an upcoming project in an effort to make it easier for all the developers on the project to set up and maintain a working development environment. With Vagrant, just one developer will need to spend the time to create the script that generates the virtual machine for the project. This should save the time of other developers on the project who should only have to install VirtualBox, copy the Vagrant file and type “vagrant up.” At least, that’s the idea. Vagrant has great documentation, so if you’re interested in learning more their website is a good place to start.
In late October of this year, the Digital Production Center (along with many others in the Library) was busy developing budgets for FY 2015. We were asked to think about the needs of the department, where the bottlenecks were, and possible new growth areas. We were asked to think big. The idea was to develop a grand list and work backwards to identify what we could reasonably ask for. While the DPC is able to digitize many types of materials and formats, such as audio and video, my focus is specifically still image digitization. So that’s what I focused on.
We serve many different parts of the Library, and in order to accommodate a wide variety of requests, we use many different types of capture devices in the DPC: high-speed scanners, film scanners, overhead scanners and high-end cameras. The most heavily used capture device is the Phase One camera system. This camera system uses a P65 60 MP digital back with a 72mm Schneider flat field lens. This enables us to capture high quality images at archival standards. The majority of material we digitize using this camera is bound volumes (most of them rare books from the David M. Rubenstein Library), but we also use this camera to digitize patron requests, which have increased significantly over the years (everything is expected to be digital, it seems), oversized items, glass plate negatives, high-end photography collections and much more. It is no surprise that this camera is a bottleneck for still image production. In researching cameras to include in the budget, I was hard pressed to find another camera system that can compete with the Phase One camera. For over 5 years we have used Digital Transitions, a New York-based provider of high-end digital solutions, for our Phase One purchases and support. We have been very happy with the service, support and equipment we have purchased from them over the years, so I contacted them to inquire about new equipment on the horizon and pricing for upgrading our current system.
One piece of new equipment they turned me on to is the BC100 book scanner. This scanner uses a 100° glass platen and two reprographic cameras to capture two facing pages at the same time. While there are other camera systems that use a similar two-camera setup (most notably the Scribe, Kirtas and Atiz), the cameras and digital backs used with the BC100, as well as the CaptureOne software that drives the cameras, are better suited for cultural heritage reproduction. Along with the new BC100, CaptureOne is now offering a new software package specifically geared toward the cultural heritage community for use with this new camera system. While inquiring about the new system, I was invited to attend a Cultural Heritage Round Table event that Digital Transitions was hosting.
This round table focused on the new CaptureOne software for use with the BC100 and the specific needs of the cultural heritage community. I have always found the folks at Digital Transitions to be very professional, knowledgeable and helpful. The event they put together included Jacob Frost, Application Software R&D Manager for Phase One; Doug Peterson, Technical Support, Training and R&D at Digital Transitions; and Don Williams, Imaging Scientist at Image Science Associates. Don also serves on the Still Image Digitization Advisory Board of the Federal Agencies Digitization Guidelines Initiative (FADGI), a collaborative effort by federal agencies to define common guidelines, methods, and practices for digitizing historical content. They talked about the new features of the software, the science behind the software and its color technology, and new information about the FADGI Still Image standard that we currently follow at the Library.

I was impressed by the information provided and the knowledge shared, but what impressed me most was that the main reason Digital Transitions pulled this particular group of users and developers together was to ask us what the cultural heritage community needed from the new software. WHAT!? What we need from the software? I’ve been doing this work for about 15 years now, and I think that’s the first time any software developer from any digital imaging company has asked our community specifically what we need. Don’t get me wrong, there is a lot of good software out there, but usually the software comes “as is.” While it is fully functional, I usually have to rely on work-arounds to get the software to do what I need it to do. We, as a community, spent about an hour drumming up ideas for software improvements and features.
While we still need to see follow-through on what we talked about, I am hopeful that some of the features we talked about will show up in the software. The software still needs some work to be truly beneficial (especially in post-production), but Phase One and Digital Transitions are definitely on to something.
This fall we changed the default tool that students and faculty use to research library holdings. We have tools that work well for a broad search and tools that are tailored for more specialized research. So, how is this change working out?
We’ve got numbers and we’ve got opinion. First, let’s look at the numbers.
The most used feature on the Duke Libraries website is the search box on the homepage: 211,655 searches were performed using the default “All” tab between August 25 and November 16, 2014.
Within the “All” tab search results, patrons selected results from Articles 48% of the time, results from Books & Media 44% of the time, and other results 8% of the time. These results were presented side by side on a single results page.
The “All” search isn’t the only option on our homepage: the Books & Media tab was used 68,566 times and the Articles tab 46,028 times during the same timeframe.
The five most used search terms were PubMed, Web of Science, JSTOR, RefWorks, and Dictionary of National Biography.
The most frequently searched for fictional character was Tom Sawyer.
The most searched for person was Dr. Martin Luther King, Jr.
So what thoughts have you shared with us about the search options we provide?
During the first four weeks of the semester, 48 people submitted their opinions through a survey linked from the search results page.
Thirty percent of survey respondents said that they liked having the Articles and Books & Media results appear side by side on the new search results page.
Twenty-seven percent said they thought the page looked cluttered or that it was hard to read.
Twenty-five percent said that they did not know that they can choose a more highly-focused search option from the Search & Find menu.
Testing with a small group of researchers revealed that it was difficult to locate material from the Rubenstein Library using our default search results screen.
Based on your feedback, we made the following improvements to the search results page during the semester.
We de-cluttered the information shown in the Articles and Books & Media columns to make results easier to read.
We moved “Our Website” results to the top of the right column.
We reduced the space used by the search box on the results page.
In the coming months, we will explore ways of making it easier to find materials from the Rubenstein Library and from University Archives. We are also investigating options for implementing a Best Bets feature on the results page; this would provide clearer access to some of the most used resources.
Back in October, Molly detailed DigEx’s work on creating an exhibit for the Link Media Wall. We’ve now finalized our content and hope to have the new exhibit published to the large display in the next week or two. I’d like to detail how this thing is actually put together.
I broke the content into five main sections: the masthead (which holds the branding), the navigation (which highlights the current section and construction period), the map (which shows the location of the buildings), the thumbnail (which shows the completed building and adds some descriptive text), and the images (which house a set of cross-fading historic photos illustrating the progression of construction). Working with a fixed-pixel layout feels strange in the modern world of web development, but it’s quick and satisfying to crank out. I’m using the jQuery Cycle plugin to transition the images; it’s lightweight and offers lots of configurable options. I also overlaid the rotating images with a transparent PNG containing a gradient that fades to the background color.
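The image rotation described above might be configured along these lines; the selector, effect, and timing values below are illustrative assumptions, not the exhibit's actual settings.

```javascript
// A minimal sketch of jQuery Cycle options for the rotating historic photos.
// All values here are assumptions; the exhibit's real settings may differ.
var cycleOptions = {
  fx: 'fade',      // cross-fade between photos
  speed: 1500,     // fade duration in milliseconds
  timeout: 6000,   // how long each photo stays on screen
  pause: 0         // no hover-pause on a non-interactive wall display
};

// In the exhibit page this would be applied to the images container, e.g.:
// $('.images').cycle(cycleOptions);
```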
Another part of the puzzle I wrestled with was how to transition from one section of the exhibit to another. I thought about housing all of the content on a single page and using some JS to move from one to the next, but I was a little worried about performance so I again opted for the super simple solution. Each page has a meta refresh in the header set to the number of seconds that it takes to cycle through the corresponding set of images and with a destination of the next section of the exhibit. It’s a little clunky in execution and I would probably try something more elegant next time, but it’s solid and it works.
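The meta-refresh hand-off can be sketched as follows; the helper function, slide count, and filename are hypothetical, but the idea is that each section's refresh delay must match the time its image cycle takes to complete.

```javascript
// Hypothetical helper: compute the meta-refresh delay (in whole seconds)
// for a section, given how many images it cycles through and how long
// each slide is displayed in milliseconds.
function refreshDelaySeconds(imageCount, slideTimeoutMs) {
  return Math.ceil(imageCount * slideTimeoutMs / 1000);
}

// A section with 8 slides shown 6 seconds each needs a 48-second delay,
// so its page head would carry a tag like (filename is illustrative):
// <meta http-equiv="refresh" content="48;url=section-2.html">
var delay = refreshDelaySeconds(8, 6000); // 48
```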
Here’s a preview of the exhibit cycling through all of the content. It’s been time-compressed; the actual exhibit will take about ten minutes to play through.
In a lot of ways this exhibit is an experiment in both process and form, and I’m looking forward to seeing how our vision translates to the Media Wall space. Using such simple code means that if there are any problems, we can quickly make changes. I’m also looking forward to working on future exhibits and helping to highlight the amazing items in our collections.
Notes from the Duke University Libraries Digital Projects Team