All posts by Susan Ivey

Institutional multispectral imaging (MSI) survey results

Over the last year and a half, we’ve blogged quite a bit about our exploration of multispectral imaging (MSI) here at Duke Libraries. We’ve written about the hardware and software that we acquired in 2016 and about our collaboration and training to help us learn how to use this new equipment. We’ve shared examples from our experiments with MSI on various material and ink types and talked about how MSI can help inform conservation treatment and help uncover faded or hidden text.

Periodically, we’ve also mentioned our Duke MSI group. This group is a cross-departmental team of individuals who each bring critical expertise to the table to ensure that our MSI projects run smoothly and effectively. This past year, our team has created various best practices and workflows, tackled equipment and software hurdles, determined current resource capacity, imaged a number of materials from Duke’s collections, and documented data modeling scenarios. We’re really excited about and proud of the work that we’ve accomplished together.

Many times, though, we’ve talked about how helpful it would be to talk with other institutions that are doing MSI, particularly cultural heritage institutions and academic libraries, in order to learn how they develop their skills and organize their services. Luckily, the University of Manchester, an institution that has the same MSI system as ours, expressed interest to our shared vendor, R. B. Toth Associates, in collaborating with us. We had a very informative conference call with them in May 2017, during which we discussed their program, services, and technical details. We also shared documentation after the call, which has helped us refine our project request procedures and the deliverables that we provide to requesters at the end of projects.

Since that collaboration proved so beneficial, we thought it would be helpful to reach out to other institutions that undertake MSI projects. Though it is difficult to determine exact numbers online, it is clear that very few cultural heritage organizations and academic libraries host their own MSI system. Therefore, we teamed up with R. B. Toth Associates again to determine what other institutions they’d worked with that may be interested in collaborating. They shared six institutions worldwide, and we created a brief online survey about the topics that we’d discussed with the University of Manchester, which included organization and staffing, time devoted to MSI, prioritization, project deliverables, and data modeling and access. Five of the six institutions thought that the survey applied to their current MSI setup and therefore completed the survey, and all six expressed some level of interest in future collaboration. The institutions that completed the survey are the Library of Congress, the Museums of New Mexico-Conservation, the University of Pennsylvania Libraries, KU Leuven Libraries, and the University of Copenhagen.

Out of the institutions that we’ve spoken with or surveyed (six in total, including the University of Manchester), we can determine that three have permanent MSI systems (the Library of Congress, the Museums of New Mexico, and the University of Manchester). One institution’s survey results were unclear about owning its own system, and the other two institutions noted that they use MSI systems on an ad-hoc, project-by-project basis. One of those institutions indicated that they are in the initial stages of considering the purchase of a permanent system, and the institution that declined to take the survey stated that it did so because it does not own its own system, due to costs. Some institutions charge a fee for MSI projects, while others don’t. One institution that currently does not charge expressed interest in creating a cost recovery model.

Among the institutions that own a permanent MSI system, the Library of Congress is the only one that has a full-time employee devoted solely to MSI. The Museums of New Mexico have one employee who serves as the main PI for MSI, though the entire Conservation department is trained on the equipment and software. Their department devotes approximately 20 hours per week to MSI, depending on the project. The University of Manchester has two imagers who devote one day per month to imaging and process the images intermittently. The other institutions, without permanent systems, use a combination of trained staff, scholars, imaging scientists, object handlers and conservators, metadata and data managers, imaging technicians, and project managers based on specific project needs.

It appears that most institutions image their own collections for MSI projects, though one respondent indicated that they also partner with other conservation institutions for their research needs. Prioritization processes for MSI projects vary from institution to institution. The Library of Congress is the exception: its department does only MSI projects, and therefore requests do not need to be prioritized against other digitization requests. The Museums of New Mexico prioritize on a first-come, first-served basis, as long as the project is a good fit for MSI. The University of Manchester also has a pre-project process with conservation staff to determine if the project is a good fit for MSI. KU Leuven Libraries prioritizes based on scope, indicating that the priority is documentary heritage with a strong focus on illuminated manuscripts, but will service other research if time permits. Additional prioritization strategies include availability of trained staff and scholars.

The institutions have similar goals for MSI projects based on whether a project is internal (for conservation or acquisition purposes) or external (research requested). These include pure documentation, establishing a condition baseline, forensic examination and reconstruction, pre-purchase examination, collecting spectral responses for preservation, revealing undertext for transcription, and revealing undertext for text identification.

Deliverables and preservation copies vary by institution as well. Three of the institutions always include some type of report with the files. KU Leuven provides multiple reports, including a technical report about infrastructure, an analysis report, and an imaging methodology. Two of the institutions provide the raw and processed images, two provide only the processed images (as JPEGs or TIFFs), and two state that the deliverables vary by project. The University of Pennsylvania is the only institution that noted that it provides metadata, often including transcriptions or descriptive and structural metadata, packaged in a preservation-ready package. One respondent replied that they retain raw data offline but place processed captures into a digital asset management system (presumably for online access), and another noted that some projects have arranged preservation of their data packages through third parties that offer dark or accessible digital repositories. Every institution retains some or all of their MSI data internally.

This is just a brief summary of what we learned through this process—we were lucky to receive a lot of data and comments! We are very pleased with and thankful for their willingness to share. As noted earlier, all have expressed an interest in some level of future collaboration, so we’re hopeful to build a network of institutions that can provide feedback and advice to one another for future MSI needs. To all of our fellow MSI institutions, we thank you again for your participation! Please let me know if I misrepresented any of your information. We received a lot of great qualitative responses, and though I did my best to communicate those correctly, I may have misunderstood some responses.

Pink Squirrel: It really is the nuts

During the last eight months that I’ve worked at Duke, I’ve noticed a lot of squirrels. They seem to be everywhere on this campus, and, not only that, they come closer than any squirrels that I’ve ever seen. In fact, while we were working outside yesterday, a squirrel hopped onto our table and tried to take an apple from us. It’s become a bit of a joke in my department, actually. We take every opportunity we can to make a squirrel reference.

Anyhow, since we talk about squirrels so often, I decided I’d run a search in our digital collections to see what I’d get. The only image returned was the billboard above, but I was pretty happy with it. In fact, I was so happy with it that I used this very image in my last blog post. At the time, though, I was writing about what my colleagues and I had been doing in regards to the new research data initiative since the beginning of 2017, so I simply used it as a visual to make my coworkers laugh. However, I reminded myself to revisit and investigate. Plus, although I bartended for many years during grad school, I’d never made (much less heard of) a Pink Squirrel cocktail. Drawing inspiration from our friends in Rubenstein Library that write for “The Devil’s Tales” in the “Rubenstein Library Test Kitchen” category, I thought I’d not only write about what I learned, but also try to recreate it.

This item comes from the “Outdoor Advertising Association of America (OAAA) Archives, 1885-1990s” digital collection, which includes over 16,000 images of outdoor advertisements and other scenes. It is one of a few digital outdoor advertising collections that we have, which were previously written about here.

This digital collection houses six Glenmore Distilleries Company billboard images in total: two are for liquors (a bourbon and a gin), and four are for “ready-to-pour” Glenmore cocktails.

These signs indicate that Glenmore Distilleries Company created a total of 14 ready-to-pour cocktails. I found a New York Times article from August 19, 1965 in our catalog stating that Glenmore Distilleries Co. had expanded its line to 18 drinks, which means that the billboards in our collection must pre-date 1965. The company’s president, Frank Thompson Jr., was quoted as saying that he expected “exotic drinks” to account for any future surge in sales of bottled cocktails.

OK, so I learned that Glenmore Distilleries had bottled a drink called a Pink Squirrel sometime before 1965. Next, I needed to do some research on the Pink Squirrel itself. Had Glenmore created it? What was in it? Why was it PINK?

It appears the Pink Squirrel was quite popular in its day and has risen and fallen in popularity in the decades since. I couldn’t find a definitive academic source, but if one trusts Wikipedia, the Pink Squirrel was first created at Bryant’s Cocktail Lounge in Milwaukee, Wisconsin. The establishment still exists, and its website states that the original bartender, Bryant Sharp, is credited with inventing the Pink Squirrel (also the Blue Tail Fly and the Banshee, if you’re interested in cocktails). Wikipedia lists 15 popular culture references for the drink, many from 90s sitcoms (I’m a child of the 80s but don’t remember this) and other more current references. I also found an online source saying it was popular on the New York cocktail scene in the late 70s and early 80s (maybe?). Our Duke catalog returns some results as well, including articles from Saveur (2014), New York Times Magazine (2006), Restaurant Hospitality (1990), and Cosmopolitan (1981). These are mostly variations on the recipe, including cocktails made with cream, a cocktail made with ice cream (Saveur says “blender drinks” are a cherished tradition in Wisconsin), a pie(!), and a cheesecake(!!).

Armed with recipes for the cream-based and the ice cream-based cocktails, I figured I was all set to shop for ingredients and make the drinks. However, I quickly discovered that one of the three ingredients, crème de noyaux, is a liqueur that is not made in large quantities by many companies anymore, and proved impossible to find around the Triangle. However, it’s an important ingredient in this drink, not only for its nutty flavor, but also because it’s what gives it its pink hue (and obviously its name!). Determined to make this work, I decided to search to see if I could come up with a good enough alternative. I started with the Duke catalog, as all good library folk do, but with very little luck, I turned back to Google. This led me to another Wikipedia article for crème de noyaux, which suggested substituting Amaretto and some red food coloring. It also directed me to an interesting blog about none other than crème de noyaux, the Pink Squirrel, Bryant’s Cocktail Lounge, and a recipe from 1910 on how to make crème de noyaux. However, with time against me, I chose to sub Amaretto and red food coloring instead of making the 1910 homemade version.

First up was the cream-based cocktail. The drink contains 1.5 ounces of heavy cream, .75 ounces of white crème de cacao, and .75 ounces of crème de noyaux (or Amaretto with a drop of red food coloring), and is served up in a martini glass.

The result was a creamy, chocolatey flavor with a slight nuttiness, and just enough sweetness without being overbearing. The ice cream version replaces the heavy cream with half a cup of vanilla ice cream and is blended rather than shaken. It had a thicker consistency and was much sweeter. My fellow taster and I definitely preferred the cream version. In fact, don’t be surprised if you see me around with a pink martini in hand sometime in the near future.

More MSI Fun: Experimenting with iron gall ink

Submitted on behalf of Erin Hammeke

For conservators, one of the aspects of having the MSI system that excites us most is being able to visualize and document the effects of the treatments that we perform. Although we are still learning the ropes with our new system, we had a recent opportunity to image some iron gall ink documents. Iron gall ink is a common historic ink that reacts with moisture in the environment to form acidic complexes that spread and sink into the paper, weakening the paper and, in some cases, leaving holes and losses. This iron gall ink degradation can be better visualized with MSI, since the beginning stage, haloing, is not always visible under normal illumination. Look here for more information on iron gall ink damage and here for using MSI to document iron gall ink condition and treatment. We also illustrated the haloing effect of iron gall ink damage using MSI on Jantz MS #124 in a previous post.

Recently, DUL conservators experimented with treating some discarded iron gall ink manuscripts with a chemical treatment that aims to arrest the ink’s degradation. This treatment requires submerging the manuscripts in a calcium phytate solution – a chemical that bonds with free iron (II) ions, stabilizing the ink and preventing it from corroding further. The document is then rinsed in a water bath and an alkaline reserve is applied. Resizing with gelatin is another common step, but we did not resize our test manuscripts.

Since these were discarded test materials, we were able to cut the manuscripts in two and treat only one half. Imaging the manuscripts with MSI revealed some notable findings.

Most of the treated papers now appear lighter and brighter under normal illumination because they have been washed. However, the untreated halves exhibited pronounced UV-induced visible fluorescence around the 488 nm range, and the treated halves did not. We believe this difference likely has to do with washing the paper substrate and rinsing out degradation products, or perhaps paper size that may fluoresce at this wavelength. We were happy to see that, for a treatment that targets the ink, there was very little noticeable difference in the appearance of the inks between untreated and treated portions of the test manuscript. There was some reduction in the “ink sink” (ink visible from the opposite side of the manuscript) and a very slight softness to the edges of the ink in the treated sample, but these changes were very minimal. We look forward to imaging more of our test manuscripts in the future and seeing what else we can learn from them.


Want to learn even more about MSI at DUL?

The Research Data Team: Hitting the Ground Running

There has been a lot of blogging over the last year about the Duke Digital Repository’s development and implementation, about its growth as a platform and a program, and about the creation of new positions to support research data management and curation. My fellow digital content analyst also recently posted about how we four new hires have been creating and refining our research data curation workflow since beginning our positions at Duke this past January. It’s obviously been (and continues to be) a very busy time here for the repository team at Duke Libraries, including both seasoned and new staff alike.

Besides the research data workflows between our two departments, what other things have the data management consultants and the digital content analysts been doing? In short, we’ve been busy!


In addition to envisioning stakeholder needs (which is an exercise we continuously do), we’ve received and ingested several data collections this year, which has given us an opportunity to also learn from experience. We have been tracking and documenting the types of data we’re receiving, the various needs that these types of data and depositors have, how we approach these needs (including investigating and implementing any additional tools that may help us better address them), how our repository displays the data and associated metadata, and the time spent on our management and curation tasks. Some of these documents are in the form of spreadsheets, others are draft policies that will be reviewed first by the library’s research data working group and then by a program committee, and others are simply brain dumps for things that require further, more structured investigation by developers, the metadata architect, subject librarians, and other stakeholders. These documents live in either our shared online folder or our shared Box account, and, if a wider Duke Libraries and/or public audience is required, are moved to our departments’ content collaboration software platforms (currently Confluence/Jira and Basecamp). The collaborative environments of these platforms support the dynamic nature of our work, particularly as our program takes form.

We also value the importance of face-to-face discussions, so we hold weekly meetings to talk through all of this work (we prefer outside when the weather is nice, and because squirrels are awesome).

One of the most exciting, and at times challenging, aspects of where we are is that we are essentially starting from the ground up and therefore able to develop procedures and features (and re-develop, and on and on again) until we find fits that best accommodate our users and their data. We rely heavily on each other’s knowledge about the research data field, and we also engage in periodic environmental scans of other institutions that offer data management and curation services.

When we began in January, we all considered the first 6-9 months a “pilot phase”, though this description may not be accurate. In the minds of the data management consultants and the digital content analysts, we’re here and ready. Will we run into situations that require an adjustment to our procedures? Absolutely. It’s the nature of our work. Do we want feedback from the Duke community about how our services are (or are not) meeting their needs? Without a doubt. And will the DDR team continue to identify and implement features to better meet end-user needs? Certainly. We fully expect to adjust and readjust our tools and services, with the overall goal of fulfilling future needs before they’re even evident to our users. So, as always, keep watching to see how we grow!