Category Archives: Equipment

U-matic for the People

Duke Libraries has a large collection of analog videotapes, in several different formats. One of the most common in our archives is 3/4″ videotape, also called “U-matic” (shown above). Invented by Sony in 1969, U-matic was the first videotape to be housed inside a plastic cassette for portability. Before U-matic, videotape was recorded on very large reels in the 2″ format known as Quadruplex which required heavy recording and playback machines the size of household refrigerators. U-matic got its name from the shape of the tape path as it wraps around the video head drum, which looks like the letter U.

The VO-3800 enabled TV news crews to record directly to U-matic videotape at breaking news events.

The format was officially released in 1971, and became popular with television stations after the portable Sony VO-3800 video deck was released in 1974. The VO-3800 enabled TV crews to record directly to U-matic videotape at breaking news events, which previously had to be shot with 16mm film. The news content was now immediately available for broadcast, as opposed to film, which had to wait for processing in a darkroom. And the compact videocassettes could easily and quickly be transported to the TV station.

In the 1970s, movie studios also used U-matic tapes to easily transport filmed scenes or “dailies,” such as the first rough cut of “Apocalypse Now.” In 1976, the high-band BVU (Broadcast Video U-matic) version of 3/4″ videotape, with better color reproduction and lower noise levels, replaced the previous “lo-band” version.

The Digital Production Center’s Sony VO-9800P for PAL videotapes (top), and a Sony BVU-950 for NTSC tapes (bottom).

The U-matic format remained popular at TV stations throughout the 1980s, but was eventually supplanted by Sony’s 1/2″ Betacam SP format. The BVU-900 series was the last U-matic product line made by Sony, and Duke Libraries’ Digital Production Center uses two BVU-950s for NTSC tapes, as well as a VO-9800P for tapes in PAL format. A U-matic videotape player in good working order is now an obsolete collector’s item, so these decks can be hard to find and expensive to purchase.

Unfortunately, most U-matic tapes have not aged well. After decades in storage, many of the videotapes in our collection now have sticky-shed syndrome, a condition in which the oxide that holds the visual content is literally flaking off the polyester tape base, and is moist and gummy in texture. When a videotape has sticky-shed, not only will it not play correctly, but the residue can also clog up the tape heads in the U-matic playback deck and then transfer the contaminant to other tapes played afterwards in the same deck.

The DPC’s RTI VT3100 U-matic tape cleaner.

To combat this, we always bake (dehumidify) our U-matic videotapes in a scientific oven at 52 Celsius (about 125 Fahrenheit) for at least 10 hours. Then we run each tape through a specialized tape-cleaning machine, which fast-forwards and rewinds the tape while using a burnishing blade to wipe off any built-up residue. We also clean the video heads inside our U-matic decks before each playback, using denatured alcohol.

Most of the time, these procedures make the U-matic tapes playable, and we are able to digitize them, rescuing the content before the magnetic tape ages and degrades any further. While the U-matic tapes are nearing the end of their lifespan, the digital surrogates will potentially last for centuries to come, and will be accessible online through our Duke Digital Repository, from anywhere in the world.

Mythical Beasts of Audio

Gear. Kit. Hardware. Rig. Equipment.

In the audio world, we take our tools seriously, sometimes to an unhealthy and obsessive degree. We give them pet names, endow them with human qualities, and imbue them with magical powers. In this context, it’s not really strange that a manufacturer of professional audio interfaces would call themselves “Mark of the Unicorn.”

Here at the Digital Production Center, we recently upgraded our audio interface to a MOTU 896 mk3 from an ancient (in tech years) Edirol UA-101. The audio interface, which converts analog signals to digital and vice-versa, is the heart of any computer-based audio system. It controls all of the routing from the analog sources (mostly cassette and open reel tape decks in our case) to the computer workstation and the audio recording/editing software. If the audio interface isn’t seamlessly performing analog to digital conversion at archival standards, we have no hope of fulfilling our mission of creating high-quality digital surrogates of library A/V materials.

Edirol UA-101
The Edirol enjoying its retirement with some other pieces of kit

While the Edirol served us well from the very beginning of the Library’s forays into audio digitization, it had recently begun to cause issues resulting in crashes, restarts, and lost work. Given that the Edirol is over 10 years old and has been discontinued, it was bound to eventually fall behind continued OS and software updates. After re-assessing our needs and doing a bit of research, we settled on the MOTU 896 mk3 as its replacement. The 896 had the input, output, and sync options we needed, along with plenty of other bells and whistles.

I’ve been using the MOTU for several weeks now, and here are some things that I’m liking about it:

  • Easy installation of drivers
  • Designed to fit into standard audio rack
  • Choice of USB or Firewire connection to PC workstation
  • Good visual feedback on audio levels, sample rate, etc. via LED meters on front panel
  • Clarity and definition of sound
MOTU 896mk3
The MOTU sitting atop the audio tower

I haven’t had a chance to explore all of the additional features of the MOTU yet, but so far it has lived up to expectations and improved our digitization workflow. However, in a production environment such as ours, each piece of equipment needs to be a workhorse that can perform its function day in and day out as we work our way through the vaults. Only time can tell if the Mark of the Unicorn will be elevated to the pantheon of gear that its whimsical name suggests!

A New(-ish) Look for Public Computing

Photo of library public computing terminals

Over the past year, you’ve probably noticed a change in the public computing environments in Duke University Libraries. Besides new patron-facing hardware, we’ve made even larger changes behind the scenes — the majority of our public computing “computers” have been converted to a Virtual Desktop Infrastructure (VDI).

The physical hardware that you sit down at looks a little different, with larger monitors and no “CPU tower”:

Close-up photo of a public terminal

What isn’t apparent is that these “computers” actually have NO computational power at all! They’re essentially just a remote keyboard and monitor that connects to a VDI-server sitting in a data-center.
Photo of the VDI server
The end-result is really that you sit down at what looks like a regular computer, and you have an experience that “feels” like a regular computer.  The VDI-terminal and VDI-server work together to make that appear seamless.

All of the same software is installed on the new “computers” — really, virtual desktop connections back to the server — and we’ve purchased a fairly “beefy” VDI-server so that each terminal should feel very responsive and fast.  The goal has been to provide as good an experience on VDI as you would get on “real” computers.

But there are also some great benefits …

Additional Security:
When a patron sits down at a terminal, they are given a new, clean installation of a standard Windows environment. When they’re done with their work, the system will automatically delete that now-unused virtual desktop session, and then create a brand-new one for the next patron. From a security standpoint, this means there is no “leakage” of any credentials from one user to another — passwords, website cookies, access tokens, etc. are all wiped clean when the user logs out.

Reduced Staff Effort:
It also offers some back-end efficiency for the Specialized Computing team. First off, since the VDI-terminal hardware is less complex (it’s not a full computer), the devices themselves have been seen to last 7 to 10 years (vs. 4 years for a standard PC). They can also take quite a beating and remain operational (and while I don’t want to jinx it, some have reportedly survived being fully submerged in water and, once dried out, were fully functional).

Beyond that, when we need to update the operating system or software, we make the change on one “golden image” and that image is copied to each new virtual desktop session. So despite having 50 or more public computing terminals, we don’t spend 50-times as much effort in maintaining them.
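As a thought experiment, the sketch below models the golden-image idea in a few lines of Python. The class and function names (GoldenImage, SessionPool, login, logout) are hypothetical and purely illustrative of the lifecycle described above; in practice this is all handled by the VDI management software, not by anything we wrote.

```python
# Hypothetical sketch of the VDI session lifecycle described above.
# None of these names correspond to real vendor APIs; they only
# illustrate the "one golden image, many disposable clones" idea.

import copy
import uuid


class GoldenImage:
    """A single master image that admins patch in one place."""

    def __init__(self, os_version, software):
        self.os_version = os_version
        self.software = list(software)


class SessionPool:
    """Hands each patron a fresh clone and destroys it at logout."""

    def __init__(self, golden_image):
        self.golden_image = golden_image
        self.active_sessions = {}

    def login(self):
        # Every login gets a brand-new copy of the current golden image,
        # so no credentials or cookies survive from a previous patron.
        session_id = str(uuid.uuid4())
        self.active_sessions[session_id] = copy.deepcopy(self.golden_image)
        return session_id

    def logout(self, session_id):
        # The now-used desktop is simply thrown away; the next login is
        # cloned from whatever the golden image looks like at that moment,
        # so OS and software updates reach patrons without any reboot.
        del self.active_sessions[session_id]


if __name__ == "__main__":
    image = GoldenImage("Windows 10", ["Chrome", "Microsoft Office"])
    pool = SessionPool(image)

    sid = pool.login()            # patron sits down, gets a clean desktop
    image.software.append("SAP")  # admin updates the golden image meanwhile
    pool.logout(sid)              # old session discarded at logout

    new_sid = pool.login()        # next patron automatically gets the update
    print(pool.active_sessions[new_sid].software)
```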

It is worth noting that we can also make these updates transparent to our patrons. After logging in, that VDI session will remain as-is until the person logs out — we will not reboot the system out from under them.  Once they log out, the system deletes the old, now-outdated image and replaces it with a new one. There is no downtime for the next user; they just automatically get the new image, and no one’s work gets disrupted by a reboot.

Flexibility:
We can, in fact, define multiple “golden images”, each with a different suite of software on it. And rather than having to individually update each machine or each image, the system understands common packages — if we update the OS, then all images referring to that OS automatically get updated. Again, this leads to a great reduction in staff effort needed to support these more-standardized environments.

We have deployed SAP and Envisionware images on VDI, as well as some more customized images (e.g. Divinity-specific software).  For managers who don’t otherwise have access to SAP, please contact Core Services and we can get you set up to use the VDI-image with SAP installed.

Future Expansion:
We recently upgraded the storage system that is attached to the VDI-server, and with that, we are able to add even more VDI-terminals to our public computing environment. Over the next few months, we’ll be working with stakeholders to identify where those new systems might go.

As the original hardware is nearing its end-of-life, we will also be looking at a server upgrade near the end of this year. Of note: the server upgrade should provide an immediate “speed up” to all public computing terminals, without us having to touch any of those 50+ devices.

Shiny New Chrome!

Chrome bumper and grill

In 2008, Google released their free web browser, Chrome.  Its improved speed and features led to quick adoption by users, and by the middle of 2012, Chrome had become the world’s most popular browser. Recent data puts it at over 55% market share [StatCounter].

As smartphones and tablets took off, Google decided to build an “operating system free” computer based around the Chrome browser – the first official Chromebook launched in mid-2011.  The idea was that since everyone is doing their work on the web anyway (assuming your work==Google Docs), then there wasn’t a need for most users to have a “full” operating system – especially since full operating systems require maintenance patches and security updates.  Their price-point didn’t hurt either – while some models now top-out over $1000, many Chromebooks come in under $300.
Acer Chromebook

We purchased one of the cheaper models recently to do some testing and see if it might work for any DUL use-cases.  The specific model was an Acer Chromebook 14, priced at $250.  It has a 14” screen at full HD resolution, a metal body to protect against bumps and bruises, and it promises up to 12 hours of battery life.  Where we’d usually look at CPU and memory specs, these tend to be less important on a Chromebook — you’re basically just surfing the web, so you shouldn’t need a high-end (pricey) CPU nor a lot of memory.  At least that’s the theory.

But what can it do?

Basic websurfing, check!  Google Docs, check!  Mail.duke.edu for work-email, check!  Duke.box.com, check!  LibGuides, LibCal, Basecamp, Jira, Slack, Evernote … check!

LastPass even works to hold all the highly-complex, fully secure passwords that you use on all those sites (you do use complex passwords, don’t you?).

Not surprisingly, if you do a lot of your day-to-day work inside a browser, then a Chromebook can easily handle that.  For a lot of office workers, a Chromebook may very well get the job done – sitting in a meeting, typing notes into Evernote; checking email while you’re waiting for a meeting; popping into Slack to send someone a quick note.  All those work perfectly fine.

What about the non-web stuff I do?

Microsoft Word and Excel, well, kinda sorta.  You can upload them to Google Docs and then access them through the usual Google Docs web interface.  Of course, you can then share them as Google Docs with other people, but to get them back into “real” Microsoft Word requires an extra step.

Aleph, umm, no.  SAP for your budgets, umm, no. Those apps simply won’t run on the ChromeOS.  At least not directly.

Acer Chromebook
But just as many of you currently “remote” into your work computer from home, you _can_ use a Chromebook to “remote” into other machines, including “virtual” machines that we can set up to run standard Windows applications.  There’s an extra step or two in the process to reserve a remote system and connect to it.  But if you’re in a job where just a small amount of your work needs “real” Windows applications, there still might be some opportunity to leverage Chromebooks as a cheaper alternative to a laptop.

Final Thoughts:

I’m curious to see where (or whether) Chromebooks might fit into the DUL technology landscape.  Their price is certainly budget-friendly, and since Google automatically updates and patches them, they could reduce IT staff effort.  But there are clearly issues we need to investigate.  Some of them seem solvable, at least technically.  But it’s not clear that the solution will be usable in day-to-day work.

If you’re interested in trying one out, please contact me!

 

Adventures in 4K

When it comes to moving image digitization, Duke Libraries’ Digital Production Center primarily deals with obsolete videotape formats like U-matic, Betacam, VHS and DV, which are in standard-definition (SD). We typically don’t work with high-definition (HD) or ultra-high-definition (UHD) video because that is usually “born digital,” and doesn’t need any kind of conversion from analog, or real-time migration from magnetic tape. It’s already in the form of a digital file.

However, when I’m not at Duke, I do like to watch TV at home, in high-definition. This past Christmas, the television in my living room decided to kick the bucket, so I set out to get a new one. I went to my local Best Buy and a few other stores, to check out all the latest and greatest TVs. The first thing I noticed is that just about every TV on the market now features 4K ultra-high-definition (UHD), and many have high dynamic range (HDR).

Before we dive into 4K, some history is in order. Traditional, standard-definition televisions offered 480 lines of vertical resolution, with a 4:3 aspect ratio, meaning the height of the image display is 3/4 the dimension of the width. This is how television was broadcast for most of the 20th century. Full HD television, which gained popularity at the turn of the millennium, has 1080 pixels of vertical resolution (over twice as much as SD), and an aspect ratio of 16:9, which makes the height barely more than 1/2 the size of the width.

16:9 more closely resembles the proportions of a movie theater screen, and this change in TV specification helped to usher in the “home theater” era. Once 16:9 HD TVs became popular, the emergence of Blu-ray discs and players allowed consumers to rent or purchase movies, watch them in full HD and hear them in theater-like high fidelity, by adding 5.1 surround sound speakers and subwoofers. Those who could afford it started converting their basements and spare rooms into small movie theaters.

4K UHD has 3840 horizontal pixels and 2160 vertical pixels: twice the resolution of full HD in each dimension, and about four and a half times the vertical resolution of SD.

The next step in the television evolution was 4K ultra-high-definition (UHD) TVs, which have flooded big box stores in recent years. 4K UHD has an astounding resolution of 3840 horizontal pixels and 2160 vertical pixels: twice the resolution of full HD in each dimension (four times the pixel count), and about four and a half times the vertical resolution of SD. Gazing at the images on these 4K TVs in that Best Buy was pretty disorienting. The image is so sharp and finely detailed that it’s almost too much for your eyes and brain to process.
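For readers who like to check the math, here is a quick Python sketch of those comparisons. It assumes square-pixel frame sizes (SD approximated as 640x480), so the SD figures are ballpark rather than broadcast-exact.

```python
# Quick check of the resolution and aspect-ratio comparisons above,
# using square-pixel frame sizes (SD approximated as 640x480).
formats = {
    "SD (4:3)":       (640, 480),
    "Full HD (16:9)": (1920, 1080),
    "4K UHD (16:9)":  (3840, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name:15s} {w}x{h}  height/width = {h / w:.2f}  pixels = {w * h:,}")

# 4:3 -> height is 0.75 of the width; 16:9 -> height is 0.5625 of the width.
# UHD vs. HD: 2x the rows and columns, so 4x the pixel count.
print("UHD lines / HD lines:  ", 2160 / 1080)                       # 2.0
print("UHD pixels / HD pixels:", (3840 * 2160) / (1920 * 1080))     # 4.0
# UHD vs. SD: 2160 / 480 = 4.5x the vertical resolution.
print("UHD lines / SD lines:  ", 2160 / 480)                        # 4.5
```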

For example, looking at footage of a mountain range in 4K UHD feels like you’re seeing more detail than you would if you were actually looking at the same mountain range in person, with your naked eye. And high dynamic range (HDR) increases this effect, by offering a much larger palette of colors and more levels of subtle gradation from light to dark. The latter allows for more detail in the highlight and shadow areas of the image. The 4K experience is a textbook example of hyperreality, which is rapidly encroaching into every aspect of our modern lives, from entertainment to politics.

The next thing that dawned on me was: if I get a 4K TV, where am I going to get the 4K content? No television stations or cable channels are broadcasting in 4K, and my old Blu-ray player doesn’t play 4K. Fortunately, all 4K TVs will also display 1080p HD content beautifully, so that warmed me up to the purchase. It meant I didn’t have to immediately replace my Blu-ray player, or just stare at a black screen night after night, waiting for my favorite TV stations to catch up with the new technology.

The salesperson who was helping me pointed out that Best Buy also sells 4K UHD Blu-ray discs and 4K-ready Blu-ray players, and that some content providers, like Netflix, are streaming many shows in 4K and in HDR, like “Stranger Things,” “Daredevil” and “The Punisher,” to name a few. So I went ahead with the purchase and brought home my new 4K TV. I also picked up a 4K-enabled Roku, which allows anyone with a fast internet connection and a subscription to stream content from Netflix, Amazon and Hulu, as well as access most cable-TV channels via services like DirecTV Now, YouTube TV, Sling and Hulu.

I connected the new TV (a 55” Sony X800E) to my 4K Roku, ethernet, HD antenna and stereo system and sat down to watch. The 1080p broadcasts from the local HD stations looked and sounded great, and so did my favorite 1080p shows streaming from Netflix. I went with a larger TV than I had previously, so that was also a big improvement.

To get the true 4K HDR experience, I upgraded my Netflix account to the 4K-capable plan and started watching the new Marvel series, “The Punisher.” It didn’t look quite as razor sharp as the 4K images did in Best Buy, but that’s likely because the 4K Netflix content is more compressed for streaming, whereas the TVs on the sales floor are playing in-house 4K video that has very little, if any, compression.

As a test, I went back and forth between watching The Punisher in 4K UHD and watching the same episodes in HD, using an additional, older Roku through a separate HDMI port. The 4K version did have a lot more detail than its HD counterpart, but it was also more grainy, with horizons of clear skies showing additional noise, as if the 4K technology is trying too hard to bring detail out of something that is inherently a flat plane of the same color.

Also, because of the high dynamic range, the image loses a bit of overall contrast when displaying so many subtle gradations between dark and light. 4K streaming also requires a fast internet connection and it downloads a lot of data, so if you want to go 4K, you may need to upgrade your ISP plan, and make sure there are no data caps. I have a 300 Mbps fiber connection, with ethernet cable routed to my TV, and that works perfectly when I’m streaming 4K content.
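To get a rough sense of how much data that is, the sketch below estimates data use per hour of streaming. The ~25 Mbps rate for 4K is an assumption in the ballpark commonly cited for 4K streams, not a measurement from my setup; actual rates vary by service and content.

```python
# Rough estimate of data used by streaming at a given bitrate.
# The 25 Mbps figure for 4K is an assumed, commonly cited ballpark.

def gigabytes_per_hour(mbps):
    bits_per_hour = mbps * 1_000_000 * 3600
    return bits_per_hour / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

for label, rate in [("HD stream (~5 Mbps)", 5), ("4K stream (~25 Mbps)", 25)]:
    print(f"{label}: about {gigabytes_per_hour(rate):.1f} GB per hour")

# At roughly 11 GB per hour, a 1 TB monthly data cap allows well under
# 100 hours of 4K streaming, which is why data caps matter here.
```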

I have yet to buy a 4K Blu-ray player and try out a 4K Blu-ray disc, so I don’t know how that will look on my new TV, but from what I’ve read, it more fully takes advantage of the 4K data than streaming 4K does. One reason I’m reluctant to buy a 4K Blu-ray player gets back to content. Almost all the 4K Blu-ray discs for sale or rent now are recently-made Hollywood movies. If I’m going to buy a 4K Blu-ray player, I want to watch classics like “2001: A Space Odyssey,” “The Godfather,” “Apocalypse Now” and “Vertigo” in 4K, but those aren’t currently available because the studios have yet to release them in 4K. This requires going back to the original film stock and painstakingly digitizing and restoring them in 4K.

Some older films may not have enough inherent resolution to take full advantage of 4K, but it seems like films such as “2001: A Space Odyssey,” which was originally shot in 65 mm, would really be enhanced by a 4K restoration. Filmmakers and the entertainment industry are already experimenting with 8K and 16K technology, so I guess my 4K TV will be obsolete in a few years, and we’ll all be having seizures while watching TV, because our brains will no longer be able to handle the amount of data flooding our senses.

Prepare yourself for 8K and 16K video.

 

A History of Videotape, Part 1

As a Digital Production Specialist at Duke Libraries, I work with a variety of obsolete videotape formats, digitizing them for long-term preservation and access. Videotape is a form of magnetic tape, consisting of a magnetized coating on one side of a strip of plastic film. The film is there to support the magnetized coating, which usually consists of iron oxide. Magnetic tape was first invented in 1928, for recording sound, but it would be several decades before it could be used for moving images, due to the increased bandwidth that is required to capture the visual content.

Bing Crosby was the first major entertainer who pushed for audiotape recordings of his radio broadcasts. In 1951, his company, Bing Crosby Enterprises (BCE), debuted the first videotape technology to the public.

Television was live in the beginning, because there was no way to pre-record the broadcast other than with traditional film, which was expensive and time-consuming. In 1951, Bing Crosby Enterprises (BCE), owned by actor and singer Bing Crosby, demonstrated the first videotape recording. Crosby had previously incorporated audiotape recording into the production of his radio broadcasts, so that he would have more time for other commitments, like golf! Instead of having to do a live radio broadcast once a week for a month, he could record four broadcasts in one week, then have the next three weeks off. The 1951 demonstration ran quarter-inch audiotape at 360 inches per second, using a modified Ampex 200 tape recorder, but the images were reportedly blurry and not broadcast quality.

Ampex introduced 2” quadruplex videotape at the National Association of Broadcasters convention in 1956. Shown here is a Bosch 2″ Quadruplex machine.

More companies experimented with the emerging technology in the early 1950s, until Ampex introduced 2” black and white quadruplex videotape at the National Association of Broadcasters convention in 1956. This was the first videotape that was broadcast quality. Soon, television networks were broadcasting pre-recorded shows on quadruplex, and were able to present them at different times in all four U.S. time zones. Some of the earliest videotape broadcasts were CBS’s “The Edsel Show,” CBS’s “Douglas Edwards with the News,” and NBC’s “Truth or Consequences.” In 1958, Ampex debuted a color quadruplex videotape recorder. NBC’s “An Evening with Fred Astaire” was the first major TV show to be videotaped in color, also in 1958.

Virtually all the videotapes of the first ten years (1962-1972) of “The Tonight Show with Johnny Carson” were taped over by NBC to save money, so no one has seen these episodes since broadcast, nor will they… ever.

 

One of the downsides to quadruplex is that the videotapes could only be played back using the same tape heads which originally recorded the content. Those tape heads wore out very quickly, which meant that many tapes could not be reliably played back using the new tape heads that replaced the exhausted ones. Quadruplex videotapes were also expensive, about $300 per hour of tape. So, to stretch that investment, many TV stations continually erased tapes and recorded the next broadcast over them. Unfortunately, because of this, many classic TV shows are lost forever, like the vast majority of the first ten years (1962-1972) of “The Tonight Show with Johnny Carson,” and Super Bowl II (1968).

Quadruplex was the industry standard until the introduction of 1” Type C in 1976. Type C video recorders required less maintenance, were more compact, and enabled new functions like still frame, shuttle and slow motion; 1” Type C also did not require time base correction, as 2” Quadruplex did. Type C is a composite videotape format, with quality that matches later component formats like Betacam. Composite video merges the color and brightness information into a single signal, consistent with a broadcast signal. Type C remained popular for several decades, until videocassette formats gained in popularity. We will explore that in a future blog post.

Multispectral Imaging: What’s it good for?

At the beginning of March, the multispectral imaging working group presented details about the imaging system and the group’s progress so far to other library staff at a First Wednesday event. Representatives from Conservation Services, Data and Visualization Services, the Digital Production Center, the Duke Collaboratory for Classics Computing (DC3) and the Rubenstein Library each shared their involvement and interest in the imaging technology. Our presentation attempted to answer some basic questions about how the equipment works and how it can be used to benefit the scholarly community. You can view a video of that presentation here.

Some of the images we have already shared illustrate a basic benefit or goal of spectral imaging for books and manuscripts: making obscured text visible. But what else can this technology tell us about the objects in our collections? As a library conservator, I am very interested in the ways that this technology can provide more information about the composition and condition of objects, as well as inform conservation treatment decisions and document their efficacy.

Conservators and conservation scientists have been using spectral imaging to help distinguish between and to characterize materials for some time. For example, pigments, adhesives, or coatings may appear very differently under ultraviolet or infrared radiation. Many labs have the equipment to image under a few wavelengths of light, but our new imaging system allows us to capture at a much broader range of wavelengths and compare them in an image stack.

Adhesive samples under visible and UV light.
(Photo credit Art Conservation Department, SUNY Buffalo State)

Spectral imaging  can help to identify the materials used to make or repair an object by the way they react under different light sources. Correct identification of components is important in making the best conservation treatment decisions and might also be used to establish the relative age of a particular object or to verify its authenticity.  While spectral imaging offers the promise of providing a non-destructive tool for identification, it does have limitations and other analytical techniques may be required.

Pigment and dye-based inks under visible and infrared light.
(Photo credit Image Permanence Institute)

Multispectral imaging offers new opportunities to evaluate and document the condition of objects within our collections. Previous repairs may be so well executed or intentionally obscured that the location or extent of the repair is not obvious under visible light. Areas of paper or parchment that have been replaced or have added reinforcements (such as linings) may appear different from the original when viewed under UV radiation. Spectral imaging can provide better visual documentation of the degradation of inks (see image below) or damage from mold or water that is not apparent under visible light.

Iron gall ink degradation.
(Jantz MS#124, Rubenstein Library)

This imaging equipment provides opportunities for better measuring the effectiveness of the treatments that conservators perform in-house. For example, a common treatment that we perform in our lab is the removal of pressure sensitive tape repairs from paper documents using solvents. Spectral imaging before, during, and after treatment could document the effectiveness of the solvents or other employed techniques in removing the tape carrier and adhesive from the paper.

Tape removal before and during treatment under visible and UV light.
(Photo credit Art Conservation Department, SUNY Buffalo State)

Staff from the Conservation Services department have a long history of participating in the library’s digitization program in order to ensure the safety of fragile collection materials. Our department will continue to serve in this capacity for special collections objects undergoing multispectral imaging to answer specific research questions; however, we are also excited to employ this same technology to better care for the cultural heritage within our collections.

______

Want to learn even more about MSI at DUL?

 

Let’s Get Small: a tribute to the mighty microcassette

In past posts, I’ve paid homage to the audio ancestors with riffs on such endangered–some might say extinct–formats as DAT and Minidisc.  This week we turn our attention to the smallest (and perhaps the cutest) tape format of them all:  the Microcassette.

Introduced by the Olympus Corporation in 1969, the Microcassette used the same width tape (3.81 mm) as the more common Philips Compact Cassette but housed it in a much smaller and less robust plastic shell.  The Microcassette also spooled from right to left (opposite from the compact cassette) and used slower recording speeds of 2.4 and 1.2 cm/s.  The speed adjustment, allowing for longer uninterrupted recording times, could be toggled on the recorder itself.  For instance, the original MC60 Microcassette allowed for 30 minutes of recorded content per “side” at standard speed and 60 minutes per side at low speed.
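Those speeds and running times also tell you how little tape is actually inside the shell. Here is a quick back-of-the-envelope sketch, assuming the nominal MC60 figures above:

```python
# How much tape does an MC60 microcassette hold?
# Assumes the nominal figures above: 30 min/side at the 2.4 cm/s standard speed.

STANDARD_SPEED_CM_S = 2.4   # standard speed
LOW_SPEED_CM_S = 1.2        # low speed

def tape_length_m(speed_cm_s, minutes):
    """Length of tape consumed at a given speed, in meters."""
    return speed_cm_s * minutes * 60 / 100  # cm -> m

length = tape_length_m(STANDARD_SPEED_CM_S, 30)
print(f"Tape per side: about {length:.0f} m")  # roughly 43 m

# The same physical tape at low speed runs twice as long per side:
print(f"Minutes per side at low speed: {length * 100 / LOW_SPEED_CM_S / 60:.0f}")
```

In other words, about 43 meters of very thin tape per side, which is also why the low-speed setting doubles the running time to 60 minutes.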

The microcassette was mostly used for recording voice–e.g. lectures, interviews, and memos.  The thin tape (prone to stretching) and slow recording speeds made for a low-fidelity result that was perfectly adequate for the aforementioned applications, but not up to the task of capturing the wide dynamic and frequency range of music.  As a result, the microcassette was the go-to format for cheap, portable, hand-held recording in the days before the smartphone and digital recording.  It was standard to see a cluster of these around the lectern in a college classroom as late as the mid-1990s.  Many of the recorders featured voice-activated recording (to prevent capturing “dead air”) and continuously variable playback speed to make transcription easier.

The tiny tapes were also commonly used in telephone answering machines and dictation machines.

As you may have guessed, the rise of digital recording, handheld devices, and cheap data storage quickly relegated the microcassette to a museum piece by the early 21st century.  While the compact cassette has enjoyed a resurgence as a hip medium for underground music, the poor audio quality and durability of the microcassette have largely doomed it to oblivion except among the most willful obscurantists.  Still, many Rubenstein Library collections contain these little guys as carriers of valuable primary source material.  That means we’re holding onto our Microcassette player for the long haul in all of its atavistic glory.

Image by the author. Other images in this post taken from Wikimedia Commons (https://commons.wikimedia.org/wiki/Category:Microcassette).

 

Multispectral Imaging Through Collaboration

I am sure you have all been following the Library’s exploration into Multispectral Imaging (MSI) here on Bitstreams, Preservation Underground and the News & Observer.  Previous posts have detailed our collaboration with R.B. Toth Associates and the Duke Eye Center, the basic process and equipment, and the wide range of departments that could benefit from MSI.  In early December of last year (that sounds like it was so long ago!), we finished readying the room for MSI capture, installed the equipment, and went to MSI boot camp.

Obligatory before and after shot. In the bottom image, the new MSI system is in the background on the left, with the full spectrum system that we have been using for years on the right. Other additions to the room are blackout curtains, neutral gray walls and black ceiling tiles, all to control light spill between the two camera systems. Full spectrum overhead lighting and a new tile floor were also installed, which is standard for an imaging lab in the Library.

Well, boot camp came to us. Meghan Wilson, an independent contractor who has worked with R.B. Toth Associates for many years, started our training with an overview of the equipment and the basic science behind it. She covered the different lighting schemes and when they should be used.  She explained MSI applications for identifying resins, adhesives and pigments and how to use UV lighting and filters to expose obscured text.  We quickly went from talking to doing.  As with any training session worth its salt, things went awry right off the bat (not Meghan’s fault).  We had powered up the equipment but the camera would not communicate with the software and the lights would not fire when the shutter was triggered.  This was actually a good experience because we had to troubleshoot on the spot and figure out what was going on together as a team.  It turns out that there are six different pieces of equipment that have to be powered up in a specific sequence in order for the system to communicate properly (tee up Apollo 13 soundtrack). Once we got the system up and running, we took turns driving the software and hardware to capture a number of items that we had pre-selected.  This is an involved process that produces a bunch of files which eventually become an image stack that can be manipulated using specialized software.  When it’s all said and done, the files have been converted, cleaned, flattened and manipulated, and variations produced; a single capture ends up somewhere in the neighborhood of 300 files. Whoa!

This is not your parents’ point and shoot—not the room, the lights, the curtains, the hardware, the software, the pricetag, none of it. But it is different in another more important way too. This process is team-driven and interdisciplinary. Our R&D working group is diverse and includes representatives from the following library departments.

  • The Digital Production Center (DPC) has expertise in high-end, full spectrum imaging for cultural heritage institutions along with a deep knowledge of the camera and lighting systems involved in MSI, file storage, naming and management of large sets of files with complex relationships.
  • The Duke Collaboratory for Classics Computing (DC3) offers a scholarly and research perspective on papyri, manuscripts, etc., as well as experience with MSI and other imaging modalities.
  • The Conservation Lab brings expertise in the Libraries’ collections and a deep understanding of the materiality and history of the objects we are imaging.
  • Duke Libraries’ Data Visualization Services (DVS) has expertise in the processing and display of complex data.
  • The Rubenstein Library’s Collection Development brings a deep understanding of the collections, provenance and history of materials, and valuable contacts with researchers near and far.

To get the most out of MSI we need all of those skills and perspectives. What MSI really offers is the ability to ask—and we hope answer—strings of good questions. Is there ink beneath that paste-down or paint? Is this a palimpsest? What text is obscured by that stain or fire-damage or water damage? Can we recover it without having to intervene physically? What does the ‘invisible’ text say and what if anything does this tell us about the object’s history? Is the reflectance signature of the ink compatible with the proposed date or provenance of the object? That’s just for starters. But you can see how even framing the right question requires a range of perspectives; we have to understand what kinds of properties MSI is likely to illuminate, what kinds of questions the material objects themselves suggest or demand, what the historical and scholarly stakes are, what the wider implications for our and others’ collections are, and how best to facilitate human interface with the data that we collect. No single person on the team commands all of this.

Working in any large group can be a challenge. But when it all comes together, it is worth it. Below are two versions of a page from Jantz 723: one processed as a black and white image, and the other a principal component analysis produced from the MSI capture and processed using ImageJ and a set of tools created by Bill Christens-Barry of R.B. Toth Associates, with false color applied using Photoshop. Using MSI we were able to better reveal this watermark, which had previously been obscured.

Jantz 723
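For the curious, here is a minimal sketch of what a principal component analysis of a multispectral stack looks like in code. It is an illustration only, standing in for the ImageJ and Toth toolchain we actually use, and it assumes the registered band captures have already been exported as same-sized grayscale images (the folder and file names below are hypothetical). It uses NumPy, Pillow and scikit-learn.

```python
# Minimal sketch of PCA on a multispectral image stack.
# This is an illustration, not our ImageJ + Toth tool workflow; it assumes
# the captured bands are already registered and exported as same-sized
# grayscale images (the paths below are hypothetical).

import glob

import numpy as np
from PIL import Image
from sklearn.decomposition import PCA

# Load the stack: one grayscale image per illumination band.
band_files = sorted(glob.glob("msi_capture/band_*.tif"))
stack = np.stack([np.asarray(Image.open(f), dtype=np.float32)
                  for f in band_files])      # shape: (bands, height, width)

bands, height, width = stack.shape
pixels = stack.reshape(bands, -1).T          # shape: (pixels, bands)

# Each principal component is a new "image" that concentrates variance
# shared across the bands; faint features such as watermarks or erased
# text often separate out in one of the components.
pca = PCA(n_components=min(bands, 6))
components = pca.fit_transform(pixels)       # shape: (pixels, n_components)

for i in range(components.shape[1]):
    comp = components[:, i].reshape(height, width)
    # Stretch each component to the 0-255 range for viewing.
    comp = (comp - comp.min()) / (comp.max() - comp.min() + 1e-9) * 255
    Image.fromarray(comp.astype(np.uint8)).save(f"pca_component_{i + 1}.tif")
```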

I think we feel like 16-year-old kids with newly minted drivers’ licenses who have never driven a car on the highway or out of town. A whole new world has just opened up to us, and we are really excited and a little apprehensive!

What now?

Practice, experiment, document, refine. Over the next 12 (16? 18) months we will work together to hone our collective skills, driving the system, deepening our understanding of the scholarly, conservation, and curatorial use-cases for the technology, optimizing workflow, documenting best practices, getting a firm grip on scale, pace, and cost of what we can do. The team will assemble monthly, practice what we have learned, and lean on each other’s expertise to develop a solid workflow that includes the right expertise at the right time.  We will select a wide variety of materials so that we can develop a feel for how far we can push the system and what we can expect day to day. During all of this practice, workflows, guidelines, policies and expectations will come into sharper focus.

As you can tell from the above, we are going to learn a lot over the coming months.  We plan to share what we learn via regular posts here and elsewhere.  Although we are not prepared yet to offer MSI as a standard library service, we are interested to hear your suggestions for Duke Library collection items that may benefit from MSI imaging.  We have a long queue of items that we would like to shoot, and are excited to add more research questions, use cases, and new opportunities to push our skills forward.   To suggest materials, contact Molly Bragg, Digital Collections Program Manager (molly.bragg at Duke.edu), Joshua Sosin, Associate Professor in Classical Studies & History (jds15 at Duke.edu), or Andrew Armacost, Curator of Collections (andrew.armacost at Duke.edu).

Want to learn even more about MSI at DUL?