
Holiday Notes from the Homefront and Abroad

The following is a series of loosely linked stories, loosely based on our digital collections, and loosely related to the holidays, where even the word “loosely” is applied with some looseness.


 

War Ration Book One. Ration Order No. 3.

An American looking forward to baking delicious treats for the holidays in 1942 would have been intimately familiar with War Ration Book One. The Office of Price Administration issued Ration Order No. 3 in April of that year, and distributed the ration books via elementary schools in the first week of May. Holders could purchase one pound of sugar every two weeks between May 5 and June 27. By the end of the year, butter, coffee, and other foods joined the list of regulated goods.

As the holidays approached, the newspapers ran articles advising homemakers how to cope with the unavailability of key ingredients. Vegetable shortening could help stretch butter, molasses made cookies prone to burning, and fruit juice was a natural sweetener. The New Orleans Times-Picayune’s “Round Table Talk About Food” exhorted homemakers to make the best of it:

There is something stimulating this approaching holiday time in planning Christmas meals and gift packages or baskets with those substitute items we are permitted to use, rather than with the usual abundance of foods to suit every whim of the appetite.


"YMCA Christmas, Children" (1917-1919)
“YMCA Christmas, Children” (1917-1919)

I wrote before about how YMCA missionaries took basketball overseas after its invention, including to Japan. Did they also take Santa beards to China?


“Certificate of registration to buy, sell, and transfer men’s rubber boots and rubber work shoes.” Ration order no. 6, September 29, 1942.

The Office of Price Administration provided Duke Law alum Richard Nixon with his first job in Washington, beginning in January of 1942. Rubber was his area of focus. He was industrious and diligent in his work, and by March had been promoted to “acting chief of interpretations in the Rubber Branch.”

But the life of a government regulator was not to be for Dick Nixon. He joined the Navy in August, and by year’s end found himself serving at an airfield in Ottumwa, Iowa.

- Summarized from Richard M. Nixon: A Life in Full, by Conrad Black.


True story. Terry Sanford spent December 21 and 22 of 1944 riding in a convoy that took the 1st Battalion of the 517th Parachute Regimental Combat Team from Soissons, France to the town of Soy in Belgium. His unit fought Germans for the next few days, losing more than a hundred men, in the conflagration that became known as the Battle of the Bulge.

They were able to sleep on Christmas Eve. On Christmas morning, there was roasted turkey, but at noon orders came to take a hill, which they did. The next day, they held it, repelling a German counterattack.

In the action, Sanford tackled a German officer, disarmed him, and drove him off for interrogation. Years later, he speculated that the man was probably shot before being processed as a POW, in retaliation for a recent massacre of American troops.

Sanford would write home that “things are going well in this country,” and they had “[m]ore food than elsewhere,” without explaining why there was more to go around.

While riding with his commander, Lt. Colonel “Wild Bill” Boyle, shortly after the New Year, he was wounded. He later received the Purple Heart.

- Summarized from Terry Sanford: Politics, Progress, and Outrageous Ambitions, by Howard E. Covington and Marion A. Ellis.


December 24, 1940 entry from Mary McMillan’s journal.

Mary McMillan’s 1940 journal tells of her experiences as a missionary in Japan. Her entry for Christmas Eve of 1940 reads:

To doctor’s for sinus treatment, then down on Ginza shopping for Christmas presents: perfume for Umeko, a dog purse for Eiko, cookies for Mrs. Natsuzoe, toys for Mineko san and Masao chan. Lunch of fried oysters and fresh strawberries in Olympic Grille.

Brought Eiko home for her first Christmas. Tried to tell her the Christmas Story, but my limited knowledge of Japanese and her excitement made direct teaching impossible. Was up wrapping presents till almost mid-night.

A month later she wrote of talks with Bishop Abe and Ambassador Grew:

They advised us to begin making preparations to leave Japan – as if we were certain that war were coming and we were sure to be called home by our Board.

Her next entry, dated “Feb. 29, ‘41,” is headed “Homeward Bound.”


Hanukah

Variant spellings of Hanukkah occur three times in the OCR text of our 1960s Duke Chronicle collection. (Given the imprecision of OCR, the actual count may be higher.)

A photograph on the front page of the December 17, 1968, issue depicts a Star of David hanging on the side of a building. The caption reads, “It is now Hanukah, ‘the festival of lights.’”

At the top of that page, the lead story of the issue is headlined, “X-Mas amnesty asked for draft dodgers.” It reports that the “cabinet of the YMCA” at Duke had resolved to write to President Johnson on the matter. Of course, Johnson was a lame duck by then. That same day, the New York Times reported that he had spent an hour conferring with John Mitchell, Nixon’s incoming Attorney General.


Wind Song Haldeman

From the J. Walter Thompson Company News, November 30, 1960, the two lead articles:

WIND SONG USES HANDSOME “SANTA” TO BOOST  PRE-CHRISTMAS SALES

BOB HALDEMAN NAMED MANAGER OF THE LOS ANGELES OFFICE

Eight years later, the newsletter noted the appointments of Haldeman and his subordinate, Ron Ziegler, to the White House staff. Haldeman would serve as Nixon’s Chief of Staff, and later did 18 months in prison for his role in the Watergate coverup. Ziegler became White House Press Secretary.


Sit-in, Allen Building, December 1967.

On November 17, 1967 – the Friday before Thanksgiving – the Chronicle ran a story about Terry Sanford and his newly published book, Storm over the States. He started writing Storm soon after leaving the NC governor’s office in 1965. Supported by the Ford Foundation and the Carnegie Corporation, he holed up in an office at Duke, hired a staff, and wrote about a model for state government that is federalist but proactive and constructive. Sanford’s point of view stood in stark, if unmentioned, contrast to the doctrines of “nullification” and states’ rights that segregationists like George Wallace wielded in their opposition to Civil Rights.

That Friday ended a tumultuous week at Duke. The lead story that same day was headlined “Knight bans use of segregated facilities by student groups.” The school’s president, Douglas Knight, had “re-interpreted” a university policy statement prohibiting the use of off-campus facilities that discriminated on the basis of race. Knight extended the policy, which had previously applied to staff and faculty organizations, to include student organizations as well.

The previous week, the student body had defeated a referendum that would have had the same effect. Black students reacted by staging a sit-in at Knight’s office (one holding a sign that read “Students Await An Overdue Talk With Our WHITE KNIGHT”), demanding that he take action. Knight acceded, complaining in his statement that “the application of this practice would have been made in the normal course of events,” but “we were confronted with an ultimatum, which carried with it a threat of disruption of the ordinary processes of the University.”

Confrontations between the administration and black students continued to escalate, leading to the Allen Building takeover in 1969, Knight’s resignation, and his succession by Terry Sanford.


First Days Back in Japan

Mary McMillan’s journals stretch from 1939 to 1991. Her “1939” journal actually contains entries from the 1940s, though there are significant gaps. In October of 1942 she wrote of traveling to Delta, Utah. She didn’t mention her purpose, and no other entries appear from that period, but she was heading to Utah to teach in the Topaz Relocation Center, an internment camp for Japanese Americans. En route, she wrote:

Those nice Marine recruits who got on our train in St. Louis shouldn’t be required to go so far from home to fight for objectives that seem to me not to be in keeping with United Nations Aims, as given in the Atlantic Charter.  Why should Japan “be crushed”? The military mind there – and elsewhere – must be forced from power; but are we on the right track towards achieving that objective? I fear most of us have become too material-minded. By following methods resulting from our materialistic thinking, we only create atmospheres for other hostile “spiritual” forces – like Naziism.

Then, during the week of Christmas in 1947, she traveled to Seattle and embarked on her return to Japan. She arrived in Hiroshima in January, the first Christian missionary to return after the war. She lived there for more than thirty years before retiring.


On page 4 of the December 17, 1967 issue of the Chronicle – just below the article about Terry Sanford’s Storm Over the States – ran an ad for The Road Goes Ever On. I have to believe at least a few students and faculty got the book or the record as stocking stuffers that year.


Vagrant Up


Writing software for the web has come a long way in the past decade. Tools such as Ruby on Rails, Twitter Bootstrap, and jQuery provide standard solutions to common problems and ready-made implementations of features common to web applications. These tools let developers spend more of their time and energy solving the problems unique to the application they are building.

One of the costs of relying on libraries and frameworks to build software is that your project depends not only on the frameworks and libraries you’ve chosen to use, but also on the additional components those frameworks and libraries themselves rely on. Compounding this problem, software is never really finished: there are always bugs being fixed and features being changed and added. These attributes of software – that it changes and that it has dependencies – complicate software projects in a few different ways:

  • You must carefully manage the versions of libraries, frameworks, and other dependencies so that you ensure all the pieces work together.
  • Developers working on multiple projects may have to use different versions of the same libraries and packages for each of their projects.
  • If you’re working with a team of developers on a project, every member of the team has to make sure their computer is set up with the correct versions of the project’s software and dependencies.

Thankfully, there are still more tools available to help manage these problems. For instance, Ruby Version Manager (RVM) is a popular tool among Ruby on Rails developers; it lets a developer install and switch between different versions of Ruby. Other tools, such as Bundler, make it possible to define exactly which versions of which Ruby gems (Ruby software packages that add functionality to your own project) need to be installed for a particular project. Combined, RVM and Bundler simplify the management of complex project dependencies. Similar tools are available for other programming languages, such as Composer, a dependency manager for PHP.


While many of us already use dependency managers in our work, one tool we haven’t been using that we’re evaluating for use on a new project is Vagrant. Vagrant is a tool for creating virtual machines, self-contained systems that run within a host operating system. Virtual machines are software implementations of a computer system. For instance, using a virtual machine I could run Windows on my Mac hardware.

Vagrant does a few things that may make it even easier for developers to manage project dependencies.

  • With Vagrant you can write a script that contains a set of instructions about what operating system and other software you want installed in a virtual machine. Creating a virtual machine with all the software you need for a given project is then as simple as typing a single command.
  • Vagrant provides a shared directory between your host operating system and the virtual machine. You can keep using the operating system you use every day while the software project runs in a virtual machine. This is significant because it means each developer can continue to use the operating system and tools they prefer while the software they’re building runs in copies of the exact same system.
  • You can add the script for creating the virtual machine to the project itself, making it very easy for new developers to get the project running. They don’t have to go through the sometimes painful process of installing a project’s dependencies by hand, because the Vagrant script does it for them.
  • A developer working on multiple projects can have a virtual machine set up for each of their projects so they never interfere with each other and each has the correct dependencies installed.

Here’s how to use Vagrant in the most minimal way:

  1. Download and install VirtualBox
  2. Download and install Vagrant
  3. In a terminal window type:
    vagrant init hashicorp/precise32
  4. After running the following command you will have downloaded, set up, and started a fully functional virtual machine running Ubuntu:
    vagrant up
  5. You can then connect to and start using the running virtual machine by connecting to it via SSH:
    vagrant ssh

"Vagrantup" by Fco.plj - Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Vagrantup.jpg#mediaviewer/File:Vagrantup.jpg

In a more complex setup you’d probably add a provisioning script with instructions for downloading and installing additional software as part of the “vagrant up” process. See the Vagrant documentation for more details about provisioning options.

We’re considering using Vagrant on an upcoming project in an effort to make it easier for all the developers on the project to set up and maintain a working development environment. With Vagrant, just one developer will need to spend the time to create the script that generates the virtual machine for the project. This should save time for the other developers, who will only have to install VirtualBox, copy the Vagrantfile, and type “vagrant up.” At least, that’s the idea. Vagrant has great documentation, so if you’re interested in learning more, their website is a good place to start.


Digital Transitions Roundtable

In late October of this year, the Digital Production Center (along with many others in the Library) was busy developing budgets for FY 2015. We were asked to think about the needs of the department, where the bottlenecks were, and possible new growth areas. We were asked to think big: the idea was to develop a grand list, then work backwards to identify what we could reasonably ask for. While the DPC is able to digitize many types of materials and formats, such as audio and video, my focus is specifically still image digitization. So that’s what I focused on.

We serve many different parts of the Library, and in order to accommodate a wide variety of requests, we use many different types of capture devices in the DPC: high-speed scanners, film scanners, overhead scanners and high-end cameras. The most heavily used capture device is the Phase One camera system, which pairs a P65 60-megapixel digital back with a 72mm Schneider flat-field lens. This enables us to capture high-quality images at archival standards. The majority of material we digitize using this camera is bound volumes (most of them rare books from the David M. Rubenstein Library), but we also use it to digitize patron requests, which have increased significantly over the years (everything is expected to be digital, it seems), oversized items, glass plate negatives, high-end photography collections and much more. It is no surprise that this camera is a bottleneck for still image production. In researching cameras to include in the budget, I was hard pressed to find another camera system that can compete with the Phase One. For over 5 years we have used Digital Transitions, a New York-based provider of high-end digital solutions, for our Phase One purchases and support. We have been very happy with the service, support and equipment we have purchased from them over the years, so I contacted them to inquire about new equipment on the horizon and pricing for upgrading our current system.

One piece of new equipment they turned me on to is the BC100 book scanner. This scanner uses a 100° glass platen and two reprographic cameras to capture two facing pages at the same time. While there are other camera systems that use a similar two-camera setup (most notably the Scribe, Kirtas and Atiz), the cameras and digital backs used with the BC100, as well as the Capture One software that drives the cameras, are better suited to cultural heritage reproduction. Along with the new BC100, Phase One is now offering a version of the Capture One software specifically geared toward the cultural heritage community for use with this new camera system. While inquiring about the new system, I was invited to attend a Cultural Heritage Round Table event that Digital Transitions was hosting.

This roundtable focused on the new Capture One software for use with the BC100 and the specific needs of the cultural heritage community. I have always found the folks at Digital Transitions to be very professional, knowledgeable and helpful. The event they put together included Jacob Frost, Application Software R&D Manager for Phase One; Doug Peterson, Technical Support, Training, R&D at Digital Transitions; and Don Williams, imaging scientist at Image Science Associates. Don is also on the Still Image Digitization Advisory Board of the Federal Agencies Digitization Guidelines Initiative (FADGI), a collaborative effort by federal agencies to define common guidelines, methods, and practices for digitizing historical content. They talked about the new features of the software, the science behind the software and its color technology, and new information about the FADGI still image standard that we currently follow at the Library.

I was impressed by the information provided and the knowledge shared, but what impressed me the most was that the main reason Digital Transitions pulled this particular group of users and developers together was to ask us what the cultural heritage community needed from the new software. WHAT!? What we need from the software? I’ve been doing this work for about 15 years now, and I think that’s the first time any software developer from any digital imaging company has asked our community specifically what we need. Don’t get me wrong, there is a lot of good software out there, but usually the software comes “as is.” While it is fully functional, there are usually some workarounds needed to get the software to do what I need it to do. We, as a community, spent about an hour drumming up ideas for software improvements and features.

While we still need to see follow-through on what we talked about, I am hopeful that some of the features we talked about will show up in the software. The software still needs some work to be truly beneficial (especially in post-production), but Phase One and Digital Transitions are definitely on to something.

Having it “All” – About Library Search Results

This fall we changed the default tool that students and faculty use to research library holdings. We have tools that work well for a broad search and tools that are tailored for more specialized research. So, how is this change working out?

Word cloud depicting the 30 most frequently used search terms. The size of the text is proportional to the number of times the term has been used.

We’ve got numbers and we’ve got opinion. First, let’s look at the numbers.

  • The most used feature on the Duke Libraries website is the search box on the homepage with 211,655 searches performed using the default “All” tab between August 25 and November 16, 2014.
  • Within the “All” tab search results, patrons selected results from Articles 48% of the time, results from Books & Media 44% of the time and other results 8% of the time. These results were presented side-by-side on a single results page.
  • The “All” search isn’t the only option on our homepage: the Books & Media tab was used 68,566 times and the Articles tab 46,028 times during the same timeframe.
  • The five most used search terms were PubMed, Web of Science, JSTOR, RefWorks, and Dictionary of National Biography.
  • The most frequently searched for fictional character was Tom Sawyer.
  • The most searched for person was Dr. Martin Luther King, Jr.

So what thoughts have you shared with us about the search options we provide?

On the Libraries’ homepage, you can click the gear icon to choose a different search tab as your customized default.
  • During the first four weeks of the semester, 48 people submitted their opinions through a survey linked from the search results page.
  • Thirty percent of survey respondents said that they liked having the Articles and Books & Media results appear side by side on the new search results page.
  • Twenty-seven percent said they thought the page looked cluttered or that it was hard to read.
  • Forty percent said that they did not know you can change the default search tab that appears when you view the Duke Libraries’ homepage.
  • Twenty-five percent said that they did not know that they can choose a more highly-focused search option from the Search & Find menu.
  • Testing with a small group of researchers revealed that it was difficult to locate material from the Rubenstein Library using our default search results screen.

Based on your feedback, we made the following improvements to the search results page during the semester.

  • We de-cluttered the information shown in the Articles and Books & Media columns to make results easier to read.
  • We moved “Our Website” results to the top of the right column.
  • We reduced the space used by the search box on the results page.

In the coming months, we will explore ways of making it easier to find materials from the Rubenstein Library and from University Archives. We are also investigating options for implementing a Best Bets feature on the results page; this would provide clearer access to some of the most used resources.

What can you do to help?

Complete our online survey and tell us what you think about the search tools provided through the Libraries’ homepage.

got_feature

Assembling the Game of Stones

Back in October, Molly detailed DigEx’s work on creating an exhibit for the Link Media Wall. We’ve now finalized our content and hope to have the new exhibit published to the large display in the next week or two. I’d like to detail how this thing is actually put together.

HTML Code

In our planning meetings, the super group talked about a few different approaches for how to start. We considered using a CMS like WordPress or Drupal, Four Winds (our institutional digital signage software), or potentially rolling our own system. In the end, though, I decided to build using super basic HTML / CSS / Javascript. After the group was happy with the design, I built a simple page framework to match our desired output of 3840 x 1080 pixels. And when I say simple, I mean simple.


I broke the content chunks into five main sections: the masthead (which holds the branding), the navigation (which highlights the current section and construction period), the map (which shows the location of the buildings), the thumbnail (which shows the completed building and adds some descriptive text), and the images (which houses a set of cross-fading historic photos illustrating the progression of construction). Working with a fixed-pixel layout feels strange in the modern world of web development, but it’s quick and satisfying to crank out. I’m using the jQuery Cycle plugin to transition the images, which is lightweight and offers lots of configurable options. I also created a transparent PNG file containing a gradient that fades to the background color, which overlays the rotating images.
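For the curious, here’s a rough sketch of the kind of jQuery Cycle call involved; the selector and timings are invented for illustration, not the exhibit’s actual settings:

    // Cross-fade the historic photos in a section.
    // Selector and timings here are hypothetical.
    $(document).ready(function () {
      $('.images').cycle({
        fx: 'fade',      // cross-fade transition between photos
        speed: 1000,     // fade duration, in milliseconds
        timeout: 8000    // how long each photo stays on screen
      });
    });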

Another part of the puzzle I wrestled with was how to transition from one section of the exhibit to another. I thought about housing all of the content on a single page and using some JS to move from one to the next, but I was a little worried about performance, so I again opted for the super simple solution. Each page has a meta refresh in the header, set to the number of seconds it takes to cycle through the corresponding set of images, with the next section of the exhibit as the destination. It’s a little clunky in execution and I would probably try something more elegant next time, but it’s solid and it works.
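To make the mechanism concrete, here’s a minimal sketch; the timing and file name are made up for illustration:

    // Each page's <head> carries a meta refresh along the lines of
    //   <meta http-equiv="refresh" content="48;url=section-02.html">
    // which loads the next section once the images finish cycling.
    // The same hand-off could be scripted in JavaScript instead:
    var secondsForThisSection = 48;       // total time to cycle this section's images
    var nextSection = 'section-02.html';  // next page of the exhibit
    setTimeout(function () {
      window.location.href = nextSection;
    }, secondsForThisSection * 1000);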

Here’s a preview of the exhibit cycling through all of the content. It’s been time compressed – the actual exhibit will take about ten minutes to play through.

In a lot of ways this exhibit is an experiment in both process and form, and I’m looking forward to seeing how our vision translates to the Media Wall space. Using such simple code means that if there are any problems, we can quickly make changes. I’m also looking forward to working on future exhibits and helping to highlight the amazing items in our collections.


New Angles & Avenues for Bitstreams

This week, we added a display of our most recent Bitstreams blog posts to our Digital Collections homepage (example), and likewise, a view of posts relevant to a given collection on the respective collection’s homepage (example).


Background

Our Digital Projects & Production team has been writing in Bitstreams at least weekly since February 2014. We’ve had some excellent guest contributors, too. Some posts share updates about new digital collections or additions, while others share insights, lessons learned, and behind-the-scenes looks at the projects we’re currently tackling.

Many of our posts have been featured on our library homepage and library news site. But until now, we haven’t been able to display any of them—not even the ones about new digital collections—alongside the collections themselves. So, if you visited the DukEngineer collection in the past, you likely missed out on Melanie’s excellent overview, which puts the magazine in context and highlights the best of what’s inside.

Past Solutions

Syndicating tagged blog posts for display elsewhere is a pretty common use case, and we’ve used a bunch of different solutions as our platforms have evolved. Each solution has naturally been painstakingly tailored to accommodate the inner workings of both the source and the destination. Seven years ago, we were writing custom XSLT to create and then consume our own RSS feeds in Cascade Server CMS. We have since hopped over to WordPress for managing news and blogs (whew!). An older version of our digital collections app used WordPress’ XML-RPC API to get tagged posts and parsed them with Python.

These days, our library website does blog syndication by using a combo of WordPress RSS, Drupal’s feed aggregator module, and occasionally Yahoo! Pipes for data mashing and munging. It works well in Drupal, but other platforms require other approaches.

Under the Hood: AngularJS and the WordPress JSON API

Bret Davidson’s Code4Lib 2014 presentation, Towards Pasta Code Nirvana: Using JavaScript MVC to Fill Your Programming Ravioli (slides), made me hungry. Hungry for pasta, yes, but also for knowledge. I wanted to:

  1. Experiment with one of the Javascript MVC frameworks to learn how they work, and in the process…
  2. Build something potentially useful for digital collections that could be ported over to a new application framework in the future (e.g., from our current Django app to a future Ruby on Rails app).

From the many possibilities, I chose AngularJS. It seemed well documented and increasingly popular, and with Google’s backing, it seems like it’ll be around for a while.

WordPress JSON API

Among Angular’s virtues is that it really simplifies the process of getting and using JSON data from an API. I found WordPress’ JSON API plugin, which, interestingly, was developed by staff at MoMA so they could use WordPress as a back end for a site with a Rails front end. So we first had to enable that plugin for our Bitstreams blog.

AngularJS

AngularJS definitely helps keep code clean, especially by abstracting the model (the blog posts and associated characteristics, as well as the page state) from the view (which indicates how to display the data) and from the controller (which gets and refines the data into the model, and updates the model upon interactions with the view). I’ve done several projects in the past using jQuery and DOM manipulation to retrieve and display data. It usually works, but in the process I create a veritable rat’s nest of spaghetti code wherein /* no amount of commenting */ can truly help disentangle what’s happening.

Angular also supercharges HTML with more useful attributes to control a display. I’ve only just scratched the surface, but it’s clear that built-in directives like ng-repeat and filters like limitTo spare me from writing a ton of Javascript, e.g., <li ng-repeat="post in blogposts | limitTo:pageSize">. After the initial learning curve, the markup is visually intuitive. And it’s nice that directives and filters are extensible so you can make your own.
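For flavor, here’s a simplified sketch of a controller that could feed that markup. This is not the production controller linked below; the module name is invented, and the endpoint and response fields are assumptions based on the JSON API plugin’s conventions:

    // A simplified sketch, not the production controller linked below.
    // Loads recent posts from the WordPress JSON API plugin and exposes
    // them to the view for ng-repeat. Endpoint and fields are assumptions.
    angular.module('bitstreams', [])
      .controller('BlogPostsCtrl', ['$scope', '$http', function ($scope, $http) {
        $scope.pageSize = 5;
        $scope.blogposts = [];
        $http.get('https://blogs.library.duke.edu/bitstreams/?json=get_recent_posts')
          .success(function (data) {
            $scope.blogposts = data.posts; // each post includes title, url, excerpt
          });
      }]);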

Source code: controller js, HTML (view source)

Initial Lessons Learned

  • AngularJS has a steeper learning curve than I’d expected; I assumed I could do this mini-project in a few hours, but it took a couple days to really get a handle on the basic pieces I needed for this project.
  • Writing an Angular app within a Django app is tricky. Both use {{ variable }} template tags, so I had to change Angular to use [[ variable ]] instead (see the sketch below).
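That fix is a one-time module configuration; a minimal sketch (module name invented):

    // Reconfigure Angular's interpolation symbols so they don't collide
    // with Django's {{ }} template tags.
    angular.module('bitstreams', []).config(['$interpolateProvider',
      function ($interpolateProvider) {
        $interpolateProvider.startSymbol('[[');
        $interpolateProvider.endSymbol(']]');
      }
    ]);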

Looking Ahead

I consider this an encouraging proof of concept. While our own blog posts can be interesting, there are many other sources of valuable data out in the world, relevant to our collections, that would add value for our researchers if we could easily retrieve and display them. AngularJS won’t be the answer to all of these needs, but it’s nice to have it in the toolset.


Profiling Movement Activists in 7 Steps

SNCC workers prepare to go to Belzoni in the Fall of 1963 to organize for the Freedom Vote. Courtesy of www.crmvet.org.

On the surface, writing a 500-word profile about a SNCC (Student Nonviolent Coordinating Committee) field secretary or a Mississippi-beautician-turned-grassroots-organizer doesn’t seem like a formidable task. Five hundred words hardly takes ten minutes to type. But the One Person, One Vote project is aiming for more than short biographies; it’s trying to capture why each individual was important to the movement, and to show that through stories. So this is how we craft a profile, in 7 steps:

Step 1: Choose a person

Back in June, the Editorial Board generated a list of people that the One Person, One Vote site needed to profile in order to understand SNCC’s voting rights activism. In less than an hour, we had a list of over a hundred names that included SNCC field secretaries, local people, movement elders, and everyone in between. And those were only the first names that came to mind! We narrowed that list down to 65 people, and that’s what the project team has been working from.

Step 2: Find out everything you can about the person

The first step in profile writing is research. We have a library of twenty books for instant referencing of secondary sources. Next comes surveying available primary sources. Our profiles include documents, photographs, audio clips, news stories, and other items created during the movement to make historical actors come to life. Some of our go-to places to find these sources include the Wisconsin Historical Society’s Freedom Summer Digital Collection, the Civil Rights History Project at the Library of Congress, the Joseph Sinsheimer interviews and SNCC 40th Anniversary Conference tapes at Duke University, the Civil Rights in Mississippi Digital Archive at the University of Southern Mississippi, and the University of Georgia’s Civil Rights Digital Library. There are more, of course, but that’s the start.

Step 3: Figure out why the person you’re profiling was important to the movement

The central question behind every profile the project team writes is: who was _______ to the movement? Once you start filling in the blank, the answers vary to an incredible degree. Movement elders like Ella Baker and Myles Horton contributed to the movement in ways different from, but just as important as, those of Mississippi-born field secretaries like Charles McLaurin, Sam Block, and Willie Peacock. Trying to figure out how and why is no small undertaking. This is where the guidance of our Visiting Activist Scholar helps focus the One Person, One Vote site on the themes that were at the heart of the movement: grassroots activism, community organizing, and individual empowerment.

Step 4: Use stories to express that in 500 words

Next, spend hours trying to express who the person you’re profiling was to the movement in only five hundred words. The profiles on the One Person, One Vote site aren’t mini academic biographies. Instead, we try to tell stories that illustrate who the people were and how their lives and work influenced SNCC’s voting rights activism in the 1960s. Finding the right story to highlight these central themes is key, and telling it well takes time and (lots of) revision.

The OPOV project does all content production in Google Drive. Here is the OPOV Profile Log, which keeps track of profiles through the steps towards completion.

Step 5: Workshop profile draft with project team

The first draft of every profile is workshopped with the One Person, One Vote project team (made up of 4 undergrads, 2 graduate students, and the project manager). As a group, we suggest how profiles can better convey their central theme, make sure that the person’s story is readable and compelling, line edit for clunky writing, and go through the primary sources.

Step 6: Send to Visiting Activist Scholar for editing

All of the revised profile drafts go to the Visiting Activist Scholar for a final round of editing. Charlie Cobb, a journalist and former SNCC field secretary, is our first Visiting Activist Scholar. He helps bring the profiles to life in ways that only someone who was a part of the movement can. Charlie adds details about events, mannerisms of people, and behind-the-scenes stories that never made it into history books. While the project team relies on available primary and secondary sources, the Visiting Activist Scholar adds something extra to the profiles on the One Person, One Vote site.

Step 7:  Voila!

Profiles go through one last round of proofreading and polishing. Then, come 2015, they will be posted on the One Person, One Vote site, and the primary sources will be embedded and linked from the Resources section of the profile pages. Voila!

 

 


Dispatches from the Digital Library Federation Forum

On October 27-29, librarians, archivists, developers, project managers, and others met for the Digital Library Federation (DLF) Forum in Atlanta, GA. The program was packed to the gills with outstanding projects and presenters, and several of us from Duke University Libraries were fortunate enough to attend. Below is a roundup of notes summarizing interesting sessions, software tools, projects, and collections we learned about at the conference.

Please note that these notes were written by humans listening to presentations, so mistakes are inevitable. Click the links to learn more about each tool, project, or session straight from the source.

Tools and Technology

Spotlight is an open-source tool for featuring digitized resources and is being developed at Stanford University.  It appears to have fairly similar functionality to Omeka, but is integrated into Blacklight, a discovery interface used by a growing number of libraries.

 

The J. Willard Marriott Library at the University of Utah presented on their use of Pamco Imaging tools to capture 360-degree images of artifacts. The library purchased a system from Pamco that includes an automated turntable, lighting tent, and software to both capture and display the 3-D objects.

 

There were two short presentations about media walls: one from our friends in Raleigh at the Hunt Library at N.C. State University, and the second from Georgia State. Click the links to see just how much you can do with an amazing media wall.

Projects and Collections

The California Digital Library (CDL) is redesigning and reengineering its digital collections interface to create a kind of mini Digital Public Library of America just for University of California digital collections. They are building the project on a platform called Nuxeo and storing their data through Amazon Web Services. The new interface and platform development is highly informed by user studies done on the existing Calisphere digital collections interface.

 

Emblematica Online is a collection of digitized emblem books contributed by several global institutions, including Duke. The collection is hosted by the University of Illinois at Urbana-Champaign. The project has been conducting user studies and hopes to publish them in the coming year.

 

The Indiana University Media Digitization and Preservation Initiative started in 2009 with a survey of all the audio and visual materials on campus. In 2011, the initiative proposed digitizing all rare and unique audio and video items within a 15-year period. However, in 2013 the president of the university committed the campus to completing the project within 7 years. To accomplish this ambitious goal, the university formed a public-private partnership with Memnon Archiving Services of Brussels. The university estimates that the project will create over 9 petabytes of data. The initiative has been in the planning phases and should be ramping up in 2015.

Selected Session Notes

The Project Managers group within DLF organized a session on “Cultivating a Culture of Project Management,” followed by a working lunch. Representatives from Johns Hopkins and Brown talked about implementing agile methodology for managing and developing technical projects. Both libraries spoke positively about moving towards agile and about the benefits of clear communication lines and defined development cycles. A speaker from Temple University discussed her methods for tracking and communicating the capacity of her development team; her spreadsheet for doing so took the session by storm (I’m not exaggerating – check out Twitter around the time of this session). Two speakers from the University of Michigan shared their work in creating a project management special interest group within their library to share PM skills, tools, and heartaches.

A session entitled “Beyond the Digital Surrogate” highlighted the work of several projects that are using digitized materials as a starting point for text mining and data visualization. First, many of UNC’s Documenting the American South collections are available as a text download. Second, a tool out of Georgia Tech supports interactive exploration and visualization of text-based archives. Third, a team from the University of Nebraska-Lincoln is developing methods for using visual information to leverage discovery and analysis of digital collections.

 

Assessment

“Moving Forward with Digital Library Assessment”: this session was organized around the need to strategically focus our assessment efforts in digital libraries and to better understand and measure the value, impact, and associated costs of what we do.

Community notes for this session

  • Joyce Chapman, Duke University
  • Jody DeRidder, University of Alabama
  • Nettie Lagace, National Information Standards Organization
  • Ho Jung Yoo, University of California, San Diego

Nettie Lagace: update on NISO’s altmetrics initiative.

  • The first phase exposed areas for potential standardization. The community then collectively prioritized those potential projects, and the second phase is now developing best practices. A working group has been formed, with its recommendations due in June 2016.
  • Alternative Metrics Initiative Phase 1 White Paper 

Joyce Chapman: a framework for estimating digitization costs

Jody DeRidder and Ho Jung Yoo: usability testing

  • What critical aspects need to be addressed by a community of practice?
  • What are next steps we can take as a community?

Midnight in the Garden of Film and Video

A few weeks ago, archivists, engineers, students and vendors from across the globe arrived in the historic city of Savannah, GA for AMIA 2014. The annual conference for The Association of Moving Image Archivists is a gathering of professionals who deal with the challenge of preserving motion picture film and videotape content for future generations. Since today is Halloween, I must also point out that Savannah is a really funky city that is haunted! The downtown area is filled with weeping willow trees, well-preserved 19th century architecture and creepy cemeteries dating back to the U.S. Civil and Revolutionary wars. Savannah is almost as scary as a library budget meeting.

The bad moon rises over Savannah City Hall.

Since many different cultural heritage institutions are digitizing their collections for preservation and online access, it’s beneficial to develop universal file standards and best practices. For example, organizations like NARA and FADGI have contributed to the universal adoption of the 8-bit uncompressed TIFF file format for (non-transmissive) still image preservation. Likewise, for audio digitization, 24-bit uncompressed WAV has been universally adopted as the preservation standard. In other words, when it comes to still image and audio digitization, everyone is driving down the same highway. However, at AMIA 2014, it was apparent there are still many different roads being taken in regards to moving image preservation, with some potential traffic jams ahead. Are you frightened yet? You should be!

The smallest known film gauge: 3mm. Was it designed by ancient druids?

Up until now, two file formats have been competing for dominance in moving image preservation: 10-bit uncompressed (.mov or .avi wrapper) vs. Motion JPEG2000 (MXF wrapper). The disadvantage of uncompressed has always been its enormous file size. Motion JPEG2000 incorporates lossless compression, which can reduce file sizes by 50%, but it’s expensive to implement and has limited interoperability with most video software and players. At AMIA 2014, some were championing the use of a newer format, FFV1, a lossless codec that has compression ratios similar to JPEG2000 but is open source, and thus more widely adoptable. It is part of the FFmpeg software project. Adoption of FFV1 is growing, but many institutions are still heavily invested in 10-bit uncompressed or Motion JPEG2000. Which format will become the preservation standard, and which will become a ghost that haunts us forever?!?

Another emerging need is for content management systems that can store and provide public access to digitized video. The Hydra repository solution is being adopted by many institutions for managing preservation video files. In conjunction with Hydra, many are also adopting Avalon to provide public access for online viewing of video content. Like FFmpeg, both Hydra and Avalon are open source, which is part of their appeal. Others are building their own systems, catered specifically to their own needs, like The Museum of Modern Art. There are also competing metadata standards. For example, PBCore has been adopted by many public television stations, but is generally disliked by libraries. In fact, they find it really creepy!

A new print of Peter Pan was shown at AMIA 2014. That movie gave me nightmares as a child.

Finally, there is the thorny issue of copyright. Once file formats are chosen and delivery systems are in place, methods must be implemented to restrict access to those intended, to protect copyright and hinder piracy. The Avalon Media System enables rights and access control to video content via guest passwords. The Library of Congress works around some of these issues another way, by setting up remote viewing rooms in Washington, DC, which are connected via fiber-optic cable to its Audio-Visual Conservation Center in Culpeper, Va. Others with more limited budgets, like Dino Everett at USC Cinematic Arts, watermark their video, upload it to sites like Vimeo, and implement temporary password protection, canceling the passwords manually after a few weeks. I mean, is there anything more frightening than a copyright lawsuit? Happy Halloween!


Preview of the W. Duke, Sons & Co. Digital Collection

When I almost found the T206 Honus Wagner

It was September 6, 2011 (thanks Exif metadata!) and I thought I had found one – a T206 Honus Wagner card, the “Holy Grail” of baseball cards. I was in the bowels of the Rubenstein Library stacks, skimming through several boxes of a large collection of trading cards that form part of the W. Duke, Sons & Co. advertising materials collection, when I noticed a small envelope labeled “Piedmont.” For some reason, I remembered that the Honus Wagner card was issued as part of a larger set of cards advertising the Piedmont brand of cigarettes in 1909. Yeah, I got pretty excited.

I carefully opened the envelope, removed a small stack of cards, and laid them out side by side, but, sadly, there was no Honus Wagner to be found.  A bit deflated, I took a quick snapshot of some of the cards with my phone, put them back in the envelope, and went about my day.  A few days later, I noticed the photo again in my camera roll and, after a bit of research, confirmed that these cards were indeed part of the same T206 set as the famed Honus Wagner card but not nearly as rare.

Fast forward three years, and we’re now in the midst of a project to digitize, describe, and publish almost the entirety of the W. Duke, Sons & Co. collection, including the handful of T206 series cards I found. The scanning is complete (thanks DPC!) and we’re now developing guidelines for describing the digitized cards. Over the last few days, I’ve learned quite a bit about the history of cigarette cards, the Duke family’s role in producing them, and the various resources available for identifying them.

1909 Series T206 Harry Lumley card (front), from the W. Duke, Sons & Co. collection in the Rubenstein Library
1909 Series T206 Harry Lumley card (back)

 

 

Brief History of Cigarette Cards

“A Bad Decision by the Umpire,” from series N86 Scenes of Perilous Occupations, W. Duke, Sons & Co. collection, Rubenstein Library.
  • Beginning in the 1870s, cigarette manufacturers like Allen & Ginter and Goodwin & Co. began the practice of inserting a trade card into cigarette packages as a stiffener. These cards were usually issued in sets of between 25 and 100 to encourage repeat purchases and to promote brand loyalty.
  • In the late 1880s, W. Duke, Sons & Co. (founded by Washington Duke in 1881) began inserting cards into Duke brand cigarette packages. The earliest Duke-issued cards covered a wide array of subject matter, with series titled Actors and Actresses, Fishers and Fish, Jokes, Ocean and River Steamers, and even Scenes of Perilous Occupations.
  • In 1890, W. Duke, Sons & Co., headed by James B. Duke (founder of Duke University), merged with several other cigarette manufacturers to form the American Tobacco Company.
  • In 1909, the American Tobacco Company (ATC) first began inserting baseball cards into their cigarette packages with the introduction of the now famous T206 “White Border” set, which included a Honus Wagner card that, in 2007, sold for a record $2.8 million.
Title page from the library’s copy of The American Card Catalog by Jefferson R. Burdick.

Identifying Cigarette Cards

  • The T206 designation for the ATC’s “white border” set was not assigned by the company itself, but by Jefferson R. Burdick in his 1953 publication The American Card Catalog (ACC), the first comprehensive catalog of trade cards ever published.
  • In the ACC, Burdick devised a numbering scheme for tobacco cards based on manufacturer and time period, with the two primary designations being the N-series (19th century tobacco cards) and the T-series (20th century tobacco cards).  Burdick’s numbering scheme is still used by collectors today.
  • Burdick was also a prolific card collector and his personal collection of roughly 300,000 trade cards now resides at the Metropolitan Museum of Art in New York.

 

Preview of the W. Duke, Sons & Co. Digital Collection [coming soon]

“Dressed Beef” from Series N81 Jokes, W. Duke, Sons & Co. collection, Rubenstein Library
  •  When published, the W. Duke, Sons & Co. digital collection will feature approximately 2000 individual cigarette cards from the late 19th and early 20th centuries as well as two large scrapbooks that contain several hundred additional cards.
  • The collection will also include images of other tobacco advertising ephemera such as pins, buttons, tobacco tags, and even examples of early cigarette packs.
  • Researchers will be able to search and browse the digitized cards and ephemera by manufacturer, cigarette brand, and the subjects they depict.
  • In the meantime, researchers are welcome to visit the Rubenstein Library in person to view the originals in our reading room.

 

 

 
