Overview of Encoding Survey

Last month, I solicited EAD templates and documentation from partner institutions to get a clearer picture of TARO’s EAD landscape. Thank you to the 24 institutions that answered the questionnaire and provided documentation. The responses and accompanying documentation illuminate some of the shared (or similar) encoding practices across the TARO partners, as well as areas of encoding diversity. This knowledge will help me and the Steering Committee make useful recommendations for incorporating a schema-compliant workflow into existing practices. The goal is to find the sweet spot between breadth and specificity, so that participation in TARO is both convenient and beneficial.

Overall, there is plenty of common ground among the respondents with regard to encoding workflows and processes. The following is a very general overview of the survey responses:

24 total responses

17 of the 24 institutions that responded to the survey described a process of encoding by hand, using previous finding aids and/or templates as guides. MS Word and Excel are common tools for creating collection inventories that are then copied and pasted into an XML editor.

13 use the Oxygen XML Editor

Finding aid creation is a multi-step, multi-tool process for everyone, and this common ground bodes well as TARO moves toward greater standardization. Common tools, such as MS Excel and the Oxygen XML Editor, can be incorporated and leveraged in best-practices guidelines.
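To make the workflow concrete, here is a minimal sketch of how a single inventory row from an Excel spreadsheet typically ends up in EAD 2002 once it is pasted and tagged in an XML editor (the element names come from the EAD 2002 tag library; the series title and container numbers are invented for illustration):

    <dsc type="combined">
      <c01 level="series">
        <did>
          <unittitle>Correspondence, <unitdate>1950-1965</unitdate></unittitle>
        </did>
        <c02 level="file">
          <did>
            <container type="box">1</container>
            <container type="folder">2</container>
            <unittitle>Letters to family</unittitle>
          </did>
        </c02>
      </c01>
    </dsc>

Because each spreadsheet row maps predictably onto a component like this, an Excel-based inventory is a natural candidate for templating or scripted conversion in future best-practices guidelines.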

At present, fewer institutions use archival management systems (AMS), though a handful of respondents expressed plans to adopt one in the near future.

7 use an AMS:

  • 3 ArchivesSpace
  • 2 Archivists’ Toolkit
  • 1 Archon
  • 1 CuadraStar

As you may be aware, ArchivesSpace generates schema-compliant EAD; in fact, its output is sometimes stricter than the EAD 2002 schema requires. Currently, institutions that use these archival management systems must convert their schema-based EAD back to the DTD to make it TARO-compliant. With more organizations adopting (or at least considering) management systems, TARO must plan to accommodate current and future developments in technology. Updating the XML in TARO will not only improve the front-end user experience, but will also broaden potential participation.
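To illustrate the difference: the two flavors of EAD 2002 are declared at the top of the file. A DTD-based document (what TARO currently requires) opens with a DOCTYPE declaration, while a schema-based export, such as ArchivesSpace produces, validates against the EAD 2002 XSD via namespace attributes. A simplified sketch of each follows (real files carry additional attributes and full header content):

    <!-- DTD-based EAD 2002: current TARO requirement -->
    <!DOCTYPE ead PUBLIC "+//ISBN 1-931666-00-8//DTD ead.dtd (Encoded Archival Description (EAD) Version 2002)//EN" "ead.dtd">
    <ead>
      <!-- eadheader, archdesc, etc. -->
    </ead>

    <!-- Schema-based EAD 2002: typical of an ArchivesSpace export -->
    <ead xmlns="urn:isbn:1-931666-22-9"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="urn:isbn:1-931666-22-9 http://www.loc.gov/ead/ead.xsd">
      <!-- eadheader, archdesc, etc. -->
    </ead>

Reverse editing amounts roughly to stripping the namespace declarations, adding the DOCTYPE, and removing any schema-only constructs the DTD does not allow.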

The greatest variation across the respondents appears (quite obviously) in the documentation, instructions, and templates of each contributing institution. A major consideration going forward is finding the optimal level of standardization that benefits all contributing institutions. Participation in TARO should be easy, perhaps even effortless. With this goal in mind, the question we need to ask is:

How can we reduce redundancy between institutions’ unique workflows and the process of contributing to TARO?

Please continue this conversation, especially if you feel that the overview above does not represent how your institution creates EAD.


Web Platform and EAD Resources

The first platform evaluation is fast approaching, and the WebTex Subcommittee is in search of volunteers. The first platform to be evaluated will be Access to Memory (AtoM). If you’re interested in volunteering, we’d love your help! You can find out more by reading this blog post. ArchivesSpace and XTF are slated for evaluation in the spring.

In preparation for the upcoming platform evaluations, we have gathered some resources on each one. Additionally, we’ve gathered some basic resources on EAD.

Our approach to finding resources began with the most easily located pages: the main websites for each platform. The WebTex team also did some brief brainstorming on resources that we were already aware of, such as Yale’s blog on ArchivesSpace and SAA’s EAD documentation page. Links to wikis, GitHub repositories, and other blogs were mined from these. Additionally, we performed Google keyword searches to locate more blog posts related to specific platforms, which led to more link mining and the discovery of additional front-end interface examples.

We hoped to build a list that would help us and others learn more about each of the platforms and bolster preexisting EAD knowledge. However, the list is not comprehensive! Please feel free to share additional resources in the comments so that we can add them to the list. You can view the list of resources along with brief annotations on the Annotated Bibliography page.

TARO User Volunteers Needed! – Archival Description Platform Testing

As you all know, TARO will undergo some big changes in the next couple of years. We are looking into moving to a new archival description platform.

But which one? This is where your help is vital to the success of the new TARO.

If you volunteer to test the platforms under consideration, you will be contributing to the improvement of a valuable resource for the larger regional archives community.

And you’ll be helping yourself (possibly) by doing research that can inform your institution’s own descriptive practices.

The WebTex Subcommittee is looking at three platforms: Access to Memory (AtoM), ArchivesSpace (AS), and XTF. Over the next year we need 10 volunteers to help us test them: AtoM this fall, and AS and XTF in the spring.

Volunteers will be given access to an instance of the platform and will use a set of prompts similar to a usability test to help us determine which platform best addresses the core needs as we see them:

  • Finding aid discovery
  • Finding aid creation

Should you agree to test AtoM, you will receive an electronic packet of links covering three testing sections:

  1. Evaluation Matrix: The matrix is divided into sections mapped roughly to the user stories provided in your evaluation packet.
    • Indicate how you would prioritize each criterion (High / Medium / Low)
    • Indicate the availability of the criterion for the platform you are evaluating (Yes / No / n/a)
  2. Follow-up Questions: Follow-up questions are short-answer questions that address topics related to the new TARO platform that cannot be addressed using the other evaluation tools provided.
  3. Comments: The comments section is entirely free form. We ask that you provide feedback about the evaluation process, the TARO planning grant, your institutional orientation towards TARO, etc.

From November 2 to 13, volunteers will evaluate AtoM using materials that will be e-mailed out next week.

We anticipate that the volunteer time required will be no more than two hours over the two-week evaluation period. The work does not need to be done in one sitting.

Volunteers can sign up by filling out the Google form here: https://docs.google.com/forms/d/1f8ubEINFbMRGboqaWhPjUBsi2jzD-PEmvYHswu8avNs/viewform?usp=send_form

Contact Daniel Alonzo or Jessica Meyerson if you have any questions about the volunteer process.

We are looking forward to hearing from you!