Platform evaluation: One down!

Howdy TARO Members!

The WebTex Subcommittee and its team of volunteers have completed our first platform evaluation, as described in our posting from October 29. The platform under consideration for this first round of testing was Access to Memory, or AtoM.

The evaluation proceeded according to four user personas crafted to represent the needs of a range of hypothetical *archival staff* end users with diverse job descriptions and levels of experience. That approach helped our volunteers step off the beaten path of their own typical use of such a platform and into areas they might not otherwise consider. The varying levels of experience of the volunteers themselves also provided insights into how intuitive the front-end and back-end interfaces were, the initial learning curve for getting acquainted with the platform, and the strength of the documentation provided.

Testing with the user personas occurred in early November; after the long Thanksgiving holiday weekend, the Subcommittee held a conference call with the evaluation volunteers to discuss their experiences with the evaluation procedure and with the platform itself. That information will help us plan our next platform evaluations, and the Subcommittee is grateful for the volunteer pool's commitment to continue testing the other platforms.

Analysis of this round of platform evaluation will be complete before the holiday break. After the start of the new year, we will call on our volunteers again to put the next platform through its paces. The diversity of the institutions that participate in TARO — in size, history, mission, and personnel — makes a broad scope of input imperative as we update and enhance the services TARO offers.

Web Platform and EAD Resources

The first platform evaluation is fast approaching, and the WebTex Subcommittee is in search of volunteers. The first platform to be evaluated will be Access to Memory (AtoM). If you’re interested in volunteering, we’d love your help! You can find out more by reading this blog post. ArchivesSpace and XTF are slated for evaluation in the spring.

In preparation for the upcoming platform evaluations, we have gathered resources on each platform, along with some basic resources on EAD (Encoded Archival Description), the XML standard for encoding archival finding aids.
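For anyone brushing up on EAD, here is a minimal sketch of what an EAD 2002 finding aid looks like; the titles and identifiers below are invented placeholders, and real files require additional elements and attributes:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal EAD 2002 skeleton; all values are placeholders -->
<ead xmlns="urn:isbn:1-931666-22-9">
  <eadheader>
    <!-- eadid uniquely identifies the finding aid file -->
    <eadid>sample0001</eadid>
    <filedesc>
      <titlestmt>
        <titleproper>Guide to the Sample Family Papers</titleproper>
      </titlestmt>
    </filedesc>
  </eadheader>
  <!-- archdesc carries the collection-level description -->
  <archdesc level="collection">
    <did>
      <unittitle>Sample Family Papers</unittitle>
      <unitdate>1900-1950</unitdate>
      <physdesc><extent>2 linear feet</extent></physdesc>
    </did>
    <!-- dsc lists subordinate components, such as series and folders -->
    <dsc>
      <c01 level="series">
        <did>
          <unittitle>Correspondence</unittitle>
        </did>
      </c01>
    </dsc>
  </archdesc>
</ead>
```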

Our approach to finding resources began with the most easily located pages: the main websites for each of the platforms. The WebTex team also briefly brainstormed resources we were already aware of, such as Yale's blog on ArchivesSpace and SAA's EAD documentation page, and mined them for links to wikis, GitHub repositories, and other blogs. Additionally, we performed Google keyword searches to locate more blog posts related to specific platforms, which led to further link mining and the discovery of additional front-end interface examples.

We hoped to build a list that would help us and others learn more about each of the platforms and bolster preexisting EAD knowledge. However, the list is not comprehensive! Please feel free to share additional resources in the comments so that we can add them to the list. You can view the list of resources along with brief annotations on the Annotated Bibliography page.

TARO User Volunteers Needed! – Archival Description Platform Testing

As you all know, TARO will undergo some big changes in the next couple of years. We are looking into moving to a new archival description platform.

But which one? This is where your help is vital to the success of the new TARO.

By volunteering to test the platforms listed below, you will contribute to the improvement of a valuable resource for the larger regional archives community.

And you may be helping yourself, too: the research you do can inform your institution’s own descriptive practices.

The WebTex Subcommittee is looking at the following platforms:

  • Access to Memory (AtoM)
  • ArchivesSpace (AS)
  • XTF

Over the next year we need 10 volunteers to help us test AtoM this fall and ArchivesSpace and XTF in the spring.

Volunteers will be given access to an instance of the platform and will use a set of prompts similar to a usability test to help us determine which platform best addresses the core needs as we see them:

  • Finding aid discovery
  • Finding aid creation

Should you agree to test AtoM, you will receive an electronic packet of links covering three testing sections:

  1. Evaluation Matrix: The matrix is divided into sections mapped roughly to the user stories provided in your evaluation packet (an illustrative example follows this list).
    • Indicate how you would prioritize each criterion (High / Medium / Low)
    • Indicate the availability of the criterion for the platform you are evaluating (Yes / No / n/a)
  2. Follow-up Questions: Follow-up questions are short-answer questions that address topics related to the new TARO platform that cannot be addressed using the other evaluation tools provided.
  3. Comments: The comments section is entirely free form. We ask that you provide feedback about the evaluation process, the TARO planning grant, your institutional orientation towards TARO, etc.
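As a purely illustrative sketch (the criteria below are invented for this example, not drawn from the actual packet), completed matrix rows might look like:

    Criterion                             Priority   Available?
    Batch import of existing EAD files    High       Yes
    Faceted browsing of finding aids      Medium     No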

From November 2–13, volunteers will evaluate AtoM using materials that will be e-mailed out next week.

We anticipate that the volunteer time required will be no more than 2 hours over the two-week evaluation period. The work does not need to be done in one sitting.

Volunteers can sign up by filling out the Google Form here: https://docs.google.com/forms/d/1f8ubEINFbMRGboqaWhPjUBsi2jzD-PEmvYHswu8avNs/viewform?usp=send_form

Contact Daniel Alonzo or Jessica Meyerson if you have any questions about the volunteer process.

We are looking forward to hearing from you!