I attended a closed-shop symposium at UT this week on the future of the academic library (http://www.utexas.edu/president/symposium/index.html). The two opening addresses, by James Duderstadt (former President of the University of Michigan) and Clifford Lynch (of CNI), were models of insightful, powerpoint-free talks that took us through a range of future scenarios (definitely plural!) suggesting major challenges ahead. Duderstadt pointed to the growing need for libraries as learning spaces, not as repositories, and made a case for a world of life-long learners who would engage with universities remotely and repeatedly. While I was a little concerned that the dramatic scenarios for a new cyberinfrastructure of open access were presented without clear examples of real human activities that we could consider, the talk certainly raised the collective sights of the attendees. Clifford Lynch noted specifically that the humanities have thoroughly embraced digital technologies, with new research enabled through text mining, remote access to collections and e-publishing, but he argued convincingly that easy predictions of what lies ahead for scholarship in the digital realm are inevitably wrong.
With only 60 attendees present it was easy to engage, and lively discussions were common. I chaired a panel consisting of Dan Connolly (of W3C), Kevin Guthrie (Ithaka) and Alice Proschaka (Yale) on the future of access and preservation, which got the crowd going when Dan stated there was no real preservation problem, since 95% of clicks on links produced the desired result, and Kevin argued that access suffered greater impermanence than preservation in the digital realm. Much depends on how you interpret these points, and we spent much time trying to clarify just what Dan was measuring, but he argued strongly that this is not the same as claiming 95% of sites are permanent; indeed, on the web there is good reason why we might want and expect some sites to be very transient. The facts need to be established more clearly here, and there is certainly a study waiting to happen.
The final session after 1.5 days was an open discussion which led to some interesting summary statements. While it is clear that no university or publisher has the answers, there is real concern that the world is changing and we are not ready. Personally, I think the missing piece is a better understanding of human behavior, since scholarship, learning, education, etc. reside at the human level, not at the artifact or collection level. The media will always change, but the human need to communicate, share and engage with data can be understood better and designed for accordingly.
One interesting side-discussion involved the fate of LIS education for this new world of open access, networked and aggregated, personal digital spaces. Jim Neal (Columbia) suggested that the current masters programs in LIS were not really meeting the needs of academic libraries, and this was interpreted by one attendee from another program as deeming them irrelevant (a charge Jim denied). Oddly, nobody here mentioned ‘crisis’, or a failure to teach cataloging as the problem, but the feeling seemed to be that the futures facing academic libraries will not be shaped by graduates of many current LIS programs. No comment from me required!
Update — audio files (mixed quality) of the symposium are available at: http://www.lib.utexas.edu/symposium/
Thanks for this summary; I’d welcome anything more you cared to say about the meeting.
I will suggest (as a strawman to be demolished) that library schools may serve academic libraries not so much by changing curricula as by changing recruitment and acceptance patterns. Library schools may not be able to create “digital natives” or whatever the current need is thought to be — but they may well be able to recruit them and give them the library-specific knowledge they need to apply their existing skills in the library sphere.
Good notes. I may add some of my own (oddly enough, I was also at this invitational symposium), but not for a while… (And if others do notes as good as yours, I may not bother.)
The more I thought about Dan’s numbers, the more I think they’re not so much deserving of study as simply irrelevant: They conflate popularity and scholarly significance.
He’s saying that, 96% of the time someone clicks on a link (which is mostly going to be either from a set of bookmarks or from one of the first two or three results on a search engine), the link works.
The commenters were saying that (a) the average web object now persists only about 100 days and (b) as many as half the web-based references in an article are likely to be dead within a year.
I don’t think there’s any contradiction there. Scholarly articles aren’t, by and large, linking to the most popular sites or highest results from search engines, just as their print citations are less likely to be to USA Today or People.
Thus, the 4% of broken links (when measured by the total number of clicks) could very easily include 50% or more of all sources cited in scholarly papers that are a year or more old.
No disagreement here, Dorothea. And yes, I will post more when I have digested the meeting and thought a little more about it.
Hi Walt — I DO think there is a need for more data on all this: the relationships between permanence, popularity and purpose of web sites and objects seem complicated enough to warrant study, but that’s my empirical bias coming through.
Andrew, I certainly wouldn’t argue against more research on web persistence, maybe specifically on the persistence of cited objects.
I just don’t believe there’s an inherent contradiction between the notion that 96% of all clicks work (most of those clicks being very popular sites) and the notion that 50% or more of year-old cited sources have broken links. The two figures measure such different things that I wouldn’t expect comparability.