The accreditation issue again

I’ve been surprised at the reaction to my earlier plea for accreditation reform (see below), with more than a few people contacting me offline to offer support but, in doing so, revealing that they did not feel able to say this out loud in their own schools and departments. That is truly worrying. If we cannot openly discuss this because of fear among faculty, then something is really wrong. Nearly as worrying, though with an ironic twist, it was pointed out to me that the Williamson Report of 1923 invoked the need for ALA-related accreditation precisely because the schools of the time were felt to be unable to raise standards on their own. Well, now look where we are.

I seem to find myself on the same side as the American Council of Trustees and Alumni, ACTA, though a close reading of their various publications gives me pause. Let’s just say we share the same concern that accreditation no longer ensures quality, and leave it there.

The real point, though, is that everyone, in principle, believes accreditation should ensure a certain standard of educational experience. When, then, did this setting of standards become so tied to processes of endless review and targets that bear so little relevance to real-world needs? Maybe ACTA are not so far off when they state that too often accrediting agencies act as monopolies, are a costly nuisance, and offer no guarantees of quality. Surely it’s time to revisit this whole mess?

iSchool and Iron Mountain launching new partnership

Am delighted that we’re engaging in a series of open educational sessions with Iron Mountain — it’s a wonderful relationship for us, Iron Mountain are great to work with, and this promises to open up new avenues for the study of information management outside of the traditional approaches. See more here. The launch event is this week at the AT&T Conference Center here at UT. Open to all, and watch for new events.

Please reform accreditation

The annual Deans and Directors meeting at ALISE this year proved refreshingly robust. We had but one real topic, the accreditation process pursued by the ALA Committee on Accreditation. There is a proposal afoot to reduce the number of standards from six to five. This alone is worthy of celebration as ALA follows the laughable requirement of having one person per standard when forming site teams to visit programs. There is almost no justification for this but tradition, and consequently, site teams have arrived at schools outnumbering the tenure-track faculty. Since no one seems to be laughing, especially those who foot the bill for this extravagance, it would at least seem as if this merging of a couple of standards has one tangible benefit for programs.

That said, the discussion quickly moved on from wordsmithing the standards to challenging the whole process, and it was not just a minority of folks who pushed for reform. Speaker after speaker complained of the persistent disconnect between the review by the site team and the final decisions from the politburo committee, the slavish insistence on over-documenting learning outcomes, the constant demands for reports, reports and even more reports (usually about very little), the credentials of those conducting the review, and in some cases, the embarrassment teams cause programs through their obvious lack of familiarity with university standards when dealing with upper administrations. Sadly, there was also a feeling in the room that one must be careful raising objections or one’s program will face retribution for speaking out (hence my temperate comments here). It really is hard to imagine that anyone believes this is a voluntary, collegial process anymore. Does it surprise you that only now, after years of campaigning, the deans and directors will actually have a representative at the table when a new committee (we need more!) is formed to consider the problems?

Despite what one imagines, deans and directors like to do more than just complain (yes, it’s hard to resist the line that we leave this to the faculty – rimshot please!), so we actually considered some alternatives. These included reducing the number and length of reports between reviews, using existing statistical data rather than forcing repeated submissions, lengthening the time between review visits, and getting more faculty involved in the final review committee. All sensible options, but I’d like to suggest we go further.

Accreditation, for all its flaws, is essentially about quality control, but somewhere along the line, the emphasis on quality has taken a backseat to control. There are many reasons, which I won’t rehash here, but no matter the motivations, the results are obvious. Programs are expected to comply with language, measures and indices that reveal little about quality and more about allegiance. Take, for example, the rather important matter of graduate placement. Certainly it is used by potential students, it might reasonably be interpreted as a measure of how well a program prepares new professionals for their careers, and it is based on the input of external employers, yet it is not mentioned specifically in the standards. One could meet all the requirements for accreditation, articulating all the specific learning outcomes for each course, and still reveal nothing about the real job prospects and advancement of the students who come for this education. Is it any wonder we hear so many accounts of disgruntled, poorly paid graduates who feel their Master’s degree was not quite all it promised to be?

How hard could it be to identify and document indices of quality? I would suggest there are some basic measures we can all agree offer us some clues as to a program’s overall quality:

  • Faculty size and rank
  • Graduation rate
  • Employment rate of graduates
  • Budget and resources
  • Curricular coverage

Surely there are others, but let’s consider these for a moment. If a program has, say, 12 faculty, all on the tenure track, this tells us something. If it has 5, one of whom is a part-timer and only two of whom are on the tenure track, this tells us something else. No, it’s not automatically the case that the first is to be accredited and the second not, but it does give us a real data point. Having sufficient faculty is important. Having these faculty on the tenure track tells us about the university in which the program exists and how it views the program. And having these same faculty deliver the courses that make up the program tells us something more. Similarly with budget. These are hard numbers which obviously vary across regions and universities, but there is surely a minimum, secure, recurring funding level that a faculty of a certain size must have to deliver a graduate program. We can make the same estimates on space or technical infrastructure for programs: a basic threshold at which we can be confident a program really is able to exist and deliver instruction. And yes, let’s measure employment rate. It is not a perfect measure (there are none), but if your graduates are in demand and earning decent salaries over time, this suggests the professional community must be satisfied to some extent with your program’s efforts. If you cannot demonstrate this, then maybe what you are providing is not quite up to professional standards.

You can see where this is going. I would allow for small schools, or those just starting up, to make a case for themselves by emphasizing some measures over others. Mature programs should be able to demonstrate relatively objectively how they are resourced, what faculty standards they maintain, how they deliver the program and where their graduates go upon completion. Such reporting need not be onerous. Certainly there is room for a narrative report on the program’s emphasis, mission, plans and general philosophy, but this would be wrapped around some hard data of the kind outlined above and used to justify the claims to quality. There is surely a form of Turing test for programs we could apply here — answer the questions and let a normal evaluator determine if you are running a solid program or a diploma mill.

The second part of this would be to revisit the mechanisms of reviews. If a program was small or new, unable to document some key aspects such as placement or curricular coverage by appropriate faculty, or if the budget and resources seemed to prevent appropriate instructional delivery, then by all means send in a review team and make some specific recommendations. If a program decides to revisit its mission, is merged, or generally undergoes a major change of direction, then send in a review team. But for most programs, once established and able to continually document their capabilities using data, let them do so by reporting every few years how they are doing using this agreed data set. I suggest that this need not be difficult. If enrolments are healthy, faculty are strong and actively delivering the program rather than leaving it to adjuncts, and graduates can report healthy employment prospects in relevant professional roles, then it’s likely the program is doing something right. There are certainly more data points and explanations to add, but these basic measures of quality are essential — without them, something is likely in need of attention.
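To make this concrete, here is a minimal sketch of how such a data-driven triage might work. To be clear, the measure names, threshold values and decision rule below are purely illustrative placeholders of my own, not proposed standards:

```python
# A minimal sketch of data-driven review triage for accreditation.
# All field names and threshold values are illustrative assumptions,
# not proposed standards.
from dataclasses import dataclass

@dataclass
class ProgramReport:
    faculty_total: int              # all faculty delivering the program
    faculty_tenure_track: int       # of whom tenure-track or tenured
    graduation_rate: float          # completions as a share of entrants, 0.0-1.0
    employment_rate: float          # graduates in relevant professional roles, 0.0-1.0
    recurring_budget: float         # secure, recurring annual funding
    core_courses_by_faculty: float  # share of core courses taught by full-time faculty
    major_change_underway: bool     # merger, new mission, change of direction

# Hypothetical minimum floors an accreditor and its programs might agree on.
THRESHOLDS = {
    "faculty_total": 8,
    "faculty_tenure_track": 5,
    "graduation_rate": 0.70,
    "employment_rate": 0.75,
    "recurring_budget": 1_000_000,
    "core_courses_by_faculty": 0.60,
}

def triage(report: ProgramReport) -> tuple[str, list[str]]:
    """Return a review decision plus the measures that fell short."""
    shortfalls = [
        name for name, floor in THRESHOLDS.items()
        if getattr(report, name) < floor
    ]
    # A shortfall on any agreed measure, or a major change of direction,
    # triggers a site visit; otherwise the data report stands on its own.
    if report.major_change_underway or shortfalls:
        return "send a review team", shortfalls
    return "accept the data report; no site visit this cycle", shortfalls

if __name__ == "__main__":
    program = ProgramReport(12, 10, 0.85, 0.80, 2_500_000, 0.75, False)
    decision, gaps = triage(program)
    print(decision, gaps or "(all measures above floor)")
```

The point is not the particular numbers but the shape of the process: a program that clears the agreed floors files its data and carries on, while one that does not, or that is undergoing major change, gets a visit.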

Most schools are already overburdened by compliance reporting and university-wide accreditation processes. Adding more to the process really does not seem to add value. The shift to more data-driven reporting of agreed quality indices (and can anyone seriously argue against graduate employment as one such index?) would allow for some flexibility in review, not foist a one-size-fits-all cycle on every program or allow increasingly obsessive attention to secondary processes to dominate the review. Some programs would have a site visit, some would not. Some would be required to justify developments; others would be able to continue as they are doing if the data made their case. Schools would in some sense be able to tailor reviews to best fit their needs, and we might move toward that more collegial, voluntary process of quality control that we are told is at the heart of accreditation. That it might also shake out a few of the programs that are failing to deliver anything of real value would be a bonus, but I am sure none of us knows any of those.


Enough with the ‘entrepreneurialism’ already….

Everywhere you go in academia these days (or on social media) you find people harping on about being entrepreneurial or innovative. We are supposed to aspire to think big, think different, and to disrupt the status quo. Hell, you can even make up new words to show how unique you are (or if you are self-titled ‘thought-leader, speculatist and acclaimed thought-leader’ Troy Hitch, you can even forget the old rule about not repeating yourself, THAT’s how innovative he is). It used to be that everyone aspired to be ‘ethical’, at least in the wake of Wall St scandals a few years back, but now the BS has been upped a notch or two and people are desperate to demonstrate their credentials as genuine creative forces bent on shaping the future, seizing the opportunity, taking the uncharted course, starting the next Facebook, mining the big data, having double visions (ok, I made that last bit up).

How pin-prickingly deflationary it must be, then, for such folks to read the latest National Association of Colleges and Employers survey of what employers actually want in new hires. No, the old staple of ‘communication skills’ is not #1 (though it remains highly sought). Nor is it ‘ethics’, though a ‘good work ethic’ is required. The #1 is Leadership, joint with Ability to Work in a Team (oh dear!). Being an entrepreneur? Ranked 17th, with only 25% of employers thinking it important. Being creative? That’s 19th, rated as less important than being tactful. Being a speculatist? Sorry, no need to apply.

More iSchools on the way?

Fresh from approving the latest additions to the iSchool community (59 now) and having already marked our 10th conference in Berlin earlier this year, the iSchool movement might be considered a success on all normal measures. Of course, having new members whose faculty were somewhat dismissive of the whole movement a decade ago is certainly one other measure of maturity, I suppose. More impressive, to me at least, is the sense that new schools are being created, not just re-badged. To this end, there are plans at the University of Arizona to create a new iSchool, which I learned about earlier this year. That process is ongoing, but I was particularly intrigued to see the announcement this morning that the University of Colorado-Boulder was creating a new college, for the first time in decades, that certainly has iSchool credentials: the new College of Media, Communication and Information. Of course, I prefer a simpler name to one that strives not to exclude any existing community by piling them all into one long moniker, but let’s not quibble (they could have added a few more given the planned departmental structure!). The call for a new dean speaks directly to the need for interdisciplinary collaboration across its departmental units and an integrated curriculum as a hallmark of the new college. It’s rare to see new colleges formed, but in this field Indiana and Penn State have done it well. Given the units involved, I have every expectation that the Boulder initiative will offer a program of similar impact in due course, though the noticeable absence of any computational group suggests this program might not be like any other to date.

Perhaps more seriously, the connection between communications and information schools is one to consider. There are several such colleges in the US, formed through mergers motivated by efficiencies rather than intellectual synergies, and it must prove somewhat comforting to university administrators to lump smaller independent units into larger singletons when finances are tight. However, while communications and information might appear natural bedfellows on some measures, there are some very real differences, and both academic traditions are themselves somewhat odd entities, often originally created from the assemblage of even smaller academic programs in previous, resource-challenged times. If the barriers between communications and information scholars are fading, the question to ask is whether this is a process that evolves from natural scholarly advances in our theories and methods for addressing human activities in the 21st century, or whether it is more often a sign that, on the surface, these areas look so alike to administrators that they might as well be lumped together. One suspects there are strong views on either side, but we rarely hear them expressed, at least publicly. And for what it’s worth, don’t imagine I think that only one of these routes is the appropriate path – sometimes just putting groups together can have real and previously unimagined benefits if handled appropriately. We shall see. In the meantime, I recommend everyone read Andrew Abbott’s Chaos of Disciplines and apply those insights to the information field.


Frightening stuff at the Patient Privacy Rights Summit

In DC this week for the annual PPR Summit — and just when you think you’ve heard it all about breaches of privacy and our lack of control over our own data, each year there are new findings which reveal how low we have sunk. A Utah police officer on a major data-fishing expedition downloaded the prescription data for every employee in a paramedical group to see who was receiving what treatments, in an effort to generate suspects in a possible minor loss of medication. Your own health data being sold and accessed every day is now a reality. Your searches on specific health issues are being used to flag you as ‘concerned’ and therefore subject to targeted marketing and worse. Lots of hot air from the White House about ethics and health data while simultaneously surveilling every citizen.

Depressing as the stories may be, the bigger problem seems to be finding common ground on solutions. I’m hearing a lot from lawyers and policy specialists but little from the perspective of consumers like you and me, other than how negatively we are impacted. The law will prove important here, but in one sense the genie is out of the bottle and it’s hard to see how ordinary people can engage in meaningful acts to control their own data. I think the doctor-patient interface is going to be crucial, but the complicated nature of the processes and the profits to be made in capturing and mining this type of data present some real challenges in designing better information systems.

Another iSchool formed

Another School of Information is being created, this one at Arizona. Not sure Allen has the best grasp of regional geography though 🙂

Leaders at the University of Arizona are trying to make one school out of two; the goal is to get students ready to fill jobs for all sorts of fields involving computers.

By merging two programs at U of A’s School of Information, the school would then become an iSchool. These types of schools are evolving from programs formerly focused on specific tracks such as information technology, library science, and information science.

Schools focus on educating students for a range of professions from web development to data analysis. Today a panel of deans from the top iSchools across the nation discussed the future for U of A.

Ken McAllister, the planning director of U of A’s School of Information, says, “Information is ubiquitous, there’s no job anywhere that doesn’t rely on information. Our goal is to train the next generation of scholars and workers.” Deans from other successful iSchools gave their input, saying the University of Arizona is the perfect place for this to happen.

Allen Renear, Interim Dean of the University of Illinois, says, “It’s a major national research university, iSchools flourish at places just like the University of Arizona. Right now, there are no iSchools in the Southwest.” U of A is looking to be the first one here, joining 53 other iSchools across the world.

ASIST really goes international

The first ASIST annual meeting under the new, non-national affiliation was held last week in beautiful Montreal and it was a raging success. Not only was the content much improved but the spirit of fun that emerged last year in hurricane-blasted Baltimore was sustained and enhanced. For me, the year was about internationalization and while the name change was important, the proof was in the attendance. By the start of the conference we knew 38 countries would be represented, a major increase over previous years, with attendees from as far away as Australia, Brazil, China, Ethiopia, Hong Kong, Indonesia, Israel, Jamaica, Japan, Korea, Kuwait, Malaysia, New Zealand, Pakistan, Saudi Arabia, Singapore, South Africa, Taiwan, and Trinidad and Tobago. We had 40 attendees from various European countries and 89 from Canada. It was a pleasure to meet so many and to turn the attendee numbers back up into positive territory after several years of decline (we passed the 600 mark once walk-in registrations were counted). Kudos to the program committee and to the local help provided by the School of Information Studies at McGill, who put in a lot of effort here, culminating in a fabulous dinner celebrating CAIS and ASIST at the university’s Faculty Club on Tuesday evening. If anything, the program suggests we need to think again about adding Wednesday to the conference program; there is just too much happening in the evenings to get to everything. Regardless, this was the best ASIST conference in years, and the stage is set to move further into international leadership in the years ahead.

Masters degree blues

Library Journal ran a column on the value of the MLS degree to budding librarians which seems to have caused a bit of a reaction among readers, some of which really makes you wonder about the type of education certain ALA-accredited programs are offering. Descriptors such as ‘dull’, ‘unnecessary’ and ‘poor value’ are thrown about regularly, and there is a strong sense that many graduates received little real education and merely acquired debt and the required membership card for some jobs.

The broad issue at stake here is just how well accreditation works and just what some programs are seeing fit to provide their students. When two major online programs are churning out close to half as many accredited graduates as all 50-plus schools produced in total 10 years ago, you’d be forgiven for thinking the job market was booming. Think again. I don’t know where most of these folks go, but I am fairly sure some of them are among the posters at that LJ site.

At Texas, we graduate around 100 masters students a year. Less than half go to work in libraries and archives, the traditional collection agencies of employment, but those that do are well equipped to contribute. The others go into a mix of roles that is not simply reduced to a few job titles; most are singletons, serving as some type of information specialist in a management, design, organization, or service function. It is not clear that these folks need an ALA-accredited degree, but they certainly benefited from our education. And we do offer an education, not job training. And that’s just the problem: does accreditation really assess or evaluate the important qualities of a program, or just the appearance of a process?

The Information Institute is launched

We’ve been working on this quietly for a while but it’s now about to launch officially; see the full details here.

The aim of the institute is to bring information education to communities that have begun to recognize the importance of information skills and techniques, have perceived a gap in their own company’s or organization’s information activities, and have not previously known where to go to obtain the knowledge they need. This is not continuing professional education as you may have experienced it in the past. This is a distinct effort aimed at making information science available to those who would not come here for a full degree, are motivated to learn quickly in focused, thematic chunks, and who place value on acquiring these skills from recognized experts. It’s a new move for us but one we think is needed.
