Why we did it that way

I’d like to expand on Adam’s post on recruiting and retention, and also tie it in to our earlier policy to build our own administrative applications instead of buying commercial packages. This was the reasoning behind Lawrence and Randy and company managing Data Processing the way they did:

  1. The key to any successful organization is the people who make it up. You should do whatever you can to recruit and retain the best people you can get.
  2. The University can’t compete with private industry on salary. If we want to hire and keep good people, we have to provide something they can’t get in another job.

So what were the non-financial incentives they used?

  • We recruit and hire people with little or no experience, but strong aptitude and people skills, and train them. In this way we get good people that other IT organizations miss.
  • We provide interesting and challenging work. This is where the “build” strategy beats out “buy”: developing your own application requires more creativity than installing and maintaining something bought from a vendor.
  • We show respect and trust by providing direction and resources, but allowing individuals to work out their own creative solutions to problems.

I agree with Adam that it wasn’t just the creation of ITS that changed this. It started with the simultaneous dot-com boom and Y2K remediation. For one thing, outside salaries increased more quickly than the University could keep up. Your job needs to be really interesting to justify not leaving when doing so could mean doubling or tripling your salary. At the same time, the kind of remediation that was needed for the Y2K stuff wasn’t all that interesting. Then ITS was formed and we had to do SSN remediation and by the time that was over the culture (and people in management positions) had really changed.

Still, I wanted to point out that there was a well thought out, coherent philosophy and strategy behind the way Data Processing/ACS was managed, and that it worked quite well for many years.

HyperCard!

Ars Technica has a good retrospective on HyperCard, a program I still remember fondly. (In fact, I still have my copy of HyperTalk 2.2: The Book in a bookshelf at home.) I wrote a JCL tutorial in HyperCard; UT developers who’ve taken my JCL class have seen some of the content that was in it.

When Tim Berners-Lee’s innovation finally became popular in the mid-1990s, HyperCard had already prepared a generation of developers who knew what Netscape was for.

Yep.

Magnetic disk, too

As a bit of follow-up to yesterday’s post, IBM started shipping magnetic disk storage about four years after its first tape drives. Jeffrey McGuire sent me the link to this video, 60 Years Ago: IBM Invents the Hard Drive.

(One impression from this video: people talk about corporations controlling things now, but it feels like it was worse in the 1950’s. Of course, IBM’s corporate culture under Watson Sr. was always a bit creepy.)

Data-driven

Higher Education and the Perfect Data Storm

Some quotes:

In education, data always arrives too late, like Inspector Clouseau, blundering into a scene, oblivious to what’s really going on or who the villain is.  The kind of information data yields is retrospective, not predictive.  Correlation, as we know, is not causation.  To this, I would add mathematization is not explanation. I just learned “mathematization” is among the “bottom 20% of lookups” in the online Merriam-Webster’s Dictionary; what exactly does this tell me?

Even analyzed data is haunted by forging, fudging, trimming, and cooking, along with confirmation bias and egocentric thinking.

I am going to make a radical suggestion about data and higher education:  colleges and universities will be better served if they avoid kneeling at the altar of data and instead fill key positions with people driven by intuition, experience, values, conviction, and principle.  A good place to start would be looking for leadership guided by a transcendent educational narrative.

An article at the Register yesterday ranked buzzwords’ credibility from -1.0 (not at all credible) to +1.0 (credible). “Data-driven” was rated -0.76.

Microsoft a top Linux contributor

Microsoft makes Top 20 list of Linux kernel contributors

For contributions made to the kernel since version 2.6.36, Microsoft ranks 17th, with Redmond’s contribution estimated at 1 per cent of the whole. The top contributing companies were Red Hat, Intel, and Novell. Samsung and Texas Instruments were also named as fast-growing contributors, reflecting an increase in interest in Linux for mobile and embedded systems.

Times have changed.

No, not a good idea

I’ve had the unpleasant experience of trying to help someone debug some Java code. We had a stack trace, and it pointed to a line where a new exception was being thrown within a catch block. So all that’s needed is to look at the try block and figure out where it was throwing the original exception, right? Only problem: the try block is 140 lines long.

Why would someone do something like this? Well, beyond not really understanding the importance of modularization, undoubtedly the reason was to work around one of Java’s features: in a method’s signature you have to declare all the checked exceptions it might throw. So he (I know who the guilty party is; that’s the correct pronoun) wrapped everything in a try block and threw his own exception so the compiler would accept his code. The reasoning behind this feature is clear: they’re trying to force programmers to think about what errors might occur and how they should deal with them. But of course you can’t really force someone to think if he doesn’t want to; people will just do something goofy to subvert your restrictions.
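For the curious, the anti-pattern looks something like this. (A hypothetical sketch; the class and method names are invented, not the actual code in question.)

```java
import java.io.IOException;

public class WrapEverything {

    // The goofy workaround: one giant try block swallows every checked
    // exception and rethrows a single unchecked one, so the method
    // signature stays "clean" and the compiler stops complaining.
    static String processBad(String input) {
        try {
            // ...imagine ~140 lines of parsing, I/O, and conversions here...
            if (input == null) {
                throw new IOException("no input");
            }
            return input.trim();
        } catch (Exception e) {
            // Which of the 140 lines threw? The stack trace now points here.
            throw new RuntimeException("something went wrong", e);
        }
    }

    // What the language feature intends: declare the checked exception
    // (or use a small, focused try block), so the signature documents
    // what can actually fail.
    static String processGood(String input) throws IOException {
        if (input == null) {
            throw new IOException("no input");
        }
        return input.trim();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(processGood("  hello  "));
        try {
            processBad(null);
        } catch (RuntimeException e) {
            // At least the original cause survives via getCause().
            System.out.println(e.getCause().getClass().getSimpleName());
        }
    }
}
```

Note that the wrap-and-rethrow version isn’t wrong per se; keeping the original exception as the cause is the one thing the guilty party did right. The sin is the size of the try block, which destroys the stack trace’s usefulness.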

Deleting code

It may seem paradoxical, but even though a programmer’s job is to write code, one of the most productive things he or she can do is remove code. Software AG changed the way a particular module works, so I’m having to change a system routine to fix a resulting error.* This routine was originally written by Bill Wagner in the mid 1980’s and I’ve been maintaining it for close to a quarter century, and as I was looking at it I realized that the reasons for a lot of the things it’s doing disappeared years ago. When I get done it should be significantly shorter than it is now.

This reminded me of a story about early Macintosh development: -2000 Lines of Code. It’s a great story, and a great site.

* Technical details: ADALNKR, the reentrant module for calling Adabas at the Assembler level, now dynamically allocates working storage. If you call it repeatedly, it will leak memory, and our installation’s IEFUJV, the job verification exit that runs in the JES2 address space, calls it repeatedly. (This is what caused our emergency IPL on November 4.) Last week they finally sent us documentation for how to ask for this memory to be released.