NPLIHK: JCL

I posted this on my personal blog last week, and just realized I forgot to cross-post here.

So this is going to be a twist on my series on Programming Languages I Have Known. I’m going to talk about something that is often called a programming language but isn’t one: the Job Control Language (JCL) of OS/360 and its successors.

Before talking about JCL, though, let’s start here. I assume everyone reading this (both of you!) is using a desktop or laptop computer, or a smart phone. (I wonder how much longer we’ll say “smart phone” instead of just “phone”.) When we decide what program (or “app” as we tend to say these days) we want to use, if we’re on a desktop or laptop we find the icon that represents it and double-click with our mouse or trackpad or trackball. This is made possible by graphical displays and pointing devices. If we’re on a phone, we again find the app icon, but now we just press on it, because the screens on our phones have special hardware that can sense touches. My point here is that how we interact with our devices and signal what program we want to run depends on the available technologies for input and output.

The original Macintosh was revolutionary for its time because it had a graphical display and a mouse. Before that, the computers most people used had text-based terminals that could only display letters, numbers, and punctuation. You told the computer which program to run by typing the program’s name, perhaps with some additional information, and then pressing “Enter”. This is now called a Command Line Interface (CLI), while what the Macintosh offered is called a Graphical User Interface (GUI). (CLIs still exist. I use one almost every day. They are harder to learn than GUIs, but once you have learned them they can be faster and more flexible than a GUI.)

In the early 1960s, when IBM was developing System/360, the principal technology for inputting data into computers was Hollerith or “punch” cards. So when they needed a way for people to tell these computers what program to run, they designed it for that interface. The result was JCL. So, for example, JCL has an EXEC statement that you use to tell the computer which program to run. It has DD (Data Definition) statements to specify which files to use for input or output, somewhat analogous to the “Save” or “Open” dialogues in the Mac or Windows interface.
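To give a flavor of what this looks like, here is a minimal sketch of a job that copies one file to another using IBM’s IEBGENER utility. The job and dataset names are made up, and the allocation details would vary by installation; the DD names (SYSUT1, SYSUT2, and so on) are the ones IEBGENER actually expects.

```jcl
//COPYJOB  JOB  (ACCT),'COPY A FILE'
//*  EXEC names the program to run; DD statements describe the files
//STEP1    EXEC PGM=IEBGENER
//SYSPRINT DD   SYSOUT=*                     messages go to the spool
//SYSIN    DD   DUMMY                        no control statements
//SYSUT1   DD   DSN=MY.INPUT.FILE,DISP=SHR   the input dataset
//SYSUT2   DD   DSN=MY.OUTPUT.FILE,          the output dataset,
//              DISP=(NEW,CATLG,DELETE),     created by this job
//              SPACE=(TRK,(5,5)),UNIT=SYSDA
```

Every statement starts with // in the first two columns, a direct inheritance from the punch-card format, where each line was one 80-column card.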

This is not to say that JCL is particularly well designed. My understanding is that IBM was getting close to releasing OS/360 when someone realized they were going to need something like JCL, and so it was thrown together somewhat haphazardly. Also, more recent operating systems abstract many of the lower-level details about the physical nature of I/O devices that JCL still exposes.

So there you have it: JCL is not a programming language. It’s the friendliest punch-card-based Human Interface language the IBM engineers could throw together at the last minute in 1964 or ’65.

PLIHK: Natural

Cross posted from my personal blog.

Time to get back to this series. (If you’re just joining me here, PLIHK = “Programming Languages I Have Known”, so these are posts about programming languages I have used.)

After a year teaching school, I decided I couldn’t afford to do that any more, so I started looking for another job. After a summer of looking, I was hired by the Data Processing Division at the University of Texas at Austin in the fall of 1987, and I have worked there ever since. (Not, strictly speaking, in the Data Processing Division, which has been renamed, merged, split up, and otherwise reorganized many times over the years.)

At the time, and for nearly another decade afterwards, administrative applications at the University were written in the Natural programming language from Software AG. Natural was originally designed for querying Adabas databases (Adabas being Software AG’s database management system) and expanded to support updating and maintaining data in that DBMS as well. Its target was COBOL programmers, and it bears some resemblance to COBOL, but with integrated facilities for accessing databases. It also has features that ease writing reports from the data and laying out fields for interacting with the program via a 3270 terminal. Since the applications we were writing back then consisted pretty much exclusively of batch jobs and 3270 interactions centered on querying and maintaining databases, Natural fit our needs almost perfectly.

Natural is very much a procedural language, with relatively few features for modularization. (Some “object oriented” extensions were added a couple of decades ago, but they were implemented clumsily—it was more about supporting Microsoft’s COM scheme than actual OO—and we never used them at the University. My impression is that we weren’t the only ones who never used them.) I also believe that Natural has been so good at accessing Adabas that it has held the DBMS back. Although Adabas is a very capable, flexible, and performant DBMS, once Software AG started licensing Natural to go with it they made only half-hearted efforts at making it accessible from other programming languages, so Natural’s weaknesses have in effect become weaknesses of Adabas as well.

Today I still do occasional programming in Natural. Just this week I was updating a batch Natural program I wrote last year to handle some kinds of data I hadn’t anticipated. And Natural has evolved and become more capable over the years: when I started at the University we were just beginning to migrate from Natural version 1 to version 2; the current version is version 9. However, it has become a niche language that only has limited use cases. I also think its proprietary nature has held it back. Significant enhancements only happen when Software AG sees the benefit from them, and the community of Natural programmers is limited to people working for companies that have licensed it. If you work for one of those companies, enjoy writing in Natural. If you don’t, you’re probably not missing much.

PLIHK: Pascal

(Once again cross posted.)

This will be another short one, since I haven’t used Pascal in decades.

Although my supervisor at CDI recommended me for a promotion, a recession hit and I was laid off instead. I was able to get a job as a high school and middle school science teacher for the Chilton Independent School District. (The district only had a single campus, for all grades K–12.) Two of the classes I taught were eighth-grade computer literacy and a computer science class for upperclassmen. We had a room full of Apple IIe computers, with programs like Oregon Trail and Math Blaster and, of course, a BASIC interpreter. I had read, though, that BASIC was out of favor with the academic community and that the AP exam for computer science assumed the students knew Pascal, so I persuaded the school board to purchase licenses for a Pascal compiler.

My memory of Pascal is that it was a pretty clear and clean language that resulted in easy-to-read programs. Unlike the “spaghetti-code” style of typical BASIC programs, it pretty much enforced structured programming. So, although I haven’t had any occasion to use Pascal since then, I have positive memories of it.

PLIHK: Rexx

(Cross-posted from my personal blog.)

The other language I learned while at CDI was Rexx. First, a little background on VM/CMS.

In CMS, files have an eight-character file name, an eight-character file type, and a one-character code specifying which minidisk the file is on. The original CMS command interpreter supported simple scripting by putting commands in a file with type EXEC. This is very similar to shell scripts in Unix. The EXEC scripting language was very basic, so after a while an extended version, EXEC 2, was added, with a few additional features and the removal of some restrictions, like token size and line length. The file type was still EXEC; you started an EXEC 2 script with a &TRACE statement (which doesn’t exist in EXEC) to distinguish between the languages. Then in 1982 IBM released a newer, much more powerful scripting language named Rexx, created by Mike Cowlishaw. The file type was still EXEC; this time you started a Rexx script with a Rexx-style comment (which I think has to include “REXX” somewhere in the first line) to let the system know it should be processed as a Rexx script. So when I started working at CDI in 1985, Rexx was the new hotness in CMS scripting. IBM later ported Rexx to most if not all of its operating systems, and it has been ported by others to many other environments.

Since it was designed for scripting, almost any line of text can be a valid Rexx statement. If the interpreter doesn’t recognize it as Rexx code, it passes it to the host environment as a command for that environment. What the host environment is depends on how or where the script was invoked, and can be changed by the Rexx ADDRESS command. When I was coding at CDI in the CMS environment, the host environment for a script could be the XEDIT editor, CMS itself, or the VM control program (CP). If a Rexx script was invoked as an XEDIT macro, for example, commands would go to the editor by default, but if you needed to, say, issue a CMS command you would just prefix it with ADDRESS "CMS". The mainframe implementations have a fairly well-defined interface so you can create your own environment if that’s what you want to do. There’s also a well-defined API so that assembler programs called from a Rexx script can access and set Rexx variables, and this has been done so that many MVS facilities, like RMF or SDSF, can be automated using Rexx.
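Here’s a sketch of what that looks like in practice, as a fragment of a hypothetical XEDIT macro. TOP and LOCATE are real XEDIT commands, and LISTFILE and QUERY are real CMS commands; the surrounding logic is made up for illustration.

```rexx
/* REXX -- the comment on line one marks this EXEC as a Rexx script */
'TOP'                             /* unrecognized lines go to the   */
'LOCATE /ERROR/'                  /* host, which for an XEDIT macro */
                                  /* is the editor itself           */
address cms 'LISTFILE * EXEC A'   /* route one command to CMS       */
address cms                       /* or switch the default host...  */
'QUERY DISK A'                    /* ...so this goes to CMS too     */
```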

Rexx has an interesting approach to variables. Variable names are case-insensitive: Rexx always internally converts them to upper case. You create a variable by assigning a value to a name. If you reference a variable name that has never had a value assigned to it, Rexx just uses the (upper case) name. If you want to make sure something that could be a variable name is just treated as text instead of a variable, you put it in quotes. As an example, suppose you had a Rexx program that included:

bar = "ABC"
"FOO" bar baz

Then, assuming “BAZ” had never been assigned a value, “FOO ABC BAZ” would be passed as a command to the host environment. Another thing about Rexx variables is that their values are always text strings. If you use a variable in an arithmetic statement (like with ‘+’) Rexx will internally attempt to convert the value to a number, perform the operation, and then convert the result back to a text string.
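Since every value is a string, arithmetic is really convert–compute–convert. A small self-contained example:

```rexx
/* REXX -- numbers are just strings that look numeric */
x = "2"
y = x + 3            /* Rexx converts, adds, converts back        */
say y                /* displays: 5                               */
say length(y)        /* displays: 1 -- y really is the string "5" */
```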

Another cool thing in Rexx is “stems”, which are basically associative arrays. (This is the only kind of array Rexx supports.) A stem is a variable name followed by a period followed by a variable or value which serves as the key to identify which element of the array is being referenced.
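A short example, counting word frequencies. (WORDS, WORD, and SAY are standard Rexx built-ins; note that stem tails are case-sensitive, which is why the text here is upper case.)

```rexx
/* REXX -- a stem used as an associative array */
count. = 0                      /* assigning to the stem itself sets */
                                /* a default value for every element */
text = "THE QUICK FOX BIT THE DOG ON THE NOSE"
do i = 1 to words(text)
   w = word(text, i)
   count.w = count.w + 1        /* count.w: element whose key is the */
end                             /* value of w                        */
say "THE occurs" count.THE "times"   /* displays: THE occurs 3 times */
```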

Rexx has a fairly powerful PARSE command for extracting data from text strings. (As far as Rexx is concerned, all data is a text string.) It’s not as powerful as the regular expressions available in Perl, Ruby, Python, and other more recent scripting languages, but on the other hand it’s very good for extracting data from fixed positions, which is non-trivial (if even possible) for regular expression engines.
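As an illustration, here is fixed-column parsing of a hypothetical 30-byte record with three 10-byte fields:

```rexx
/* REXX -- PARSE with a column template */
record = "SMITH     JANE      AUSTIN    "
parse var record 1 last 11 first 21 city 31 .
say strip(last)',' strip(first)',' strip(city)
/* displays: SMITH, JANE, AUSTIN */
```

The numbers in the template are absolute column positions; you can mix them with literal string patterns and ordinary word-by-word parsing in the same template.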

The Wikipedia article I linked says that Rexx is considered a precursor to scripting languages like Tcl and Python. In most cases I would prefer to use Python, but if you’re working with one of the environments with robust Rexx integration it is a good choice.

PLIHK: Assembler

Cross posted from my personal blog.

(Specifically for IBM System/360–System/370–Z systems.)

After I dropped out of graduate school my first job was as a plotter operator for a computer aided design/drafting shop run by CDI. We had a crew of about a half-dozen drafters working on IBM 2250 graphics terminals attached to an IBM 4341 computer, and a big flat-bed plotter. When they had a schematic or diagram they wanted plotted, I would write the plot instructions to tape on a reel tape drive attached to the 4341 and then move the tape to a drive attached to the plotter. After positioning the paper (or occasionally plastic) to write on, and putting ink in the pens and the pens in the plotter head, I would start up the plotter and it would read the tape and move the head around to make the plot. (A big vacuum under the plotter bed kept the paper or whatever flat on the bed.)

Anyway, this left me with a lot of time sitting around, which I put to use learning to run and operate the computer, so I was eventually promoted to computer operator. (My boss, Ray Tamar, was going to put me in for a promotion to systems programmer, but a recession hit and I was laid off instead.)

The 4341 implemented the IBM System/370 architecture. We ran the VM/370 operating system, with an OS/VS1 guest supporting the actual drafters, who were using a program called CADAM from Lockheed. While I was there IBM stopped supporting OS/VS1, but about the same time Lockheed provided a CADAM version that ran under CMS (a single-user operating system that only runs as a VM guest), so we converted to that. Anyway, part of what I taught myself on that machine was how to program using IBM’s Assembler for System/370. I remember one of the projects I had was to replace the generic CADAM logon screen with a custom one using the CDI logo. This meant also learning the Graphics Access Method (GAM) for programming the 2250 terminals. My assembler skills would later come in handy when I started working for the University of Texas.

So what can I say about assembler? If you care at all about programmer productivity, and there are any other options, you should probably not write in assembly language. On the other hand, if you really want to understand how the computer works at the lowest level, assembly language is what you want to learn. You’re right there, manipulating registers and looking at memory addresses and learning the individual instructions and so on. Also, at least in the IBM mainframe world, there is a lot of “legacy” assembly language code out there, so this is a valuable skill to have if that’s where you’re working.

PLIHK: BASIC

(Cross-posted from my personal blog.)

Shortly before I finished my mission, my father quit IBM and started working for Billings Computer, a company that built microcomputers in Provo. So after my mission I lived at home, where there were several Billings computers available for us to play with. Like most microcomputers of that time, Billings computers came with a BASIC interpreter so you could write your own programs.

As an aside, Microsoft is celebrating its fiftieth anniversary this year—if Wikipedia is to be believed, April 4 is the actual anniversary of its founding—and two days ago Bill Gates posted a remembrance that includes a downloadable PDF of the source code for the BASIC interpreter they wrote for the Altair 8800. This was the first thing Microsoft (Micro-Soft in those days) ever did.

Anyway, I did play around with writing programs in BASIC on those computers. I remember I wrote a game to play Blackjack.

Since it’s been years since I did any BASIC programming I don’t remember much about it. I’m sure the language has changed since then. It was designed to be easier to learn than languages like FORTRAN and COBOL (not to mention assembler or machine code) and I think it succeeded in that. Like most languages of that era, though, it lacked facilities for building higher-level abstractions. I don’t feel any nostalgia for BASIC at all.

PLIHK: Machine code

(Cross-posted from my personal blog.)

My other coding experience in my senior Physics lab was in one of the more fun modules: a digital voltmeter. We had a breadboard with a simple processor chip, memory chip, EPROM, etc., along with another chip that was either a digital-to-analog converter or a digital signal processor, I don’t remember which, and a bunch of other components like resistors and capacitors. We had to wire it all together, and then write a program that would connect the capacitor to the voltage we wanted to test, let the capacitor charge, and then let it discharge through a known resistance. While it was discharging the program went into a loop that incremented a counter, and when the voltage reached a low threshold the DSP or D2A chip (whichever it was) would interrupt the processor; you could then use the counter to calculate the voltage, knowing the capacitance and resistance. You had to write this program in the machine code for the chip, and then enter it by pressing a toggle until the hexadecimal value for the next byte of the program appeared on a two-character display and then pressing a button that stored it in the next memory location. You also had to look up exactly how long each instruction in the loop took to execute so you could convert the counter value into seconds.

I’m not being sarcastic when I say it was fun. Maybe I was meant to go into programming.

PLIHK: APL

(Cross-posted from my personal blog)

I had encounters with programming in another BYU class, my senior Physics lab. This class was billed as “things about labs and such every Physics graduate should know”, but in practice seemed more like “how many ways can we torture Physics students one more time before giving them a degree?” It consisted of a series of modules that were only related in having something to do with Physics and something to do with labs.

Anyway, one of the modules involved writing a series of programs in APL, on a teletypewriter connected to a server computer via modem. So yes, you’d go into the lab, dial the server’s number on a telephone, and then place the handset on the modem attached to the teletypewriter and hope the connection worked.

APL stands for “A Programming Language”, which to be fair is correct as far as it goes. Now, I haven’t had anything to do with it for nearly half a century, so this is how I remember it. It was very mathematically oriented and very concise. It required a special keyboard because most of the operations were specified with mathematical symbols. Some of these symbols were for things like matrix addition and multiplication, so you could write a program that did a lot of calculating in only one or two lines of code. The downside of this concision was poor readability. More than once I wrote a program, got it working, and then the next day looked at it and couldn’t figure out what it was doing or how it did it. I’ve jokingly referred to APL as a “write-only” language. It did have, on the other hand, the idea of a “workspace” where your programs, data, and results were stored.
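For flavor, here are a couple of one-liners in the notation as I remember it (after nearly half a century, treat the details as approximate):

```apl
(+/X)÷⍴X      ⍝ the mean of vector X: sum-reduce, then divide by the length
A+.×B         ⍝ the matrix product of A and B, via the inner-product operator
```

That second line is an entire matrix multiplication in five characters—which is exactly the concision, and the readability problem, I’m talking about.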

PLIHK: FORTRAN

I’ve decided to write a series of blog posts in the category “Programming Languages I Have Known” (PLIHK). This is a cross-post from my personal blog. I will understand if you want to bail out now.

My first exposure to a programming language was in elementary school when my father brought home some COBOL manuals for me to read, but since I didn’t have access to a computer to actually write and test programs then I didn’t really learn anything from them. I will make no claims to understanding COBOL.

Instead, my first actual experience writing programs was when I took a Numerical Methods course as an undergraduate. I found out the week before class started I was expected to know FORTRAN, but my brother Erick had taken a class in it and still had his textbook, and the Numerical Methods textbook (which I think I still have somewhere) had many examples, so it really wasn’t hard to pick up enough to do the assignments.

FORTRAN was an amazing accomplishment for its time, and it proved that programming in a level above machine code or assembler was not only possible but desirable. So I’m not going to bad-mouth FORTRAN at all. That being said, an awful lot has been learned about programming language design since then, and the only reasons I can see to use it now are “historical curiosity” or having a large body of existing code needing maintenance that might take more effort to rewrite than is justified.

In case you’re wondering, the main thing I remember from my Numerical Methods class is “floating point numbers are fiddly.”

Back from the cloud?

Basecamp-maker 37Signals says its “cloud exit” will save it $10M over 5 years

…when 37Signals decided to pull its seven cloud-based apps off Amazon Web Services in the fall of 2022, it didn’t do so quietly or without details. Back then, Hansson described his firm as paying “an at times almost absurd premium” for defense against “wild swings or towering peaks in usage.” In early 2023, Hansson wrote that 37Signals expected to save $7 million over five years by buying more than $600,000 worth of Dell server gear and hosting its own apps.

Late last week, Hansson had an update: it’s more like $10 million (and, he told the BBC, more like $800,000 in gear). By squeezing more hardware into existing racks and power allowances, estimating seven years’ life for that hardware, and eventually transferring its 10 petabytes of S3 storage into a dual-DC Pure Storage flash array, 37Signals expects to save money, run faster, and have more storage available.