About RAMBLIN' TECH

Looking back, looking ahead.

TL;DR: Skip down to "Retirement"

Introduction

My first recollection of a computer, aside from the images of tape drives depicted in popular media as "computers", is a Digi-Comp given to me by my paternal grandparents when I was probably 8 or 9 years old back in the late 1960s.  Though I was able to assemble it, the Digi-Comp was too complex for me to understand its operation.  Nevertheless, I was fascinated by the idea of computing and building computers even if I was not entirely sure what they actually did.


High School

Model railroading, model rocketry, and weekend classes at the St. Petersburg Science Center aside, my tech journey began in earnest as a 14-year-old high school freshman in 1973, when our geometry teacher asked the class if anyone wanted to stay after school to learn about a timeshared Honeywell mainframe computer that our school system had gotten access to.  Over the course of a few weeks, a handful of us learned the rudiments of BASIC programming by punching lines of code into paper tape using an offline Teletype and then uploading our programs with an online Teletype over an asynchronous 110 baud dial-up modem with an acoustic coupler.  Computer time was considered too valuable to consume by interactively typing in your programs.

By the time I was a sophomore in 1974, my high school was offering Computer Math I, which was a full semester of BASIC, and Computer Math II, a semester of FORTRAN.  I sequentially took each of these courses as electives and then, as a junior, began collecting and reading Byte and Popular Electronics magazines.  While owning a personal computer was entirely beyond my financial means, I spent countless hours dreaming of building one, as I was inspired by the articles and advertisements in Byte, and particularly by the series of articles on the COSMAC Elf that appeared in Popular Electronics when I was a senior.

During my junior year in high school I decided on a career in a STEM field (even if the "STEM" acronym had not been coined yet) and began reading about universities with a STEM emphasis.  Growing up in a working-class Florida neighborhood, I did not know any engineers or computer programmers, so I was not sure what those professions actually entailed, but I did know that whatever field I entered, I wanted to work with computers.  My reading (in print media of course, no Internet back then) led me toward an interest in computer science as a major and the Georgia Institute of Technology for a college education.

I did well enough taking the SAT as a junior that I applied to Georgia Tech with those results and requested early consideration for acceptance.  I got a fat envelope over the summer after my junior year and was free from worrying about college acceptance during my senior year.  Instead, I spent my free time that year poring over issues of Byte and Popular Electronics, and bread-boarding simple digital circuits using 7400-series TTL integrated circuits.


College

I arrived on the Georgia Tech campus in the Fall of 1977, enrolling in the School of Information and Computer Science's BS program, and believing that I could breeze through.  After all, I had already learned to write elementary BASIC and FORTRAN programs, could wire up trivial digital circuits, and had a collection of Byte magazines; what more was there to learn?  The adage that "I didn't know what I didn't know" was never more true, and it began dawning on me by my sophomore year.  I slogged through the freshman and sophomore ICS coursework using punched cards as a freshman, and then, as a sophomore, the interactive video terminals that were becoming more widely available for those willing to queue up to use them.  Underclass ICS coursework requiring programming largely used the Institute's CDC Cyber timeshared supercomputer, available as a resource for all academic disciplines, with Pascal being the preferred teaching language of the School of ICS.

With my upper-class electives, I gravitated toward operating system and data communications courses, along with electrical engineering service courses ("shocks for jocks"), as I felt these were more interesting and comprehensible than other CS concepts I had more difficulty grasping.  Upper-class ICS coursework typically utilized computing resources in ICS-specific labs, primarily a Prime minicomputer with a relative abundance of interactive terminals.  When it came time to pick a senior design project topic, I figured that the culmination of my BS studies should be a confirmation that I had learned how computers actually worked: I would design and build a computer from scratch using MSI/LSI ICs.  How hard could it be?

For my senior design project, I settled on designing and building a computer using microcoded, bit-sliced ALUs to implement the CPU. I named my project the "Peach" as a wink at the increasingly popular computer line produced by a growing California company called "Apple" and then spent time devising the instruction set. As the instruction set would only need to execute a trivial test program to demonstrate functionality, I evaluated the trade-offs of implementing its opcodes in microcode using vertical versus horizontal encoding. In the end, I went with horizontal as the instruction set was so small and simple that I did not want to introduce the additional complexity of vertical decoding circuitry.
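
To make that trade-off concrete, here is a minimal sketch in Python (the field names, widths, and bit assignments are all invented for illustration and bear no resemblance to the actual Peach microword): a horizontal microword dedicates bits directly to control signals and needs no decoding logic, while a vertical microword is narrower but must pass through extra decode circuitry first.

```python
# Hypothetical 16-bit horizontal microword: one field per control signal,
# so bits can drive the ALU slices and bus logic directly -- no decoder.
HORIZONTAL_FIELDS = {             # name: (bit position, mask)
    "alu_op":    (12, 0xF),       # 4 bits straight to the ALU function pins
    "src_reg":   (8,  0xF),       # register-file source enables
    "dst_reg":   (4,  0xF),       # destination latch enables
    "mem_read":  (3,  0x1),
    "mem_write": (2,  0x1),
    "branch":    (1,  0x1),
    "halt":      (0,  0x1),
}

def horizontal_signals(microword: int) -> dict:
    """Horizontal decode is pure wiring: just pick bits out of the word."""
    return {name: (microword >> pos) & mask
            for name, (pos, mask) in HORIZONTAL_FIELDS.items()}

# A vertical microword is much narrower (here 3 bits), but its encoded
# opcode must pass through extra decode circuitry (modeled as a table)
# before it can drive anything.
VERTICAL_DECODE = {
    0b000: 0x1110,   # e.g., ALU add, register to register
    0b001: 0x0008,   # memory read
    0b010: 0x0004,   # memory write
    0b011: 0x0001,   # halt
}

def vertical_signals(microword: int) -> dict:
    """Vertical decode: expand the opcode, then extract the same signals."""
    return horizontal_signals(VERTICAL_DECODE[microword & 0b111])
```

With an instruction set as tiny as the Peach's, the wider word was a small price to pay for eliminating the decoder.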

The Peach was intended to be a learning/instruction device, not a full-blown computer capable of running an OS and applications. Toward that end, I decided against using an oscillator to provide clock pulses.  Instead, my plan was for the operator to manually pulse the clock through a button.  With each clock button pulse, the operator could follow the state machine of the circuitry by observing logic levels on data and address busses through attached LEDs, and also selectively see levels on other IC pins by applying a logic probe. The operator would be able to literally see how a computer operates by watching microcode words get decoded and drive the ALUs and other components, one clock pulse at a time.

The next task was studying the pin-outs of the various components that would be needed to implement the Peach: ALU, memory, I/O devices, and assorted logic chips to glue everything together.  My design project adviser, Pete Jensen, helped me procure components that were not readily available in the lab, which was followed by weeks of me bread-boarding the entire contraption.  During this time, I felt like I was pulling together everything that I had learned during my four years of college.

With my final term rapidly coming to a close, it was time to program the EEPROM with the microcode and start executing instructions, one clock pulse at a time.  It was then that I experienced the biggest I-didn't-know-what-I-didn't-know of them all: the EEPROM required an erase/program voltage that I had neglected to investigate and was not prepared to supply.  When I looked around the lab for some way out of this bind, I found that the only power supply capable of providing it was not working.  As a CS guy who thought I knew a lot more about hardware than I actually did, it never occurred to me that I would need anything more than +5 VDC for my project, and now I was out of time to do anything about it.  Dejected, I showed my advisor what seemed like miles of wiring on the breadboard and explained the situation.  He told me to submit the write-up of my design, to which he gave an "A", out of pity I suppose.  In retrospect, the Peach probably never would have worked as then bread-boarded even if I had been able to program the EEPROM, as I was equally ignorant about the criticality of debouncing circuits and suppressing noise with capacitors.  After four years of study, I still didn't know what I didn't know.  Sigh.


Career Entry and Grad School

Graduating into an economic recession in the Spring of 1981, it was not as easy to land a CS-related job as I had envisioned four years earlier, though some of the difficulty might also be attributed to the complete collapse of my self-esteem after the Peach debacle.  My job interviews with employers who manufactured standalone or embedded computer systems only reinforced to me that I was woefully ignorant of all things computing.  Finally, a month before graduation, I landed a local job offer from Lockheed-Georgia in Marietta, Georgia, for a role in their IT sys admin group that maintained their timeshared scientific and engineering computer systems: the Univac 1100 and DEC VAX 11/780.  My initial assignment was to assist in the project to transition the Univac user base from FORTRAN 66 to FORTRAN 77.  It was not particularly challenging work technically, but it was a solid income for someone who came from a working-class background.  It also marked the end of my dream of designing and building computers as my profession.

My first year of professional employment was in many ways like my last year of college enrollment in that I still had friends and roommates who were working toward graduation, I still spent entirely too much time in college hang-outs, and I still did not have a great deal of disposable income (having racked up debt with new car and wardrobe purchases immediately upon getting my Lockheed offer).  I still wanted to own a personal computer, but that purchase remained beyond my financial grasp.  Even consumer-grade gaming PCs like the VIC-20 or Atari 400 were unaffordable for me without incurring even more debt.  Unfortunately, a truly inexpensive computer line, the Sinclair ZX80 and ZX81, was available only in the UK.  The concept of the Sinclair was very much like that of the Raspberry Pi that would also originate in the UK thirty years later: a hobbyist/educational computer affordable by everyone.  But then one day I saw in a magazine that Timex would be marketing the Sinclair ZX81 in the US for under $100 if you wanted the assemble-it-yourself kit version.  I placed my order for the kit as soon as it was available.

When the Sinclair kit arrived, I tore open the packaging, dug out the old soldering iron that I had owned since high school, and hurriedly soldered the 40 pins of the Zilog Z80 microprocessor to the printed circuit board.  I then grabbed the next component to be soldered, but something just did not look right with the board.  I stared at the PCB for a while and then it dawned on me: I had soldered the Z80 to the bottom of the board, rather than the top!  An IC manufactured as a Dual Inline Package (DIP) can physically fit into the pin holes from either side of the PCB, but it will function correctly only when installed as the PCB designer intended.  Horrified at the sloppiness born of my own zeal, and thinking that I had not done anything this boneheaded since my Peach misadventure, I grabbed my solder-sucker bulb, painstakingly removed the solder from all 40 pins, and pried the Z80 off the PCB.

After carefully re-soldering the Z80 onto the PCB, along with the other components in the kit, I connected the Sinclair to the 12" black and white CRT TV that I had also owned since high school.  The video connection was made via an integrated analog RF modulator, as digital video interfaces (DisplayPort, HDMI, Thunderbolt, etc.) would not come along until years later.  A wide smile crossed my face as the Sinclair booted up and presented its command line prompt!  I had not destroyed it with my soldering, de-soldering, and re-soldering, and now I was the proud owner of a personal computer with a whopping 17 kilobytes of RAM (yes, kilobytes; not gigabytes or even megabytes): 1KB on the PCB and another 16KB with the memory expansion pack attached.

I next attached my Walkman-knockoff portable cassette player to the Sinclair via its microphone and earphone jacks to try storing a simple BASIC program onto a cassette tape, the common means of storage for inexpensive computers back then.  The principle was similar to that of a modem: a digital bit from the computer was modulated into a sound frequency that represented the bit's value, either "0" or "1".  But instead of transmitting the sounds over a phone line for demodulation on the other end, as a modem does, the cassette player stored the sounds on tape.  To retrieve the stored data, the player simply played the sounds back into the computer, where they were demodulated back into bit values.  It was also as painfully slow as an acoustic coupler modem.
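
For the curious, the scheme can be sketched in a few lines of Python.  This is a hedged illustration of the general frequency-shift-keying idea, not the ZX81's actual tape format; the sample rate, baud rate, and tone frequencies below are invented for clarity.

```python
import math

SAMPLE_RATE = 8000   # audio samples per second (illustrative value)
BAUD = 300           # bits per second -- painfully slow, like the real thing
FREQ_ZERO = 1200     # tone in Hz representing a 0 bit (invented for this sketch)
FREQ_ONE = 2400      # tone in Hz representing a 1 bit (invented for this sketch)
SAMPLES_PER_BIT = SAMPLE_RATE // BAUD

def modulate(bits: str) -> list[float]:
    """Each bit becomes a short burst of one of two audio tones."""
    samples = []
    for bit in bits:
        freq = FREQ_ONE if bit == "1" else FREQ_ZERO
        samples += [math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
                    for n in range(SAMPLES_PER_BIT)]
    return samples

def demodulate(samples: list[float]) -> str:
    """Crude FSK detector: count upward zero crossings in each bit period."""
    # Halfway between the crossing counts the two tones would produce.
    threshold = (FREQ_ZERO + FREQ_ONE) / 2 * SAMPLES_PER_BIT / SAMPLE_RATE
    bits = []
    for i in range(0, len(samples), SAMPLES_PER_BIT):
        chunk = samples[i:i + SAMPLES_PER_BIT]
        crossings = sum(1 for a, b in zip(chunk, chunk[1:]) if a < 0 <= b)
        bits.append("1" if crossings > threshold else "0")
    return "".join(bits)

# Round-trip a byte's worth of bits through the "tape".
assert demodulate(modulate("10110010")) == "10110010"
```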

I could now write BASIC programs on an awkward "chiclet" keyboard, store and retrieve them very slowly on cassette tape, and view output on a low-resolution black and white TV.  But instead of being pleased, I was somewhat disappointed.  I felt that I had no more computing capability than I did in high school, and it was a far cry from the technology I had been exposed to in college.  I realized that if I was not to sink into technological obsolescence and irrelevance, I would need to be challenged by more than a $100 toy computer and answering questions about FORTRAN 77 syntax at work.

To graduate with a bachelor's from Georgia Tech back then, a degree candidate had to complete an exit exam and submit their results to the Institute; for ICS majors, the exit exam was the GRE Computer Science test (since discontinued).  I surprised myself by how well I did on this standardized test and began thinking that maybe I could press the reset button on my academic career and pursue a master's degree.  Lockheed-Georgia had an extremely generous program with Georgia Tech whereby they would pay all tuition and book costs (in advance, no less) for approved part-time STEM degree programs.  With the encouragement of my Lockheed manager, I submitted my GRE results and somewhat unremarkable undergrad transcript back to Georgia Tech and was admitted into their ICS master's program for the Fall term of 1982.

In the same time-frame, early 1982, my Lockheed manager asked our team if anyone wanted to be involved with replacing the AT&T Western Electric modems we were using across our enormous campus (spanning Dobbins Air Force Base) with less expensive non-AT&T equipment.  Looking for a new challenge, I raised my hand and volunteered.  Since 1974, antitrust legal challenges to AT&T's telephone monopoly had been attempting to force the Bell System to open up to customer deployment of third-party equipment.  These challenges culminated in the Modified Final Judgment (1982) and the ultimate break-up of the Bell System in 1984, with AT&T divesting its local operating companies into independent Regional Bell Operating Companies.  With the impending Bell System break-up, Western Electric no longer had a monopoly on equipment deployments in the RBOCs.  In our specific case, we were to replace Western Electric Bell 208A leased-line modems operating at 4800bps with 9600bps modems from Paradyne.

Thus began the convergence of two undertakings that would shape my entire professional career: graduate course work on a part-time basis and data communications projects in my full-time job. My manager had himself earned a master's degree while working full-time at Lockheed and was very accommodating with my academic schedule; his basic requirements for me were to not let my work responsibilities slip, and to put in 40 hours.  I was then free to schedule a course in the middle of the day and make up the time by working extended hours. I took only one course per term (quarters then, not semesters) as I did not want to be crushed by a full-time job in addition to carrying the workload of multiple classes.

Having taken a course on data communications as an undergrad with Prof. Phil Enslow, one of the first graduate courses I signed up for was another with him on computer networking. This was later followed by three more special topics courses with Prof. Enslow on computer networking where the grad students focusing on that area extensively studied the theories behind circuit switching, packet switching, and the emerging OSI reference model. With completion of additional graduate course work in complementary areas, including queueing theory, I began to think of telecommunications and computer networking as my field.

Career Path

In the early 1980s, the Lockheed-Georgia Information Systems organization was entirely computer-centric.  That is, applications programming and system administration groups were organized around specific mainframes and minicomputers, of which there seemed to be an abundance.  There was the Univac 1100 mainframe and DEC VAX minicomputer for engineering and scientific use, there was the Univac 1100 mainframe to support manufacturing applications, there was the IBM System/370 mainframe for business data processing, there were the computers dedicated to engineering graphics apps (e.g., CADAM/CATIA), and there was the collection of DEC minicomputers to support more ad hoc business productivity apps, just to name a few (there was even an analog computer with plug-boards).  While the IBM personal computer was introduced to the market in 1981, PCs and microcomputers were not yet a factor in a company where a common method of entering lines of code into a mainframe was to have them punched on cards by unionized hourly workers who transcribed them from paper sheets handwritten by salaried programmers.

Interactive terminals were primarily being used on minicomputers, but a growing number were being attached to the mainframes as well.  In this regard, the data communications systems for terminal access (not really computer networks, per se) were also being built in silos, with a given terminal communicating with one and only one mainframe or minicomputer.  In the case of the IBM mainframe, the terminals were 3270 types connected via coax to cluster controllers.  Univac had a comparable, proprietary terminal line in the Uniscope, while DEC had the VT100 terminal line (and its many successors).

The engineering/scientific Univac mainframe was also configurable to support asynchronous terminals like the VT100, rather than just Uniscopes.  With both the Univac and DEC VAX being accessible via async VT100-compatible terminals, the next data communications project my manager had me work on was remote accessibility for each.  Remote in this sense meant users who were still on the Lockheed campus, but not within the distance limitations of RS-232 signals from the computer interfaces (nominally 50 feet, though 200+ feet was not uncommon at lower bit rates).  Terminals were typically clustered in terminal rooms, as having one on your desk was still quite unusual for our users.  If the terminal room was close to the Univac or VAX, individual cables could be run for each terminal, driven by RS-232 signaling.  If the terminal room was farther away (but still within a few miles), individual 4-wire circuits would be leased from Southern Bell for each terminal, with the circuits having their loading coils removed (i.e., "dry circuits").  Removing the loading coils, which acted as low-pass filters, increased the bandwidth of the circuits, enabling higher data rates.  Attached at each end of these circuits would be short-haul modems (a.k.a. line drivers), which converted RS-232 signaling to an electrical format more amenable to the increased distance.  In many ways, line drivers on dry circuits resembled the DSL technology that would appear years later.

Our basic scheme was to multiplex multiple terminals in a cluster room onto a single telephone circuit for connectivity to the host computer.  That is, instead of eight circuits and eight pairs of modems for eight terminals in a cluster, we could install a single Paradyne modem and a single Micom Systems multiplexer with eight terminal interfaces on each end of a single circuit.  The terminals, operating asynchronously at 1200 or 2400bps, would be statistically multiplexed onto a single circuit operating synchronously at 9600bps.  The science/magic of stat muxing allowed the sum of the input rates to exceed the output rate, based on the statistical probability that all inputs would not be active 100% of the time.  This is the fundamental premise of packet switching, which I was studying in grad school and putting into practice at work.
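
To put numbers on it: eight terminals at 2400bps present a worst-case aggregate of 19,200bps to a 9600bps link, a 2:1 oversubscription.  The Python sketch below shows why that rarely hurt (the 10% duty cycle is an assumption for illustration, not a figure measured in our terminal rooms): the odds of more than four terminals transmitting in the same instant are vanishingly small.

```python
import random

TERMINALS = 8
TERMINAL_BPS = 2400   # async input rate per terminal
LINK_BPS = 9600       # synchronous composite link between the two muxes
DUTY_CYCLE = 0.10     # assumed fraction of time a terminal is sending (illustrative)

def oversubscribed_fraction(seconds: int = 100_000) -> float:
    """Estimate how often the offered load exceeds what the link can carry."""
    random.seed(1983)
    overloaded = 0
    for _ in range(seconds):
        active = sum(random.random() < DUTY_CYCLE for _ in range(TERMINALS))
        if active * TERMINAL_BPS > LINK_BPS:  # i.e., five or more active at once
            overloaded += 1
    return overloaded / seconds

# Worst case is 8 x 2400 = 19,200bps offered to a 9600bps link (2:1), yet
# bursts needing more than four simultaneously active terminals are so rare
# that the mux's small buffers absorb them -- the premise of stat muxing.
print(f"fraction of seconds oversubscribed: {oversubscribed_fraction():.4%}")
```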

Connecting terminal rooms for our scientific/engineering users over multiplexers saved Lockheed some amount of money every month in circuit charges, but did not enable a given terminal to communicate with any computer other than the one "hard-wired" to the corresponding interface on the remote mux.  Our next project was to provide flexibility to the users by enabling them to use the same terminal to communicate with multiple computers.  Since the Univac and VAX could both interface to an async terminal, we needed only a way for the user to select which host they wished to connect to for a given session.  Fortunately, such async switching devices (a.k.a. data PBXes) were already on the market and we just needed to evaluate products and pick one.  We chose Micom Systems' Micro600 Port Selector and I was tasked with installing and administering it.  A user's terminal would now be connected to the Port Selector, rather than hard-wired to a single host.  The Port Selector would present the user with a menu of host choices and then establish a "soft" connection, switching the incoming and outgoing async characters between the terminal and host interfaces.
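
Conceptually, a data PBX is just a character-level crossbar.  The Python sketch below captures the essence (the host table and TCP sockets are stand-ins of my own invention; the Micro600 itself switched RS-232 ports in hardware): present a menu, then shuttle characters between the chosen host port and the terminal port until either side disconnects.

```python
import selectors
import socket

# Hypothetical host table: name -> (address, TCP port). The real Micro600
# switched RS-232 ports in hardware; TCP sockets stand in for them here.
HOSTS = {
    "UNIVAC": ("univac.example.com", 2001),
    "VAX":    ("vax.example.com", 2002),
    "CYBER":  ("cyber.example.com", 2003),
}

def run_session(terminal: socket.socket) -> None:
    """Minimal data-PBX logic: show a menu, then relay characters both ways."""
    names = list(HOSTS)
    menu = "".join(f"{i}) {name}\r\n" for i, name in enumerate(names))
    terminal.sendall(f"SELECT HOST:\r\n{menu}".encode())
    choice = int(terminal.recv(16).strip())
    host = socket.create_connection(HOSTS[names[choice]])

    # The "soft" connection: shuttle bytes between the two ports until
    # either side disconnects, tearing down the virtual circuit.
    sel = selectors.DefaultSelector()
    sel.register(terminal, selectors.EVENT_READ, data=host)  # terminal -> host
    sel.register(host, selectors.EVENT_READ, data=terminal)  # host -> terminal
    while True:
        for key, _ in sel.select():
            data = key.fileobj.recv(256)
            if not data:         # one side hung up; end the session
                host.close()
                return
            key.data.send(data)  # relay to the other side
```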

This was the golden era of centralized supercomputers, before the supercomputer became a system distributed across thousands of microprocessors.  The era came to be dominated by Cray Research with their Cray-1 and its follow-on products.  It seemed like every national lab and aerospace company craved getting a Cray to run massive computational models, and Lockheed-Georgia was no different.  If a site could not afford a Cray, a CDC Cyber 7600 was the next best thing.  Our workgroup's customers, Lockheed engineering/scientific users, determined that their computations were taking too long on the Univac 1100 (something of a supercomputer in its earlier days) and on the DEC VAX, so they pushed through a project to lease a 7600.  A peer of mine in the group, who had system admin experience on the Cyber, and I were sent to CDC headquarters outside Minneapolis to oversee the running of representative benchmark programs to confirm the computational power of the 7600.  When the formalities of the procurement process were completed, a Cyber 7600 was installed in our data center.

My post-installation Cyber responsibility was to connect its front-end processor to the Micom Port Selector to enable our interactive users to choose among the Univac, VAX, and Cyber, along with a growing number of other computer systems not dedicated to scientific/engineering computation.  Users at an async terminal would be presented a menu allowing them to pick the particular host computer needed for that session.  As there was a need to move data sets between hosts for processing best suited to a particular computer, a back-end network was also conceived.  This network connection was implemented using Network Systems Corp.'s HYPERchannel product, an early local area network aimed at the supercomputer market with speeds much faster than contemporary, competitive LAN products.  HYPERchannel adapters were available for attachment to the Cyber, VAX, Univac, and IBM systems, but to my consternation, I had very little involvement with this first multi-vendor LAN at Lockheed-Georgia.

The business climate for Lockheed-Georgia in this time-frame, circa 1983, had become more bullish.  When I was hired in 1981, Lockheed's contract to extend the fuselages of C-141A Starlifter cargo planes into C-141B models was winding down.  It was expected that the manufacturing plant would be kept open by building a few C-130 Hercules cargo planes every month, with a total workforce of ten thousand (or fewer) employees on the campus; nine to ten thousand employees was considered the minimum viable headcount for the Marietta site.  However, with the Reagan-era increase in defense spending came a push to increase the Air Force's airlift capability in the short term, as McDonnell Douglas' new C-17 Globemaster would not be operational until the 1990s.  Consequently, Lockheed-Georgia was awarded a multi-year contract in 1982 by the USAF to restart the C-5 Galaxy manufacturing line, with 50 C-5B models to be delivered.  Funding was available for projects (e.g., the CDC 7600) and hiring was underway to double the workforce.

In anticipation of gearing up to support the massive C-5B program, management began re-thinking the structure of the Information Systems organization.  Information Systems leadership determined that the organization would be more efficient if it shifted away from its mission-oriented structure, where IS departments were vertically integrated around their specific customers and the computers those customers used.  Instead, departments would be functionally oriented, with every IS employee performing a similar function, regardless of customer or computer, being in the same group.  Consequently, the tight-knit system admin group I was in was broken up and dispersed among various sys admin groups, customer support groups, and, in my case, a newly formed communications department.

My new communications group and management chain consisted of people I had not worked closely with previously.  I was given a tool bag and sent out on service calls to troubleshoot problem reports with terminal access, starting over as a glorified field technician, or so it felt.  An opportunity arose for me to become more than a technician with a screwdriver and a breakout box when my new management was directed to investigate dial-up access and the security systems associated with it.

In the early 1980s, Lockheed-Georgia's security organization consisted almost entirely of armed, uniformed guards who checked badges at the pedestrian and vehicle gates.  There was no expertise in cybersecurity, let alone anyone dedicated to overseeing it.  The same could be said of the Information Systems organization I was in: there were system administrators who added and deleted user accounts and privileges on mainframes and minicomputers, but no group or individuals dedicated to cybersecurity.  This made both organizations risk-averse when it came to security, with a default posture of "no" for anything new, particularly anything seen as a breach in the perimeter of the assembly plant manned by armed guards.

We take for granted now that being able to remotely access your computer work files and messages promotes productivity, but that was not always the sentiment.  In the early-to-mid 1980s at Lockheed-Georgia, remote access for increased productivity was explicitly deemed insufficient reason to authorize dial-up access for ordinary users; the only legitimate reasons for granting remote access were for IS system administrators to troubleshoot problems in the middle of the night without having to drive into work, and for the company's sales teams to access data necessary to close deals on airlifter sales as they worked around the globe.

Initially, dial-up access was handled manually, with the remote user calling the computer operator (mainframes had human operators) to authenticate themselves with a password.  The initial call would then be terminated and the operator would call the user back at a pre-authorized number, providing two-factor authentication: the password and the pre-authorized telephone number.  However, with increasing pressure from Lockheed's business units to expand dial-up access, it became clear that the manual approach would not scale, as minicomputers did not have operators and mainframes typically had only a single modem connection.

Several automated callback security systems had reached the commercial market by this time, enabling a remote user to call the system, enter their password, hang up, and wait for the system to call back their modem on the phone line associated with that password.  With no computer operator required, dial-up access became possible for unattended computer systems, and multiple simultaneous connections to computer hosts could be supported by connecting a callback system with multiple modems to the Port Selector data switch.  My new manager asked me to help evaluate the commercially available callback systems, and then install and administer the one we selected.
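
In outline, these boxes mechanized the operator's callback ritual.  The Python sketch below is my own illustration of the control flow, with a simulated modem standing in for the hardware; it does not represent any particular vendor's product.

```python
import time

# Authorized users: password -> the single number the system may call back.
# A stolen password is useless unless the thief also answers at that number.
CALLBACK_TABLE = {
    "s3cret": "404-555-0142",   # invented example entries
    "peach":  "404-555-0178",
}

class SimulatedModem:
    """Stand-in for the modem hardware so the flow below actually runs."""
    def answer(self):                 print("[modem] answered inbound call")
    def read_password(self):          return "s3cret"   # pretend user input
    def hang_up(self):                print("[modem] hung up")
    def dial(self, number):           print(f"[modem] dialing {number}"); return True
    def patch_to_port_selector(self): print("[modem] patched through to data switch")

def handle_incoming_call(modem) -> bool:
    """Automated callback: authenticate, hang up, dial the registered number."""
    modem.answer()
    number = CALLBACK_TABLE.get(modem.read_password())
    modem.hang_up()                  # always drop the inbound call first
    if number is None:
        return False                 # unknown password: never call back
    time.sleep(1)                    # give the caller's line time to clear
    if not modem.dial(number):
        return False                 # no answer at the registered number
    modem.patch_to_port_selector()   # hand the session to the Port Selector
    return True

print("access granted" if handle_incoming_call(SimulatedModem()) else "access denied")
```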

It was recognized by both the company's IS and security organizations that the implementation of the dial-up security mechanisms and the granting of permission to use dial-up must be separated to ensure accountability and auditability.  Consequently, it was decided to place the console for the callback system physically inside the security organization's office, with a security staffer entering approved callback numbers into the system.  However, as the callback device did not have a user-friendly interface, nor did the security organization have IT expertise at the time, I was tasked with writing a BASIC program on a PC to serve as an easier-to-use front-end to the system.

Retirement