August 29th, 2022

Computers I have known

Murray Sargent
Principal Software Engineer

A friend recommended that since I got into computers a long time ago, I should post about how computers have changed over the years. Well, here goes a trip down memory lane!

Analog and button pushing

The first computer I ever used was an Electronics Associates analog computer at the Perkin-Elmer Corporation, where I worked as an intern in the summers of 1961–1963. I wired up the machines to simulate aspects of the response of the control systems that guided the balloon-borne Stratoscope II telescope. The telescope could be pointed to an accuracy of 0.1 arc seconds, which is the angle subtended by a dime at two miles. During my third summer, I saw an LGP-30 digital drum computer. One weekend I wanted to see what was going on, so I pushed a button that printed out the progress. That action apparently wrecked the whole weekend run, much to the frustration of Bob Bernard, who ran the machines and called me a button pusher. The LGP-30 was hardly more powerful than the analog computers, although it was suited to solving different problems. The drum had considerable latency, so programs had to be written carefully to catch the magnetized bits optimally. Unlike other computers of the day, which used octal, the LGP-30 used hexadecimal, with f g j k q w representing the decimal values 10–15 instead of A B C D E F.
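The LGP-30's odd digit letters are easy to illustrate with a few lines of modern Python (an illustration only, obviously nothing like period code):

```python
# LGP-30 hexadecimal used the letters f g j k q w for the digit
# values 10-15 where modern notation uses A-F.
LGP30_DIGITS = "0123456789fgjkqw"

def lgp30_to_int(digits: str) -> int:
    """Interpret a string of LGP-30 hex digits as an integer."""
    value = 0
    for ch in digits.lower():
        value = value * 16 + LGP30_DIGITS.index(ch)
    return value

print(lgp30_to_int("w"))    # 15, written F in modern hex
print(lgp30_to_int("1g"))   # 27, written 1B in modern hex
```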

Mainframes

The following year I was a physics graduate student at Yale and learned how to program in Fortran II on an IBM 709 computer. That computer used vacuum tubes and had 32,768 36-bit words, each of which could hold six characters in the BCD character set (0-9 A-Z +-.,()$*). Input was on IBM punched cards and output was on a printer or on a pen-and-ink plotter. We prepared our card decks with IBM 026 and later 029 keypunches. With the 029, you could insert a character by holding the source card and typing a character. In 1964, the Yale Computer Center upgraded to Fortran IV and an IBM 7094 computer, which was made with discrete transistors, had a 2-microsecond machine cycle, and used the same memory architecture as the IBM 709. Since compilations took appreciable time, I used to make simple changes in the binary cards using a keypunch. You had to kill the card checksum before resubmitting the card deck. You could punch more holes or fill up holes with chads that had been punched out. Amazingly, the filled-in holes passed through the card reader without falling out. I learned enough assembly language to understand the machine-language 1s and 0s. I used the computer for calculating graphs in my PhD dissertation and papers on Zeeman laser theory. The Yale Computer Center also had an IBM 1401 computer with tape drives and an IBM 1620 computer, which was a decimal machine. I didn't use either except to read/write magnetic tapes.

One need for magnetic tape was to collect data for a Stromberg Carlson SC4020 Microfilm Printer & Plotter located at Bell Labs in Murray Hill, NJ. Marlan Scully, Willis Lamb, and I made what was likely the first computer movie (Build up of laser radiation from spontaneous emission) in 1965. You can see the movie by thumbing through the corners of Applied Optics circa 1970.

After finishing my PhD in June 1967, I went to Bell Labs in Holmdel, NJ, to work as a postdoc on laser theory. Bell Labs had an IBM 360/65, which used 8-bit bytes and EBCDIC character codes and zipped along at 563 KIPS (thousand instructions per second). The 7-bit ASCII character encoding came out in 1963, but I didn't get to use it until 1973 on a DEC 10. Both character code standards have lowercase letters, although Fortran IV was all uppercase. The card decks required some JCL (job control language), which was sort of awkward and not used on later computers. After a year, I developed the SCROLL math display language and implemented it in Fortran IV. SCROLL was the first facility that formatted and displayed built-up equations on a computer. The notation was Polish prefix.

At the end of my two-year postdoc, I was torn between joining the computer science department at Bell Labs in Murray Hill, NJ, and becoming an Assistant Professor of Optical Sciences at the University of Arizona. I went to the latter partly because Marlan, Willis, and I wanted to write a book on laser physics. The U of A had a CDC 6400 with 60-bit words, an 18-bit address space, 1 MIPS, and magically no JCL!

I found out about a special-projects program on computers hosted by the U of A Electrical Engineering department and volunteered to teach a course on comparative programming languages. After a year or so, I concluded that it would be good to formalize the program into a department of its own. I called Ralph Griswold, a Bell Labs colleague, and asked him if he'd be interested in such an endeavor. It was perfect timing. He had been interested in a change, and creating a computer science department in an exotic location was compelling. See 50 Years of Computer Science 1971–2021.

Soon we had a Digital Equipment DEC 10 time-shared computer! You could dial in with a 110-baud teletype terminal, or better yet with a 300-baud CRT terminal. I never could abide 110 baud, but I used 300-baud connections for a while. Then I got access to a Tektronix 4010 graphics terminal which sped along at 9600 baud. That could fill up a 24-row screen in a mere second! And you could graph formulas on it. The DEC 10 had 36-bit words and an 18-bit address space. It also had an extended addressing capability consisting of multiple segments of 18-bit address spaces. A similar segment-offset architecture was used later in the Intel 80286 microprocessor.

Microcomputers

I spent 1975–1976 on sabbatical at the University of Stuttgart and the Max Planck Institute for Solid State Research and learned many things, one of which was that something called a microprocessor was being used in fledgling computers. On December 24, 1976, I bought for $2500 and assembled an IMSAI 8080 microcomputer kit. It had a whopping 48 KB thanks to a dynamic RAM card that one of my physics colleagues said would never be reliable. “Stick with the robust 8 KB static memory cards!” he urged. The IMSAI was like the Altair 8800 that Bill Gates wrote his famous 4K and 8K Basic interpreters on. The 4-MHz Zilog Z80 microprocessor was considerably more powerful than the 2-MHz Intel 8080, so I installed a Z80 processor card in the microcomputer’s S-100 bus. The IMSAI 8080’s front panel had 22 switches and many LEDs. I rewired the front panel so that the 8 status LEDs could be controlled by software and set them up to display the contents of the memory byte pointed to by the address switches. I custom wire-wrapped most of the cards in the computer. There was a ROM with a 2 KB monitor program that let you examine and change memory. That program was the start of what evolved into my SST debugger. I added a CRT terminal, a modem, a floppy-disk drive, and a board with programmable relays to control the house lights and the front-door keypad. A friend who worked at a garage-door opener company down in Nogales, Sonora, gave me some garage-door openers that we used to control the house lights and the front door. A far cry from today’s smartphones! The whole system was hard wired, since WiFi didn’t exist back in the 1970s. One advantage was that it couldn’t be hacked (until I opened it up to remote control via modem). There were manual overrides for all functionality, since I didn’t really trust computers! All programming was in tight Z80 assembly language.

64 KB sounds minuscule by today’s standards, with our gigabytes and terabytes and subnanosecond machine cycles. But it was impressive how much we could do with so little. In addition to writing and printing books and papers, we could control experiments. A nifty example was Rick Shoemaker’s measurements of photon echo. For that, you subject a medium to two pulses of light separated by a time interval τ and then watch for a light echo a time τ afterwards. But the experimental apparatus was very noisy, so the echo was drowned out if you only measured it once. If you measure it many times and add the results, the noise averages out to an overall flat background and the echo appears on top. But who wants to measure something thousands of times? Enter a microcomputer, which was happy to sit there and do so!
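The averaging trick is easy to demonstrate with a toy simulation (the numbers below are made up for illustration and have nothing to do with the actual apparatus):

```python
import random

random.seed(1)

N_POINTS, ECHO_POS, ECHO_AMP, NOISE_STD = 100, 50, 0.2, 1.0

def noisy_trace():
    # One measurement: a small echo buried in noise five times larger.
    return [ECHO_AMP * (i == ECHO_POS) + random.gauss(0.0, NOISE_STD)
            for i in range(N_POINTS)]

def average_traces(num_traces):
    # Add up many measurements and divide; uncorrelated noise shrinks
    # like 1/sqrt(num_traces) while the echo stays put.
    acc = [0.0] * N_POINTS
    for _ in range(num_traces):
        for i, v in enumerate(noisy_trace()):
            acc[i] += v
    return [a / num_traces for a in acc]

avg = average_traces(10_000)
# Residual noise std is ~ 1/sqrt(10000) = 0.01, so the 0.2-amplitude
# echo now towers above the background.
print(max(range(N_POINTS), key=lambda i: avg[i]))  # 50
```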

I got a Diablo daisy-wheel printer and wrote a program to send the printer proportionally spaced text. I used this approach to create the camera-ready pages of Rick Shoemaker’s and my first microcomputer book, Interfacing Microcomputers to the Real World. That book describes the state of microcomputing at the time in detail. I enhanced the print program to handle mathematical text in multiple fonts using algorithms like those in the SCROLL language. Another physicist, Mike Aronson, who had written the PMATE editor I was using, suggested that the input format should resemble real linearized math, as in the C language, rather than the Polish prefix format used in SCROLL. So I wrote a translator to accept a simplified linear format, the forerunner of the UnicodeMath we use in Microsoft Office apps today. The translator was coded so tightly in Z80 assembly language that it, along with the rest of the formatter, fit into 16 KB of ROM for a controller some friends of mine created for Diablo daisy-wheel printers. Those friends had also made the Z80 processor card in my IMSAI. When the printer was used with a tractor feed, it could print the whole document with one daisy wheel, roll the document back, print with the next daisy wheel, etc. It was positively wild watching the printer type the symbols into place after printing the main text.

IBM Personal Computer

In August 1981, IBM released a cool microcomputer that really surprised Rick and me. We figured that IBM wouldn’t get into microcomputing because it wouldn’t understand the market. IBM was into big machines, wrote its own software, supported a fancy sales force, had proprietary hardware, and didn’t collaborate with other companies. But the IBM PC was developed by a small independent group under Don Estridge at IBM’s Boca Raton, FL, site that espoused open architecture and non-proprietary components and software. It used a 16-bit Intel microprocessor, the 8088, which is an 8086 with an 8-bit data bus and a 20-bit address space instead of the microcomputer industry’s 16-bit address space, plus an optional 8087 floating-point coprocessor. IBM had invented the floppy disk, but the PC used Tandon disk drives, and the PCs were sold in major outlets like ComputerLand and Sears Roebuck. The operating system was Microsoft’s MS-DOS 1.0, which was patterned after the popular CP/M-80 microcomputer OS. The PC had Bill Gates’s 8K Basic interpreter stored in ROM in high memory. IBM documented the PC thoroughly as well. If you want lots of details, you can read Rick’s and my second microcomputer book, The IBM Personal Computer from the Inside Out, also “typeset” on my Diablo daisy-wheel printer. With the PC, IBM was considerably ahead of the competition from the TRS-80, Apple II, and other microcomputers. Thanks to IBM’s excellent documentation, competitors emerged. One that I liked a lot was the Victor 9000. Its floppy disks held 1.2 MB, compared to 360 KB on the IBM PC at the time. It had a cute cousin, the Apricot.
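The 8088’s 20-bit address space came from the 8086’s segment:offset scheme: a physical address is the 16-bit segment shifted left four bits plus the 16-bit offset. A quick sketch with illustrative values:

```python
def real_mode_address(segment: int, offset: int) -> int:
    """8086/8088 real mode: physical = segment * 16 + offset,
    masked to 20 bits, giving the 1 MB address space."""
    return ((segment << 4) + offset) & 0xFFFFF

# Segments start every 16 bytes, so many segment:offset pairs
# name the same physical byte:
print(hex(real_mode_address(0x1234, 0x0010)))  # 0x12350
print(hex(real_mode_address(0x1235, 0x0000)))  # 0x12350
```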

One of the many cool third-party IBM PC add-ons was the Hercules Graphics Card, which converted the 80-column by 25-row monochrome display card with 9×14 character cells into a 720×350 monochrome graphics card. Rick and Chris Koliopoulos copied the ROM Basic down into the upper 32K of the video space and modified it to support graphics.
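The Hercules resolution follows directly from the text-mode geometry, and the arithmetic also hints at why a 32K chunk of video memory was free for Basic (my back-of-the-envelope check, not anything from the card’s documentation):

```python
cols, rows = 80, 25          # text cells
cell_w, cell_h = 9, 14       # pixels per cell
width, height = cols * cell_w, rows * cell_h
print(width, height)         # 720 350

# Monochrome means 1 bit per pixel:
frame_bytes = width * height // 8
print(frame_bytes)           # 31500 -- one graphics page fits in 32 KB
```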

IBM extended its PC lead in August 1984 with the IBM PC/AT, which used the Intel 80286, a fully 16-bit processor with the ability to access up to 16 MB of memory in “protected mode”, considerably more than the 8088’s 1 MB address space. It took a full year for the competition to create personal computers as powerful. My IBM AT had a 10 MB hard drive, which was a great upgrade from floppy disks and more than a third the capacity of the IBM 1301 Model 1 disk drive used with some IBM 7094 computers. I also got an HP laser printer, which HP released in April 1984. Being a laser physicist, I naturally enhanced my PS Technical Word Processor to work with it. So much easier than using multiple passes with a Diablo daisy-wheel printer!

On August 2, 1985, Estridge and his wife died in a plane crash caused by a strong thunderstorm near Dallas, Texas. That tragedy was a turning point for IBM’s microcomputer successes. Subsequent PC releases didn’t keep up with the competition from Compaq and other companies, possibly due to worries that other IBM computer lines might not survive the internal competition. Steve Jobs was never afraid to cannibalize his products: “If you don’t cannibalize yourself, others will” was his philosophy. The IBM PS/2, released in April 1987, was no match for the competition.

The software industry standardized on Compaq 386s for a while. I used a Compaq 386 desktop computer and a Toshiba T5100 laptop to enhance my SST debugger to run in protected mode and access all of memory via the selector/offset memory model. In that way 80286 PCs, which were more prevalent than 386 PCs at the time, could access all their memory. That capability was key to getting Windows 3.0 to access all of memory and fend off OS/2. Rick and I updated our PC book in 1995 using Microsoft Word and renamed it The Personal Computer from the Inside Out. Since then we’ve resisted the temptation to write more about the incredible evolution of microcomputers. Windows 95, an updated version of Windows 3.1, could run in as little as 4 MB of memory, although 8 MB was recommended. Nowadays 8 GB or more is recommended for a Windows laptop! Of course, today’s laptops can do so much more than the microcomputers running Windows 95. I almost never used a mainframe after getting into microcomputers. The Data General Eclipse minicomputer was the biggest machine I used after 1975, and then only for a few years. The PCs had all the power I needed.

Author

Murray Sargent
Principal Software Engineer

Yale BS, MS, PhD in theoretical physics. Worked 22 years in laser theory & applications, first at Bell Labs and then as Professor of Optical Sciences at the University of Arizona. Worked on technical word processing, writing the first math display program (1969) and the technical word processor PS (1980s). Developed the SST debugger we used to get Windows 2.0 running in protected mode, thereby eliminating the 640 KB DOS barrier (1988). Have more than 100 refereed publications, 3 laser-physics books, 4 ...
