Unix Continues Its Quest for Wide Acceptance
By Erik Sandberg-Diment
The New York Times
January 14, 1986
Last year was supposed to be the year of Unix. So was the year before, and the year before that and the year before that. This operating system's search for acceptance in the world of personal computing has seemingly become as endless as the quest for the unicorn.
An operating system is the software that makes a computer run, that electronically ties everything - the monitor, the keyboard, the disk drives, the central processing unit and so on - together so that the pieces all work in concert as a system.
Nowadays most computer users, even casual ones, run across the initials DOS all the time. The acronym stands for ''disk operating system,'' a designation that came about in the early days of personal computers when the machines did not have a monitor or a keyboard. The operating system's most crucial function then was to control the operation of the microcomputer's disk drives and the flow of information to and from them.
Among the many operating systems today are MS-DOS, developed by Microsoft and adopted by I.B.M. under the name PC DOS; ProDOS, used by Apple; TRS-DOS, used by Tandy; UCSD p-System and so on.
But before DOS there was CP/M. For years CP/M was the standard operating system, and for years it dominated the eight-bit machines, those computers that deal with software in eight-bit words, that is, in groups of eight zeros and ones.
Skipping all the gossip concerning why CP/M was not adopted by the International Business Machines Corporation for its personal computers, the simple fact is that Big Blue chose MS-DOS. Even though the I.B.M. PC is only a quasi 16-bit machine, it does deal at least in part with 16-bit words, and MS-DOS became the standard for a whole new generation of larger-byte machines.
Today 32-bit personal computers are gathering in the wings. I do not foresee their actual arrival on stage until 1987 or 1988. The year they do make their entrance, however, may finally become the year of Unix. In the meantime, you will still see the name cropping up regularly, for followers of the Unix cult apparently cannot be deprogrammed.
Unix was developed for in-house programming at Bell Laboratories in the late 1960's and early 1970's. It actually predates the very concept of the personal computer. Yet the driving force behind the development of both personal computers and Unix was the same: the desire for one's own computer time.
The problem of the era was that there were large, expensive computers, huge staffs of programmers and still not enough time available on the machines for the users who needed them. The answer to that dilemma came in two varieties. One was the personal computer, which put a relatively affordable machine on the desk of everybody who needed one. The other was Unix, an operating system that allowed many users to do many tasks at once. This so-called multiuser, multitasking software changed the modus operandi of the computer from that of a queue-building monolith, in which all work was done in sequential fashion as time allowed, to a congress of cooperative activity in which everyone had a chance to do his or her thing at any time desired.
If, on a computer system equipped with Unix, you happened to be working on a spreadsheet while I was using a word processor to write this column, the word-processing program and your calculations would run at the same time. The CPU would quite literally sandwich a few steps of your calculations in between each of my keystrokes.
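That sandwiching can be sketched in a toy program - written here in Python, a modern language used purely for illustration, with invented names like typist and spreadsheet standing in for the two users. A round-robin scheduler gives each task one small slice of work in turn, which is the essence of the time-sharing trick:

```python
# Toy round-robin scheduler: interleaves two "tasks" the way a
# time-sharing CPU sandwiches one user's work between another's.

def typist(text):
    # Stands in for the word processor: one keystroke per time slice.
    for ch in text:
        yield ("keystroke", ch)

def spreadsheet(values):
    # Stands in for the spreadsheet: one addition per time slice.
    total = 0
    for v in values:
        total += v
        yield ("partial sum", total)

def round_robin(tasks):
    # Give each task one slice in turn until all are finished.
    trace = []
    while tasks:
        task = tasks.pop(0)
        try:
            trace.append(next(task))
            tasks.append(task)       # back of the queue for another turn
        except StopIteration:
            pass                     # this task is finished; drop it
    return trace

trace = round_robin([typist("Unix"), spreadsheet([10, 20, 30])])
# The trace alternates: keystroke, partial sum, keystroke, partial sum...
```

Real Unix schedulers are far more elaborate, of course, but the alternation in the trace is exactly the "few steps of your calculations between each of my keystrokes" described above.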
This sort of multiple processing requires both a lot of memory and high-density disks running on high-speed drives. But it is precisely these qualities with which personal computers are being endowed today.
Disk drive capacity is leaping upward, while memory chips are declining so rapidly in cost that manufacturers are almost having to give them away. The only element that seems to be missing in order for Unix to feel truly cozy in the personal computer world is a C.P.U. using a larger ''word.''
Word length is a crucial consideration in computer processing, because the number of bits a computer handles at a time limits the amount of memory it is able to address directly. An 8-bit machine can address 65,536 characters, or bytes, of memory (64K); a 16-bit machine, one million bytes (a megabyte); and a 32-bit machine, four billion bytes (four gigabytes) - at least in theory. In reality, various technical considerations reduce the accessible memory considerably. The I.B.M. PC's reach, for instance, is limited to 640K of directly accessible memory, which is really not a great quantity for a Unix system. A.T.&T.'s equivalent of the PC, the 7300, does run Unix, but that is only because the operating system happens to come from Brother Bell.
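Strictly speaking, it is the width of the machine's addresses, not its data word, that sets these ceilings: the classic 8-bit machines used 16-bit addresses (hence 64K), the I.B.M. PC's processor uses 20-bit addresses (hence one megabyte), and a true 32-bit machine can use 32-bit addresses (hence four gigabytes). The arithmetic is simply two raised to the number of address bits, as this small Python illustration (again, a modern language used only for the sum) shows:

```python
# Directly addressable memory doubles with every added address bit:
# it is 2 raised to the power of the address width, in bytes.
def addressable_bytes(address_bits):
    return 2 ** address_bits

assert addressable_bytes(16) == 65_536           # 64K: the 8-bit era
assert addressable_bytes(20) == 1_048_576        # one megabyte: the I.B.M. PC
assert addressable_bytes(32) == 4_294_967_296    # four gigabytes: 32-bit machines
```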
The extended memory boards you may have seen for sale do break the 640K barrier, but they do so by means of bank switching, which is a kind of computerized sleight of hand. It is like a game of three-card monte. There are several cards involved, but the computer only gets to look at one at a time.
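The three-card monte can be sketched in a few lines - once more in Python, purely as illustration, with the class name BankedMemory invented for the purpose. Several full banks of memory exist, but reads and writes only ever go through whichever one is currently selected:

```python
# Toy bank-switched memory: several 64K "cards" exist at once, but the
# processor's address space only shows one of them at a time.
class BankedMemory:
    def __init__(self, banks=4, bank_size=64 * 1024):
        self.banks = [bytearray(bank_size) for _ in range(banks)]
        self.current = 0                 # which card is face up

    def select(self, bank):
        # The sleight of hand: remap the address space to another bank.
        self.current = bank

    def read(self, addr):
        return self.banks[self.current][addr]

    def write(self, addr, value):
        self.banks[self.current][addr] = value

mem = BankedMemory()
mem.select(0); mem.write(0x100, 42)      # store a value in bank 0
mem.select(1); mem.write(0x100, 99)      # same address, different card
mem.select(0)
assert mem.read(0x100) == 42             # bank 0's value was never disturbed
```

The total memory is real, but at any instant the program can only see one bank's worth of it - which is why bank switching is a workaround rather than a true enlargement of the machine's reach.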
Once 32-bit machines have descended upon the personal computing world, however, Unix could become quite comfortably ensconced there, and that ''latest and greatest'' operating system would have finally made it. Already it is seeping into college and university environments. If, as can probably be safely predicted, I.B.M. decides to excuse Unix's origins and ends up embracing it as an operating system for its future 32-bit PC's, then what the heralds of this system have been proclaiming all along may finally materialize. Unix may also turn out to be the only viable toehold that A.T.&T. earns in the personal computer field.
Unfortunately, even were Unix to win I.B.M.'s favor, the operating system's future would not be assured. For the essence of personal computing, all the talk of multitasking, LAN's and time-sharing notwithstanding, is you and the computer, one on one.
Copyright 1986 The New York Times Company