Personal computers on campus

M. Mitchell Waldrop
Science

April 26, 1985

In 1978, the University of Washington's vice provost for computing, Robert Gillespie, bought a few of the new Commodore PET home computers and loaned them out to faculty members. "We just wanted them to try the machines for a while and see if there was anything they could be used for," recalls Gillespie, who is now a private consultant on computing in higher education. Then he laughs: "We never could get them to give the things back."

It was just a hint of what was to come. No one has made an accurate count of all the Apple II's, IBM Personal Computers, Macintoshes, and other such machines that have appeared on American campuses during the last few years (Gillespie estimates hundreds of thousands), but there is little doubt that academia has taken to the personal computer (PC) with exuberance. Indeed, it is striking how rapidly computers have spread beyond their traditional enclaves in science and engineering. Business students are using spreadsheets to do their homework. Historians are using filing programs to manage their bibliographies. Social scientists are using database management systems and statistical packages to reduce their data. Dance students are using Apple II graphics to notate choreography. Drama students are using IBM PC's to design stage sets and lighting plans. And everyone is processing words, from students doing term papers to assistant professors revising journal articles.

But the PC is more than just a box on a desktop. Together with a host of other information technologies, including online databases, optical data storage, digital telecommunications, and computer networks, the PC is transforming the campus into a microcosm of the much-heralded "information society." It is no accident that industry giants such as IBM and Digital Equipment Corporation (DEC) have invested tens of millions of dollars in university research projects, both to develop software for a new generation of high-powered personal computers--known generically as "the scholar's workstation"--and to find out what can be accomplished in a network of such machines.

In addition, the PC's have accelerated the on-going process of computer decentralization. From the 1960's, when the word "computer" meant a mainframe and an air-conditioned, professionally staffed computer center, through the 1970's, when the more affluent research groups started buying VAX's and other minicomputers to bring their data processing in house, and continuing in the 1980's, when the PC's started arriving, the control of computational resources has passed more and more into the hands of the users.

Finally, the PC and its associated information technologies seem to be triggering some profound changes in the educational process and in higher education as an institution--although no one is quite sure yet what those changes will be.

Of course, one could be forgiven a certain skepticism about all this. "Historically, these things go in waves of enthusiasm, followed by retrenchment when it turns out that the hardware is not quite ready," says Fred W. Weingarten, program manager for Communication and Information Technologies at the Office of Technology Assessment (OTA). He points out, for example, that the first attempt to provide universal access to computers was begun more than two decades ago at Dartmouth College, using time-sharing terminals tied into a central mainframe. In fact, the 1960's left a rich legacy for educational computing: Dartmouth president John Kemeny and mathematician Thomas Kurtz wrote the BASIC programming language, still widely used on PC's; in 1960 Donald L. Bitzer of the University of Illinois created PLATO, a mainframe-based teaching system now licensed to the Control Data Corporation and widely used in industrial training; and in 1964 a group of colleges and universities formed a consortium known as EDUCOM, which is now headquartered in Princeton and which continues to be active in such areas as computer training for faculty and intercampus computer networking. But in retrospect, says Weingarten, the enthusiasts of that era were ahead of their time. Mainframe computers proved to be too expensive for general use, and time-sharing was simply too cumbersome.

In the same way, he says, "some of the current 'front edge' things like the scholar's workstation may or may not pan out." On the other hand, he says, the PC does represent a much more flexible and inexpensive technology than the older mainframes. "So there is a lot of reality behind the current excitement, enough substance that the technology probably will cause some permanent changes."

Permanent or not, the PC has certainly created a challenge for the campus administrators and faculty members who must somehow make everything fit together, in an era when computer technology seems to change almost weekly. Responses have ranged from a cautious wait-and-see attitude to an exuberant high-tech activism. But in general the campuses have had to face three issues: access, management, and educational uses.

Access

More and more campuses have committed themselves to making personal computers both abundant and easy to use. Thus, an increasingly common sight in libraries and classroom buildings is the PC laboratory: a room full of row upon row of IBM PC's, Macintoshes, and the like, where students can go to do homework or term papers using software they have checked out from the front desk. Meanwhile, the computer manufacturers themselves are trying to get as many PC's on campus as possible, in the hope that a student who gets used to working with one brand of hardware in college will continue to be loyal to that product after graduation. (Also, as discussed below, the campuses are a fertile source of new software and a convenient test bed for new PC applications.) Among the first was Apple Computer, which announced its Apple University Consortium in January 1984. Students and faculty at the 24 member universities would be offered a 40 percent discount on the company's new Macintosh computer. The consortium members would also exchange educational software developed for the machine. Similar academic discounts were offered at about the same time by IBM, DEC, Zenith, and other computer makers.

A matter of some debate in the higher education community is whether to require incoming students to buy their own PC's. At least nine institutions now do so (1). On the one hand, computers have acquired a certain cachet among parents and students. In 1984, when the liberal arts college of Drew University in Madison, New Jersey, announced that all incoming freshmen would be provided with Epson QX-10's (paid for by an increase in tuition), applications for admission increased by 49 percent (2). With colleges and universities in contention for an ever-declining pool of 18-year-olds, this is not a factor to ignore.

On the other hand, the required purchase of a PC does add to the already high cost of education, even with discounts. Furthermore, a dance major may not need the same kind of computer as an engineering student, assuming that he or she needs one at all. And finally, the pace of technology is such that a student who buys a state-of-the-art PC as a freshman may be stuck with an obsolescent model by graduation.

"We argue that we should not require any student to buy a machine," says Richard L. Van Horn, chancellor of the University of Houston. "We provide access to the network if a student does buy a machine. But the literature on innovation suggests that if you just show people the technology, educate them to the opportunities, and give them access, then the technology transfer process goes much faster than it would have if you'd used coercion."

A second factor in the issue of access is networking: tying the machines into a web of electronic communications. Many departments have already installed local area networks of their own, routing messages and files through in-house minicomputer systems originally purchased for research. The same incentives that led them to do it now apply to the campus at large. Not only can students and faculty members on a network share expensive resources, such as laser printers and hard disks, they can use their desktop PC's to access remote databases and to communicate via electronic mail systems. For example, a researcher could revise a manuscript on his word processor, send it electronically to a collaborator across campus (or on another campus), and get comments back that afternoon. A professor who discovers he has made a mistake in a problem set could automatically send a correction to every student in his class. A student could send an incomplete essay from her computer at home to a central computer on campus; then when she has a free hour between classes, she could call up the essay again from a PC in the library and continue working on it.

As it happens, this vision of a campus-wide PC network comes just as the break-up of AT&T has resulted in rising prices for voice communications. The upshot is that more and more schools are installing advanced digital telephone systems with both voice and data capability; in effect, they are setting themselves up as their own telephone company. It is not a decision to be taken lightly, however. Aside from the expenses--Stanford's system, for example, will cost some $20 million--networking a campus means laying miles of cable through miles of trenches; running wires through the walls of historic buildings that were never made for it; getting all the PC's from all the different manufacturers to talk to the network; and then getting all the existing departmental networks to talk to each other.

Meanwhile, there is increasing need for a system that will tie the campuses together nationwide. A vocal proponent of this idea is Ira H. Fuchs, vice chancellor for University Systems at the City University of New York and president of EDUCOM. Since 1981 he has been spearheading the development of a multicampus system called BITNET, which now connects some 500 computers at 200 colleges and universities. The system is essentially a network of networks: to send an intercampus message, an individual user simply types it in on his or her office terminal, and sends it through the existing local network to a nearby computer that serves as a BITNET "node." The message is then routed cross-country to a BITNET node at the receiving end and from there through the local network to the addressee's terminal. If BITNET continues to grow as Fuchs hopes it will, the system will link every major campus in the United States and Canada and will become "a primary medium for interuniversity exchanges of information" (3).
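
As a rough illustration of this store-and-forward, network-of-networks idea, the sketch below (in Python) relays a message hop by hop between hypothetical campus machines, each of which knows only the next hop toward a destination. It is illustrative only; the node names, topology, and routing tables are invented, and nothing here describes BITNET's actual protocols.

```python
# Toy store-and-forward relay in the spirit of the "network of networks"
# described above. Node names, topology, and routing tables are hypothetical;
# this is not BITNET's actual protocol.

# Each machine needs only a table of next hops, not a map of the whole network.
NEXT_HOP = {
    "campus-a": {"campus-b": "relay-1"},   # local net hands off to its campus node
    "relay-1":  {"campus-b": "relay-2"},   # cross-country hop
    "relay-2":  {"campus-b": "campus-b"},  # node at the receiving end delivers locally
}

def send(source, destination, message):
    """Store the message at each hop and forward it toward the destination."""
    node = source
    while node != destination:
        next_hop = NEXT_HOP.get(node, {}).get(destination)
        if next_hop is None:
            print(f"{node}: no route to {destination}")
            return
        print(f"{node}: storing message, forwarding to {next_hop}")
        node = next_hop
    print(f"{destination}: delivering {message!r} to the addressee's terminal")

send("campus-a", "campus-b", "Draft manuscript attached; comments welcome.")
```

The appeal of such a scheme is that no single machine needs a picture of the whole network; adding a new campus means adding routing entries at its neighbors, which is how a network of networks can grow piecemeal.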

Meanwhile, the National Science Foundation (NSF) has started to develop an even more ambitious system to support its new supercomputer centers. (Supercomputers are very high-speed machines designed to do numeric calculations in such fields as astrophysics, fluid dynamics, and climatology.) The idea is to develop a nationwide, high-capacity network so that users can get data to and from the supercomputers without having to go to the centers in person (4,5). In the beginning, at least, this network will tie together such existing systems as BITNET and the Defense Department's ARPANET. "If you do it right, there's no reason the supercomputer system couldn't become a general purpose network linking every scientist in the United States with every other scientist," says NSF's Dennis Jennings, who is in charge of implementing the system. "And when that happens it will very quickly become a worldwide system, because some of our biggest user communities--the astrophysicists, the high-energy physicists, the climatologists--are multinational communities already."

One final aspect of the access issue is coherence. For example, if computers are to communicate on a network, they must all have the same format for coding data and they must all send that data at the same rate. At the very least they must be able to translate from one system to another. There is still no commonly accepted standard, however, and this has been a major obstacle in expanding networks beyond the one-department or one-campus level.

Operating systems are in a similar state. These are the software packages that supervise the flow of data within a computer, in much the same way that an office manager supervises the flow of work within a business. However, the IBM PC operating system is different from the Macintosh operating system, which in turn is different from the VAX operating system, and so on. An applications package written for one machine has to be totally rewritten before it will work on the next; establishing a common operating system would thus eliminate an enormous duplication of labor. The most popular candidate is UNIX, which was originally written for minicomputers at Bell Laboratories and which is already widely used in academic computing. Last summer the 15-member Inter-University Consortium for Educational Computing, led by Carnegie-Mellon University, urged Apple, DEC, and IBM to adopt UNIX officially as their common standard on future generations of PC's. John P. Crecine, Carnegie-Mellon's vice president for academic affairs and chairman of the consortium's governing board, later said he was "encouraged" by the meetings (6). But no promises were made.

Management

The first and foremost problem in managing the PC transition is cost. "Obviously, we're all trying to get foundation grants, manufacturer's discounts, and so forth," says Houston's Van Horn, who is presiding over a large campus (31,000 students) and an ambitious networking plan. "But a lot of it is our own money." Washington's Gillespie, who headed a 1981 study on computing and higher education for NSF (7), has estimated that the cost of fully equipping a 5000-student university would be approximately $32 million, or roughly the cost of a major new building. He estimated further that large universities will have to spend a total of $100 million to $200 million apiece for computers and related technologies during the next 10 years, and that even the small liberal arts colleges will have to spend a total of some $20 million to $30 million apiece during that same time.

Such sums put computerization in direct competition with other claimants to the institutional purse, such as scholarships, research, salaries, and buildings. Administrators may have to make some tough choices, especially at the smaller schools. Nor are the expenditures on computers a one-time thing. "A very, very tough problem, which nobody is really addressing yet, is the maintenance and operating cost of these networks," points out Van Horn. "You aren't going to get many grants to cover that. On the other hand, we're used to putting a lot of money into the maintenance of buildings, so we're just going to have to assign priorities and recognize that we have to do it here too."

Not far behind the question of money is the problem of managerial talent. In view of the complexity of the computerization process, more and more institutions are installing a computer "czar" at the provost or even vice chancellor level (8), whereas such matters are normally handled at the vice-provost level. "It was a mildly controversial move," says Van Horn, who last year named James Johnson, former director of information technology at the University of Iowa, as his vice chancellor for computing. "But this is a major area of investment, and we wanted a high-quality person and high-quality management at a very high level." In general, academics with the requisite managerial experience and computer expertise are very much in demand.

A more subtle strategic issue for managers is deciding what the campus computer system should be and how it should be run. The problem is that there have traditionally been two separate information services on campus: the libraries and the computer centers. The two often report to different people in the university hierarchy, they often do not talk to each other, and they often have little understanding of what the other is trying to do. There are librarians who do not understand the new technology, for example, or who can barely afford to sustain what they now have. And there are computer experts who think of things only in terms of hardware. Moreover, on many campuses these two centers of power are now being joined by a third, the telecommunications office. "The trick is to get them all talking to each other and talking the same language," says the OTA's Weingarten. "They're becoming blended into a single large information service on campus, and traditionally they haven't been."

Houston's Van Horn, for one, is a strong proponent of making the libraries the focus of computerization. Not only are they the traditional disseminators of information, he points out, but they are already deeply involved in such information technologies as on-line bibliographic databases and nationwide networking for interlibrary loans. "We see computing merging with the library in the future because that's the right environment," he says. "We're not very interested in teaching programming. The kind of model we should be teaching students is that we give them a problem, and they go draw upon a library of problem-solving software tools tied into databases. That makes for a very smooth transition into professional and business use. And this is the kind of thing that libraries understand: access to knowledge bases, access to tools."

Educational Uses of PC's

As Van Horn suggests, software does not have to be explicitly "educational" to be educationally important, at least not at the college level. Spreadsheets, for example, are fast becoming ubiquitous in the business world, making it important for business students to know how to use them. Law students likewise need to know about bibliographic searches. Psychology students need to know about statistics packages. And every student needs to know about word processing. (In fact, students often seem to write better with computers, simply because they find it easier to revise their work.)

The software vendors are beginning to pursue this academic market by offering massive student discounts, in much the same style as the PC hardware vendors (and with much the same motives). Framework, for example, an integrated word processing-spreadsheet-database package from Ashton-Tate that retails for $695, can be had for class adoption for $70; a modified version, in which the word processor, say, is limited to files of under ten pages, will sell for $19.95 as an adjunct to texts such as Paul A. Samuelson's Economics.

Meanwhile, projects are underway to develop more specialized educational software for PC's. Illinois' PLATO, for example, has recently been adapted for microcomputers and is being intensively marketed. And hundreds of faculty members around the country are writing their own programs. To consider just a few examples from the general category of simulation:

1) At Stanford, historian Carolyn Lougee, associate dean of humanities and sciences, has written a role-playing program for her class on Louis XIV. Running on an Apple Macintosh, it resembles such venerable computer games as ADVENTURE and the role-playing games used for many years in business schools. The student begins by taking the role of a modestly well-off young Frenchman in 1638, and from there tries to win as much wealth and prestige as possible. The student collects rents and grain from the tenant farmers on his land, tries to buy an advantageous position at court (25,000 livres will make him secretary to the king), tries to get in with the right clique at court, and woos various maidens. (The size of her dowry is important, but she must be young enough to produce an heir.) From all reports, the game gives students a vivid and personal understanding of another social milieu. It has also brought out some startling behaviors: more than one student has plotted to have his peasants starve so as to drive up the price of his grain.

2) At Carnegie-Mellon, students in the psychology department's methods course begin by planning a research design. Then they use an IBM PC to generate sets of simulated "experimental data," analyze those data on the IBM with the appropriate statistical tools, and then refine the design and try again (a sketch of this simulate-and-analyze loop follows these examples). In this way the students can get practice in the subject of the course, methodology. Performing a real experiment would take the better part of a semester--for one data set.

3) Stanford physicist Blas Cabrera has written a number of Macintosh simulations for first-year physics students. Students often have trouble linking the formal equations they learn in elementary physics classes with the behavior of objects in the real world (9). Cabrera's programs allow them to vary parameters, experiment with different situations, and get a vivid feel for how mechanics really works. In the same vein, Carnegie-Mellon's James Hoburg has developed simulations for more advanced concepts in electrostatics and magnetostatics. Using high-resolution color graphics, students plot the potential surfaces and field lines around a wide variety of conducting or insulating surfaces. For a course in control theory, Hoburg has developed a graphics package that allows students to arrange the components of, say, a power plant, and then plot how the system responds to various driving forces. In a simple demonstration of the program, the screen shows an imaginary cart that can move about the floor in any direction. On top of the cart is a broom handle standing on end; the task is to build in a feedback response that will always keep the broom handle balanced upright (a sketch of such a feedback loop also follows these examples).
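
To make the simulate-and-analyze loop in the Carnegie-Mellon example concrete, here is a minimal sketch in Python. The two-group design, the assumed 5-point effect, the sample sizes, and the use of a t statistic are illustrative assumptions, not details of the actual course software.

```python
# Minimal sketch of the simulate-and-analyze loop described above: generate
# simulated "experimental data" under an assumed design, then analyze it with
# a standard statistical test. All numbers below are illustrative assumptions.
import math
import random
import statistics

random.seed(1)

def simulate_group(n, mean, sd):
    """Simulated scores for one experimental condition."""
    return [random.gauss(mean, sd) for _ in range(n)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

control = simulate_group(n=20, mean=50.0, sd=10.0)
treatment = simulate_group(n=20, mean=55.0, sd=10.0)  # assumed 5-point effect

print("t =", round(welch_t(treatment, control), 2))
# If the design looks too weak to detect the assumed effect, the student
# enlarges the sample or sharpens the design and simulates again.
```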
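
In the same spirit, the broom-balancing demonstration can be sketched as a feedback loop in which the cart's acceleration is computed from the broom's tilt and tilt rate. The linearized dynamics and the particular gains below are assumptions made for illustration; they are not Hoburg's actual package.

```python
# Sketch of broom-on-a-cart balancing: command the cart's acceleration from the
# broom's tilt so the broom stays upright. Linearized dynamics; gains assumed.
G = 9.8              # gravity, m/s^2
L = 1.0              # broom length, m
KP, KD = 40.0, 8.0   # assumed proportional and derivative feedback gains
DT = 0.01            # integration time step, s

theta, omega = 0.2, 0.0   # initial tilt (rad) and tilt rate (rad/s)
for step in range(501):
    if step % 100 == 0:
        print(f"t = {step * DT:3.1f} s   tilt = {theta:+.3f} rad")
    accel = KP * theta + KD * omega    # feedback: push the cart under the broom
    alpha = (G * theta - accel) / L    # linearized tilt acceleration
    omega += alpha * DT
    theta += omega * DT
# With these gains the tilt decays back toward vertical; set KP = KD = 0 and
# the same loop shows the broom falling over.
```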

Despite the help and encouragement being given to authors by many campus administrators, educational software is still a cottage industry, roughly equivalent to a professor's writing out a set of xeroxed notes for a course that does not have an adequate textbook. On the other hand, publishers are becoming very interested in academic software. "The notion of writing a program for your course will not be prevalent in 5 years," says Van Horn. "You'll have whole libraries of these programs, and it will look just like the textbook industry. And that's the right way to go, because then the stuff will be professionally documented, maintained, and upgraded."

The prospect of publication of software does raise a critical policy question, however. Traditionally, when a faculty member writes a textbook during working hours, he or she retains the copyright. And when a faculty member discovers something patentable in the laboratory, the rights usually go to the institution. But neither tradition quite fits when a faculty member writes a piece of software. So who owns it, the author or the institution? The question is complicated by the fact that textbooks almost never make money, whereas a good educational program might make a great deal of money. People are just beginning to wrestle with this issue, with no consensus yet in sight.

The "Scholar's Workstation"

While most campuses have been focusing their energies on the existing generation of PC's, at least three schools--the Massachusetts Institute of Technology (MIT), Carnegie-Mellon, and Brown University--have also gotten deeply involved in defining a new generation. The early 1980's has been described as a "window of opportunity" for the campuses to have a say in how the technology would develop (10). First, a new generation of 32-bit microprocessors was promising PC's with capabilities approaching those of today's large minicomputers and medium-sized mainframes. Second, advances in networking technology were making it feasible to consider linking together several thousand workstations. And finally, the manufacturers were beginning to recognize the importance of the academic market, which gave them a powerful incentive to undertake joint research.

Although each of these three efforts arose independently, the schools have arrived at a remarkably similar vision of what they need, as evidenced by the fact that the term "scholar's workstation," which originated at Brown, is now widely used as a generic term. The vision is sometimes described as a "3-M" machine: that is, a PC with a memory capacity of 1 million bytes, a display screen of 1 million pixels (1000 by 1000), and an operating speed of 1 million instructions per second, communicating through a network at millions of bits per second. For the user, the most obvious feature of such a machine would be its Macintosh-style graphics--on a screen about four times the area of the 9-inch Macintosh--and its ability to divide the screen up into different regions, or "windows," with word processing, say, running in one window while a PASCAL program runs in another.

None of the three schools is working on hardware development as such; however, they are working closely with IBM, DEC, Apple, and other manufacturers, with every expectation that within the next 2 or 3 years these firms will be bringing out commercial versions of a scholar's workstation at a price of roughly $3000. (Actually, such workstations are already commercially available from SUN Microsystems, Apollo, and other manufacturers, although they currently cost many tens of thousands of dollars; on the other hand, they are very useful for development work.)

Although the three programs are all headed in basically the same direction, each does have its own particular emphasis. Carnegie-Mellon, perhaps the most computer-conscious campus in the country, got the earliest start in late 1982 (11). Its Information Technology Center, with financial support from IBM, is developing file systems, networking protocols, and a user interface; the goal is to put an integrated, campus-wide computer environment in place and to have some version of the scholar's workstation available for students in the fall of 1986. Meanwhile, on 1 January, Carnegie-Mellon inaugurated the Center for the Design of Educational Computing under the direction of cognitive psychologist Jill Larkin. The idea is, first, to help keep the larger development effort in contact with real educational needs and, second, to use the university's strength in cognitive psychology and artificial intelligence to help put educational software design on a sounder scientific footing.

Brown University, which is more of a liberal arts institution than either Carnegie-Mellon or MIT, started its Institute for Research in Information and Scholarship (IRIS) in June 1983 with support from Apple, IBM, and the Annenberg/Corporation for Public Broadcasting Project. The main emphasis at IRIS is defining what the scholar's workstation itself ought to be. The IRIS researchers are also designing advanced, graphics-based software tools for education and research; examples include a word processor for Greek, neo-Cyrillic, and other non-Roman text, and the hypertext/hypervideo system discussed below.

Massachusetts Institute of Technology, which in many ways has the most ambitious program, began its Project Athena in mid-1983 (12). Project Athena is not too concerned with the scholar's workstation as such; instead, it is a vast experiment to see what could be done with a network of workstations once they are available. To perform the experiment MIT is giving students and researchers access to advanced VAX terminals contributed by DEC and advanced PC's supplied by IBM (the new IBM PC-AT's). Over the 5-year term of the project, the total contribution of equipment and maintenance will be worth some $50 million. Meanwhile, MIT itself is raising $20 million to finance the development of educational software by faculty members (13).

These projects have raised a few academic eyebrows, especially at Brown and Carnegie-Mellon. In both cases, the contracts with IBM give university researchers access to workstation prototypes and other proprietary technology, which they must then keep secret. Officials at both institutions do point out that their own work is open. (Carnegie-Mellon certainly had no problem in showing a reporter its prototype software, which was running on a SUN workstation.) Moreover, they maintain that there is no commitment to buy hardware in the future from IBM or from anyone else. However, there are obvious concerns that requiring researchers to keep some things secret will interrupt the free flow of information (14). MIT, for example, decided as a matter of principle that it would not get involved with proprietary information. "The price is that we don't have certain hardware," says Project Athena director Steven R. Lerman. "The benefit is that we remain part of the university."

Advanced PC's and Education

The scholar's workstation, when and if it is developed, will not be a single entity. Maurice Glicksman, provost and dean of the faculty at Brown, envisions several different kinds of machines at different prices. "At the high end would be a workstation with high-resolution color graphics capability," he says. "It would be especially useful in the arts, in architecture, and in engineering design. Another type would be essentially equivalent to an Apollo or SUN workstation (black characters and graphics on a white screen). This would be used by many of the faculty members for files, for communication, for data and text analysis. And at the low end, there would be a portable, affordable machine for students. But it would have most of the same capabilities as the others.

"Of course," he adds, "we cannot guarantee that such a family of machines will exist. It depends on whether the vendors see a market."

Assuming that the scholar's workstation does come to pass, however, how will it make a difference in education and research? Clearly, more powerful machines will be able to do a better and better job on the kinds of applications already mentioned--more complex simulations, for example, faster simulations, better graphics, word processing with multiple typefaces, automatic hyphenation and justification of text, and so on. But advanced hardware also begins to make it feasible to do some qualitatively new things. To mention just three possibilities:

1) "Hypertext" and "hypermedia" are dynamic cross-referencing systems. In hypertext, the reader starts onscreen with an original document that has footnotes and other annotations. At a stroke, the reader can then call up the complete text of any given reference, then call up references to that text, and so on, all the while adding links to new annotations and references of his own. In principle, hypertext could build a web of linkages throughout an entire (electronic) library. The concept of hypertext was originated in the late 1960's by computer consultant Theodore Nelson, who was then a researcher at Brown; an experimental version has since been implemented within the PLATO system by Donald Thursh of the University of Illinois's School of Clinical Medicine. Meanwhile, the IRIS group at Brown is extending the idea to hypermedia, which would cross-link graphics, videotape, photographs, sound recordings, and other nontext materials. IRIS also plans to develop the hypertext/hypermedia system as a framework for designing educational software. For example, a faculty member using the system would be able to browse through a catalog of predefined text and media components, customize special templates, define new functions, and indicate methods of student interaction. IRIS is currently developing a prototype hypertext/hypermedia system on the Macintosh, although a complete implementation will have to wait for the full-scale scholar's workstation. (A sketch of the kind of document-and-link structure involved follows these examples.)

2) An intelligent word processing program for the scholar's workstation would be analogous to the EPISTLE text-critiquing system that has been under development at IBM since 1980 (15) or to the SERGEANT and PREWRITE programs being developed by MIT's James G. Paradis for Project Athena. Such a system would flag misspellings, punctuation errors, or grammatical errors, then suggest alternatives and even provide an optional tutorial on the subject. By means of more advanced artificial intelligence techniques, it is conceivable that the system could also flag jargon, redundancy, and verbosity (a sketch of such surface checks also follows these examples). This would give students (and professors) immediate and powerful feedback for improving their writing. Moreover, it would allow instructors to spend their time and mental energy on more substantive issues, such as the students' content, organization, and style.

3) Finally, there is the possibility of intelligent tutoring machines. The idea is to create systems that could tutor students directly and assist them in solving problems. Such machines would have to incorporate both artificial intelligence techniques and cognitive psychology; presumably they would have some understanding of natural language, the human learning process, and effective pedagogy. (For example, a wise tutor will sometimes let a student discover his own mistakes.) Prototypes have been demonstrated at the Xerox Palo Alto Research Center and at several other laboratories (16).
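
As a rough illustration of the document-and-link structure that hypertext implies, the sketch below represents documents as nodes carrying named links that a reader can both follow and extend. The class and example texts are invented for the illustration and do not reflect the IRIS or PLATO designs.

```python
# Minimal sketch of a hypertext node: a document with named links that the
# reader can follow and add to. Names and example texts are illustrative only.
class Node:
    def __init__(self, title, text):
        self.title = title
        self.text = text
        self.links = {}          # label -> Node

    def link(self, label, other):
        """Attach a cross-reference that a reader can later follow."""
        self.links[label] = other

    def follow(self, label):
        return self.links[label]

essay = Node("Essay on the court of Louis XIV", "The court at Versailles ...")
source = Node("Cited memoir", "Full text of the cited passage ...")
comment = Node("Reader's note", "Compare this account with the lecture notes.")

essay.link("footnote 1", source)      # annotation supplied with the document
source.link("my comment", comment)    # link added later by the reader

print(essay.follow("footnote 1").follow("my comment").text)
```

The essential point of the structure is that the links are themselves data the reader can add to, so the web of cross-references grows with use instead of being fixed by the original author.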
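
And as a rough illustration of the surface checks such a critiquing program might make, the sketch below flags words missing from a known-word list, doubled words, and wordy phrases with suggested replacements. The word list and phrase table are invented for the example; this is not how EPISTLE or the Project Athena programs actually work.

```python
# Sketch of simple text-critiquing checks: possible misspellings, doubled
# words, and wordy phrases. The word list and phrase table are illustrative.
import re

KNOWN_WORDS = {"the", "committee", "met", "to", "discuss", "a", "in",
               "order", "budget", "proposal", "new"}
WORDY = {"in order to": "to", "at this point in time": "now"}

def critique(text):
    remarks = []
    for phrase, better in WORDY.items():
        if phrase in text.lower():
            remarks.append(f"wordy: '{phrase}' -> consider '{better}'")
    words = re.findall(r"[a-z']+", text.lower())
    for prev, word in zip(words, words[1:]):
        if prev == word:
            remarks.append(f"doubled word: '{word}'")
    for word in words:
        if word not in KNOWN_WORDS:
            remarks.append(f"possible misspelling: '{word}'")
    return remarks

for remark in critique("The committee met in order to discuss the the budjet proposal."):
    print(remark)
```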

Long-Term Implications

People have only recently begun to take a serious look at the long-term implications of campus computerization. In fact, many schools have established committees on the subject. Perhaps not surprisingly, however, they have so far generated many more questions than answers.

For example, computers seem to allow students and professors alike to make much more productive use of their time. What are the implications for faculty salaries and for the length of time it takes a student to get a degree?

Computer networks offer electronic mail, a new medium of communication that makes it easier to collaborate with researchers in other fields and on other campuses. How much will that affect the traditional segmentation of the campus into discipline-oriented departments?

Computer networks and the widespread availability of more powerful PC's also mean that interactive course work can be delivered to people in their homes and in their offices. Will this be just another kind of university extension service? Or will it undermine the institutional structure of colleges and universities by providing alternatives to higher education?

Desktop computers are convenient, accessible, and even fun to use. They are certainly more convenient than the big mainframes and supercomputers, where time has to be shared. So to what extent will PC's subtly warp research if investigators, perhaps unconsciously, begin to pick problems that they can do in their office?

Perhaps most important, how will computers affect the way people in the campus community interact with each other? "That's the essential issue," says Brown sociologist Mark Shields, who is studying computer impacts for the IRIS project. "Will technology divide us or unite us?"

This seems to be the aspect of the PC that has worried people the most. It may be efficient to put the library's card catalog on line, for example, and to have people pull text into their desktop terminals over the network. But campus libraries are also where students can meet and talk and take study breaks together; how much of that social function will be sacrificed to efficiency, and what will take its place? Meanwhile, what will happen to student-teacher relationships when computerized tutoring becomes commonplace? What will happen to friendships if people spend more and more time typing electronic messages to each other and less and less time talking face to face?

"It's impossible to make any global judgments about these issues," says Shields. "For one thing, we don't really know what the consequences are going to be. For another, what's good and what's bad depends on who you ask. And finally, some benefits and some drawbacks are so intertwined that there is no way to disentangle them."

But Shields himself is an optimist about the PC. "I think that, as people get used to the computer as a medium, the thing itself will become far less important than what they are using it for," he says. "Communication, research, study--it's a question of experience. You start finding that the computer is compatible with a whole variety of human needs and interests. It's a tool that people can use in any way they want."

COPYRIGHT 1985 American Association for the Advancement of Science