
COMPUTING

Computers and computer networks have changed the way in which people work, play, do business, run organizations and countries, and interact with one another on a personal level. The workplace of the early twentieth century was full of paper, pens, and typewriters. The office of the early twenty-first century is a place of glowing monitor screens, keyboards, mice, scanners, digital cameras, printers, and speech recognition equipment. The office is no longer isolated; it is linked by computer networks to others like it around the world. Computers have had such an effect that some say an information revolution is occurring. This revolution may be as important as the printing revolution of the fifteenth century, the industrial revolution of the nineteenth century, or the agricultural revolutions of the ancient and medieval worlds.

The computer was invented to perform mathematical calculations. It has become a tool for communication, for artistic expression, and for managing the store of human knowledge. Text, photographs, sounds, or moving pictures can all be recorded in the digital form used by computers, so print, photographic, and electronic media are becoming increasingly indistinguishable. As Tim Berners-Lee (1998), developer of the World Wide Web, put it, computers and their networks promise to become the primary medium in which people work and play and socialize, and hopefully, they will also help people understand their world and each other better.

During the last half of the twentieth century, electronic digital computers revolutionized business, learning, and recreation. Computers are now used in newspaper, magazine, and book publishing, and in radio, film, and television production. They guide and operate unmanned space probes, control the flow of telecommunications, and help people manage energy and other resources. They are used to categorize and preserve the store of human knowledge in libraries, archives, and museums. Computer chips called "embedded microprocessors" are found in the control systems of aircraft, automobiles, trains, telephones, medical diagnostic equipment, kitchen utensils, and farm equipment. The effect on society has been so great that digital information itself is now exchanged more rapidly and more extensively than the commodities or manufactured goods it was originally supposed to help manage. Information has become an essential commodity and, some would argue, a necessary social good.

The history of computing is several stories combined. One is a hardware story—a tale of inventions and technologies. Another is a software story—a tale of the operating systems that enabled specific computers to carry out their basic functions and the applications programs designed to deliver services to computer users. A third story tells how computers provide answers to the problems of society, and how they in turn create new possibilities for society.

Computers and the Media

The computer has transformed print journalism and magazine and book production, changing the ways in which stories are researched, written, transmitted to publishers, typeset, and printed. Through computing and telecommunications, a news story breaking in Asia can be sent within seconds to North America, along with digital pictures. Word-processing software and more sophisticated desktop publishing programs allow authors to create and revise documents easily and to check them for spelling, grammar, and readability.

Copies of digital documents can be printed on demand, and because computers check for transmission errors, all the copies will be identical. While the first word-processing programs offered little more than typewriter-style characters, the introduction of graphical user interfaces (GUIs) in the 1980s and 1990s opened new design possibilities. Writers could choose from a variety of type fonts, select different page layouts, and include photographs and charts. Some feared that this might eliminate jobs since tasks performed by authors, editors, typesetters, proofreaders, graphic designers, and layout artists could all be performed by one person with a computer.

Laptop or notebook computers gave writers even more flexibility. A reporter on location could compose a story and transmit it immediately to a newspaper (using a modem and a hotel room telephone) on the other side of the globe and, perhaps, to wire news services such as The Associated Press or the Reuters news agency. Satellite uplinks, cellular phones, and infrared "beaming" between machines provide even more possibilities. Moreover, digital photography eliminates the time taken to develop photographs, and digital pictures can be transmitted as easily as text.

Computers have revolutionized radio, television, and film production as well. Computerized camera switching and special-effects generators, electronic music synthesizers, photographic exposure control, and digital radio and television programming are all examples. Computer graphics can be used to superimpose sports statistics over a picture of a game in progress or allow a commentator to explain a key play by drawing a diagram over a television picture. Computers have made it possible to produce the entire programming lineup of a radio station without relying on tape recorders except for archival materials or for recordings made in the field.

Digital sound editing can eliminate noise, mix voice and music, and give producers second-by-second precision in the assembly of programs. Computerized film processing can provide better quality images or allow images to be converted from color to black-and-white and vice versa. While movie animation has traditionally involved photographing thousands of separately drawn pictures or "cels," computer animation can use fewer drawings and produce thousands of variations. Special effects are much more convincing when the computer handles the lighting, perspective, and movement within the movie scene.

Speech recognition and dictating software can convert voice recordings directly to word-processed text, and translation programs can then rewrite the word-processed text into another human language. Musicians can compose new works at a computer keyboard and create a printed score from the finished version.

Even when an organization's primary medium is print, radio, or television, it has become common to provide more in-depth coverage on an associated website. While some radio and television networks simultaneously broadcast and webcast their programming, perhaps the most powerful potential will be found in ever-growing digital archives. Using search engines and, increasingly, programs called "intelligent agents," users can retrieve items from the archives, print fresh copies, or compare different accounts of the same event.

Most young people probably first use a computer for entertainment. Individual- and multiple-player games, online "chat" rooms, newsgroups, electronic mailing lists, and websites provide computer-mediated education and leisure activities that were never possible before.

At first, computer programmers wrote games to amuse themselves. The classic "dungeons and dragons" style game Adventure, written by Will Crowther and expanded by Don Woods, was a favorite. Players gave commands such as "go left" or "take lamp," and the computer printed replies such as "OK." There were no pictures. Simple games that used graphics, with names such as Pong and Pac-Man, became available during the 1970s and early 1980s. As personal computers and handheld games became practical to produce, an entire electronic games industry was born. Nintendo and Sega are two familiar games companies. Computerized video games and lottery ticket machines soon became such popular attractions in shopping malls and corner stores that critics began to warn that they might become addictive.

Research Databases

Computing has changed the way writers research and prepare scientific articles. During the early 1970s, a small number of databases containing "abstracts" (i.e., summaries of scholarly and popular articles) could be searched offline. Users submitted lists of subjects or phrases on coding forms. Keypunchers typed them onto computer cards, and operators processed them on mainframe computers. The answers would be available the next day. Library catalogs were printed on paper cards or produced as computer output microform (COM), such as microfiche. A microfiche is a transparent plastic sheet, roughly the size of an ordinary index card, that contains images of many pages of computer output.

The Library of Congress, and national libraries in other countries, had by this time converted most of the descriptions of the books they owned into machine-readable form. Toward the end of the 1970s, research databases and library catalogs were becoming widely available online. The Dialog database, and library services such as the Online Computer Library Center (OCLC), made it possible to search the contents of many journals or the holdings of many libraries at once. Standards such as the Machine-Readable Cataloging format (MARC) made it possible to exchange this information worldwide and to display it on many different types of computers. However, limits on computer disk space, telecommunications capacities, and computer processing power still made it impractical to store the full text of articles.

Because of the costs, researchers working for large institutions were the main users of these services. By the mid-1980s, when microcomputer workstations became widely available and compact disc read-only memory (CD-ROM) became a practical distribution method, much research could be conducted without connecting to large central databases. Vendors such as EBSCO and the publishers of InfoTrac began licensing databases on CD-ROM to their subscribers. With better magnetic "hard" disks and faster microcomputer chips, full-text storage and retrieval finally became workable.

By the end of the twentieth century, databases and catalogs could be accessed over the Internet, on CD-ROM, or through dial-up connections. Some of the special databases include ERIC (for educational issues), Medline and Grateful Med (for medical issues), and Inspec (for engineering issues). Legal research was simplified by services such as Lexis and Westlaw, which allowed identification and cross-referencing of U.S. and international statute and case law. In one of the more interesting applications of computing technology, the Institute for Scientific Information in Philadelphia, Pennsylvania, introduced its citation indexing services, which allow researchers to discover important authors and issues by revealing which authors cite one another. Some databases are free of charge, and some are available for a fee.

A researcher at a public library, in a television newsroom, or in a medical practice can perform searches against thousands of special databases and millions of sites on the World Wide Web. While this sort of research was possible with printed directories in the past, it was time consuming and labor intensive. However, searching for data electronically can have unexpected results. Because the computer does not really understand what the string of letters "Jim Smith" means, it will faithfully report any occurrence it finds, regardless of the context. Information retrieval theory and informetrics are two fields that study the implications.
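
The point can be shown in a few lines of code. In the hypothetical Python sketch below, a literal search for "Jim Smith" reports every document that contains that string of letters, including one about a different person entirely:

```python
# A hedged sketch (not from the article) of literal string matching:
# the program reports every occurrence of "Jim Smith", with no sense
# of which Jim Smith the researcher actually means.
documents = [
    "Jim Smith, the senator, spoke at the budget hearing.",
    "Blues guitarist Jim Smith released three albums in the 1970s.",
    "Jim Smithson donated the museum's founding gift.",  # partial match slips in too
]

query = "Jim Smith"
hits = [doc for doc in documents if query in doc]

for doc in hits:
    print(doc)  # all three documents are reported, regardless of context
```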

The Computer Industry

In the late 1960s, some writers scoffed at the potential of computers. The mainframe machines of the time occupied entire rooms, and only large institutions could afford them. No computer ever conceived, suggested one writer, had ever weighed less than a human being or been capable of performing as many tasks.

Without the transistor and the integrated circuit, computers would still fill large rooms. Without the laser and improved plastics, optical storage media such as CD-ROMs and digital versatile discs (DVDs) would not be possible. Magnetic tapes and disks have also improved greatly over the years and can now store much more information than they could in the past. It is difficult to buy an item in the supermarket or to borrow a book from a library without that item having a barcode label on it. Credit and debit cards with magnetic strips make it easier to access bank accounts and make retail purchases. Inventions such as these are part of the story of computing, although they are often overlooked.

For example, a minicomputer of the 1970s could cost about $500,000 and might contain only 64 kilobytes (KB) of random access memory (RAM). By the end of the century, a magnetic floppy disk holding 1.4 megabytes (MB) sold for less than a dollar, a CD-ROM disc that held 650 MB cost less than two dollars, and desktop microcomputers with 64 MB of RAM were common household items.

As the industry grew, so did the legends of inventors who made fortunes or revolutionized the industry. William R. Hewlett and David Packard started their company in a garage. Graduate students David Filo and Jerry Yang developed the Yahoo! Internet directory in a dormitory room. Steve Jobs of Apple Computer, Bill Gates of Microsoft, and the heads of many other companies in California's Silicon Valley became known around the world.

Computer engineers and programmers have often exchanged their ideas openly, out of scientific duty. Researchers at the Xerox Corporation developed an early graphical user interface and mouse and then shared the ideas widely. Linus Torvalds developed the Linux operating system as a personal project and then made it available for free. Universities also have a long history of developing software and computers and then sharing the knowledge.

The History of Computers

While electronic digital computers are a relatively recent invention, mechanical aids to calculation have existed for thousands of years. The abacus, sometimes considered to be a computer, was used in medieval China and by the Aztecs of Central America, and earlier "counting boards" were found in ancient Babylon. The slide rule, an analog device, continues to have a following because some engineers still prefer it to electronic calculators. Circular slide rules, called "dead-reckoning computers," were used by aircraft pilots well into the 1970s to perform navigational tasks.

During the Middle Ages, the Franciscan scholar Ramon Llull used circular disks that had letters and numbers (representing terms from philosophy) written on them. By turning the wheels, Llull could come up with new combinations of concepts. Llull's work continued to influence logicians. Gottfried Wilhelm Leibniz made it the topic of a treatise, Dissertatio de arte combinatoria, in 1666.

During the industrial revolution, mass-production devices such as the Jacquard loom became common. Designs to be woven into cloth could be punched onto the cards that controlled the loom. Charles Babbage, working with Lady Ada Lovelace in the early nineteenth century, first thought of using punched cards to do mathematics. Their Analytical Engine was designed to weave numbers into tables the way the loom wove cloth from strands of thread. The modern Ada computer language commemorates Lovelace's work. Toward the end of the nineteenth century, Herman Hollerith, whose tabulating machine company later became part of International Business Machines (IBM), developed the punched cards used in early digital computers.

In a 1936 paper, "On Computable Numbers," the British mathematician Alan Turing first suggested the idea of a general-purpose computing machine. With electronic digital computers, Turing's idea became realizable. Turing and the Hungarian-American mathematician John von Neumann are two of the many pioneers of digital computing. During World War II, Turing helped design the Bombe, an electromechanical machine used at Bletchley Park to break the "Enigma" cipher—a secret code used by Germany. Colossus, an electronic machine built there later, attacked another German cipher. Turing also proposed the famous "Turing test" for artificial intelligence. The Turing test suggests that if a person cannot tell the difference between responses from a computer and responses from a human, then the computer must be considered to be "intelligent."

Grace Hopper works on a 1944 manual tape punch, which was an early computer. (Bettmann/Corbis)

The earliest large computers, including the electromechanical Harvard Mark I and vacuum-tube machines such as the ENIAC, were huge, expensive, and apt to fail or "crash." Grace Hopper once repaired the U.S. Navy's Mark II computer by removing a moth from its circuitry. The term "debugging" is often associated with this incident.

The transistor made it possible to produce computers in quantity. However, mainframe computers such as the IBM System/370 were still huge by modern standards, and only universities, government agencies, or large companies could afford them. During the 1960s and 1970s, with integrated circuits, a new generation of minicomputers was born. Digital Equipment Corporation (later acquired by Compaq), Hewlett-Packard, and Data General were some of the key manufacturers. These machines were about the size of a refrigerator.

By the end of the 1970s, desktop microcomputers began appearing in smaller offices and in ordinary people's homes. Beginning with machines such as the Apple II and followed by the Osborne, the IBM PC, and the Commodore 64, microcomputers and their software systems came to dominate the market. These machines used microprocessor chips—room-sized central processing units shrunk to less than the size of a penny. The Intel 8080 and the Motorola 6800 were among the very first such chips, appearing in the mid-1970s. Many programmers joked about these new "toys." During the next decade, microcomputers would grow into powerful workstations—powered by chips from Intel and Motorola and built by companies such as Sun Microsystems, IBM, Apple, Dell, Toshiba, Sony, and Gateway, to name just a few.

Digital Information

Computing involves three activities: input, process, and output. Data enters the computer through a keyboard or mouse, from a camera, or from a file previously recorded on a disk. A program or "process" manipulates the data and then outputs it to a screen, printer, disk, or communications line.
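
The cycle can be illustrated with a short, purely illustrative Python sketch that reads a line from the keyboard, processes it by counting the words, and writes the result to the screen:

```python
# A minimal sketch of the input-process-output cycle described above.
def process(text: str) -> str:
    """The 'process' step: count the words in the input."""
    return f"{len(text.split())} words"

user_input = input("Enter a sentence: ")  # input (keyboard)
result = process(user_input)              # process (a program run by the CPU)
print(result)                             # output (screen)
```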

Over the years, many different input devices have been used, including punched paper tape, punched cards, keyboards, mice, microphones, touch-screens, and video cameras. Output devices have included paper printouts, teletypewriters, and video monitors. The part of the computer that does the processing is known as the central processing unit (CPU). Collectively, everything other than the CPU, including memory boards, disks, printers, keyboards, mice, and screens can be thought of as peripheral devices, or just "peripherals."

There are two sorts of computer software. Operating systems, such as Microsoft Windows, the Macintosh operating system, or UNIX, allow machines to perform their basic functions—accepting input, running programs, and sending output to users. Applications programs, such as word processors, Internet browsers, electronic mail programs, or database management programs, do the work required by computer users.

Digital computers use data that has been encoded as a series of zeros and ones—binary digits, or bits. Text, images, sounds, motion pictures, and other media can all be represented as strings of zeros and ones and processed by digital computers. Programs—the instructions on how to manipulate data—are also represented in binary form. The earliest digital computers were designed to store and manipulate numbers and the letters of the alphabet found on typewriter keyboards. The American Standard Code for Information Interchange (ASCII) uses 128 combinations of bits to represent the letters, numbers, and symbols on a typewriter keyboard. Plain text worked well when computers were used primarily for mathematics.
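
As a simple illustration, the short Python sketch below (the two-letter message is invented) prints the ASCII code and bit pattern for each character:

```python
# A small illustration of text stored as binary digits: each ASCII character
# is one of 128 seven-bit patterns (shown here padded to eight bits).
message = "OK"
for ch in message:
    code = ord(ch)                      # the ASCII code, e.g. 'O' is 79
    print(ch, code, format(code, "08b"))
# Prints:
# O 79 01001111
# K 75 01001011
```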

Binary numbers can represent visual and audio information as well. By the end of the 1980s, designers had expanded the coding systems to store drawings, photographs, sounds, and moving pictures. Each dot on a screen is called a "picture element" (or "pixel"). To display graphics on the screen, computers use groups of binary numbers—ones and zeros—to represent the color, intensity of light, and position of each pixel.
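
The following Python sketch offers a rough illustration; the image it builds is a tiny invented one, with every pixel set to the same color, and each color value is shown as eight binary digits:

```python
# A rough sketch of pixels held as binary numbers: each pixel stores red,
# green, and blue intensities, commonly one byte (eight bits) apiece.
width, height = 4, 2
# A tiny invented image: every pixel set to the same orange color.
image = [[(255, 165, 0) for _ in range(width)] for _ in range(height)]

r, g, b = image[0][0]
print(format(r, "08b"), format(g, "08b"), format(b, "08b"))
# Prints: 11111111 10100101 00000000  (24 bits describing one pixel's color)
```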

Modern computers almost always use some type of GUI. Programmers use small graphics called "icons" to represent a program, a document, a movie, or a musical work. When a user selects an icon, the computer opens the file or program that is associated with it. This way of treating programs and documents as on-screen objects is closely related to object-oriented programming.

When the price of computers dropped, it became possible to distribute work among several machines on a network instead of using a large central computer. A piece of software called a "server" could now send information to smaller programs called "clients" located at the workstations. Shared files remain on large computers called "file servers," so several users can access them at once. Internet browsers, such as Netscape and Internet Explorer, are good examples of "client/server" design at work, where the browser is a client and an Internet site hosts the server software and the large files of information.
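
A client/server exchange can be sketched in a few lines. In the Python example below, the local port number and the message are arbitrary choices for the demonstration; a tiny server runs in one thread, and the client sends it a request and prints the reply:

```python
# A minimal client/server sketch using Python's standard library.
# The local port and the message are arbitrary choices for the demonstration.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007

def server():
    # The server waits for one client, reads its request, and sends a reply.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode()
            conn.sendall(f"Server received: {request}".encode())

threading.Thread(target=server, daemon=True).start()
time.sleep(0.5)  # give the server a moment to start listening

# The client connects, sends a request, and prints the server's response.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect((HOST, PORT))
    client.sendall(b"hello")
    print(client.recv(1024).decode())  # -> Server received: hello
```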

There are many programming languages, each better at addressing certain types of problems. The Formula Translation language (FORTRAN) was developed to handle scientific problems. The Beginner's All-purpose Symbolic Instruction Code (BASIC) and the Common Business-Oriented Language (COBOL) were better suited to office automation. The languages C, C++, Java, and Visual Basic use libraries of small, interchangeable programs that perform frequently required tasks, such as sorting items or displaying them on a screen. Programmers can combine these small programs into more complex systems, which allows new applications to be built quickly. Other languages, such as Prolog and LISP, were invented for work in artificial intelligence, while Ada was designed to address military needs.
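
The idea of assembling programs from ready-made routines can be shown briefly. In the Python sketch below, which uses invented inventory data, the sorting and the formatted display are handled by standard library routines that the programmer simply combines:

```python
# A brief sketch of building on ready-made routines: sorting and formatted
# display are standard library features, and the programmer only combines
# them. The inventory data is invented for the example.
inventory = [("stapler", 4), ("paper", 120), ("toner", 12)]

# sorted() is a reusable routine; the key function is the only custom piece.
by_quantity = sorted(inventory, key=lambda item: item[1])

for name, quantity in by_quantity:
    print(f"{name:10} {quantity:5}")
```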

Once personal computers were available, the demand for special software packages or "applications" increased. Spreadsheets, such as the early SuperCalc and the later Excel, have simplified accounting and statistical processes, and they allow users to try out various financial scenarios. If the costs or quantities of items change, the results appear immediately on the screen. A whole range of database management packages, including dBase, FoxPro, Oracle, and Access, help users do inventories, maintain customer profiles, and more. Because records in one database can be matched against records in another, say a customer demographic file with a warehouse inventory file, businesses can predict supply and demand trends and improve the delivery of goods and services. Geographic information systems, online census data, and telephone directories make it easier to market products in areas where there is demand. Some critics argue that using data for reasons other than those for which it was collected is an invasion of privacy. In many countries, freedom of information and privacy protection laws have been passed to address these issues.
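
Matching records across files works roughly as in the following Python sketch, in which an invented customer file and an invented order file share a customer number, and joining them reveals demand by region:

```python
# A hedged sketch of matching records across files: an invented customer
# file and order file share a customer number, and joining them shows
# demand by region.
customers = {
    101: {"name": "A. Jones", "region": "Midwest"},
    102: {"name": "B. Patel", "region": "South"},
}
orders = [
    {"customer_id": 101, "item": "printer", "qty": 2},
    {"customer_id": 102, "item": "monitor", "qty": 1},
    {"customer_id": 101, "item": "toner", "qty": 6},
]

demand_by_region = {}
for order in orders:
    region = customers[order["customer_id"]]["region"]
    demand_by_region[region] = demand_by_region.get(region, 0) + order["qty"]

print(demand_by_region)  # {'Midwest': 8, 'South': 1}
```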

Computing and Knowledge

Computers have changed the world in which people live and work, and they have provided new ways of thinking about, and making sense of, that world. At the beginning of the twenty-first century, computer science is a mature academic discipline, with almost every university or college offering computer courses.

As an academic subject, computer science may involve information theory, systems analysis, software engineering, electrical engineering, programming, and information studies that examine the use of digital information. Claude Shannon, the founder of information theory, and Warren Weaver published The Mathematical Theory of Communication in 1949. The mathematician Norbert Wiener, who coined the term "cybernetics," showed how computing theories could be applied to problems of communication and control in both animals and machines. Ludwig von Bertalanffy founded general system theory because he saw that large, complex systems did not necessarily behave in the same way that their individual components did. He is considered one of the founders of systems analysis.

Professional associations have also played important roles in the development of computing theory, practice, and standards. The Association for Computing Machinery, the Institute of Electrical and Electronics Engineers, the International Organization for Standardization, and the World Wide Web Consortium are all agencies concerned with computing methods and standards. Less widely known groups, such as the International Society for the Systems Sciences and Computer Professionals for Social Responsibility, concern themselves with professional ethics and the social effects of computing. Computing has its own journals and magazines that are aimed at special groups of professionals and at consumers.

Modern computing researchers come from many backgrounds. In turn, scholars from other areas apply computing theory and systems analysis to their own disciplines—from philosophy to psychology to social work. Centers such as the Media Lab at the Massachusetts Institute of Technology (MIT) or the Xerox Corporation's Palo Alto Research Center bring together experts from many fields to design "neural networks" that simulate aspects of the human brain, to build smaller and faster machines, or to find better ways of managing digital information. Nicholas Negroponte, Marvin Minsky, and their colleagues at the Media Lab are associated with developments in artificial intelligence and robotics.

Some people fear that while computers relieve humans of repetitive tasks, they may also "de-skill" workers who forget how to do such tasks by hand. Others suggest that having to cope with computers on the job adds extra stress, raises expectations of promptness, and requires ongoing retraining of workers. Because computing has made it possible to recombine and repackage stories, pictures, and sounds, some fear that the work of authors may one day be regarded as interchangeable, much like mechanical parts. In addition, as people depend more on computers, they become more vulnerable to system failure. If the world's computers should fail all at once, economic and social chaos might result. A series of Internet "worms" and "viruses" heightened concern over society's dependence on computers during 1999 and 2000. Governments, banks, companies, and individuals worried that the clocks in their computers might fail at the beginning of 2000, but the "Y2K" crisis they feared did not occur.

Computer designers and computer users think about computers in different terms, and they use different jargon. Hackers, who explore aspects of computers that designers could not have foreseen, have their own way of looking at and talking about computers. People who use computers for destructive purposes are more properly called "crackers." Finally, those people who do not have access to computers run the risk of economic and educational hardships.

The Internet and the Future

During the 1970s, the Defense Advanced Research Projects Agency (DARPA)—the central research and development organization for the U.S. Department of Defense—commissioned work on a standard design for its wide area networks, computer connections that could link entire countries or continents. In response, communications standards called the Transmission Control Protocol (TCP) and the Internet Protocol (IP) were developed, and their formal specifications were published in 1981.

Many computer networks, with names such as DECnet, Usenet, and BITNET, were already in operation, but within about a decade, the new standards were adopted around the world. At first, because there were no graphics, the Internet was used for electronic mail and discussions and for text-only services such as Gopher (from the University of Minnesota) and WAIS (Wide Area Information Servers). Then Berners-Lee and his colleagues at CERN, the European nuclear research center in Switzerland, came up with a new set of protocols that could be used to mix pictures and sounds with text and let users locate any document on any network computer anywhere in the world. The result was the World Wide Web.

The Internet has created a new forum for expression and discussion of social issues. In April 2000, for example, the Dalai Lama, in New Delhi, India, was given a demonstration of a website that is intended to provide basic knowledge of a citizen's rights during a police complaint. (Reuters New Media Inc./Corbis)

Briefly, this is how the web works. Every computer on the Internet has a numeric Internet Protocol (IP) address, which looks like four groups of numbers (each between 0 and 255) separated by periods. Because humans would have trouble remembering addresses such as 123.12.245.1, websites also have "domain names," such as "wayne.edu" or "acme.com," which are easier to understand. Scattered around the world, domain name system (DNS) servers provide large, telephone-directory-style lists that map the names to the numbers.
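
In most programming environments, this lookup takes a single call. The Python sketch below asks the domain name system for the numeric address of an example domain:

```python
# A minimal sketch of the name-to-number lookup described above.
# The domain is only an example; the numeric address returned will vary.
import socket

domain = "example.com"
ip_address = socket.gethostbyname(domain)  # ask the DNS for the numeric address
print(domain, "->", ip_address)            # e.g. example.com -> 93.184.216.34
```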

Every item on the web, whether a file of text, a picture, or a sound, can be found and retrieved by its uniform resource locator (URL). A URL contains the domain name of the computer on which the item is stored and, optionally, additional information about the file folders and file names on that computer. Documents on the web, called "pages," are written in the HyperText Markup Language (HTML) and exchanged using the HyperText Transfer Protocol (HTTP).
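
The parts of a URL can be seen by taking one apart. The short Python sketch below uses the standard urllib library to split a made-up URL into the protocol, the domain name, and the folder and file portion described above:

```python
# A short sketch of the parts of a URL, separated with Python's standard
# urllib library. The URL itself is a made-up example.
from urllib.parse import urlparse

url = "http://www.example.edu/library/catalog/search.html"
parts = urlparse(url)

print(parts.scheme)  # http -> the protocol (HTTP)
print(parts.netloc)  # www.example.edu -> the domain name
print(parts.path)    # /library/catalog/search.html -> folders and file name
```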

Berners-Lee (1998) believes that once most of human knowledge is made available over the Internet, and once the Internet becomes the primary way in which individuals communicate with one another, humans will have the wisdom to use computers to help analyze society and to improve it.

While the promise is bright, the Internet presents many challenges for information scientists. While URLs provide a way of locating individual documents anywhere on the network, the web is always in flux, and URLs are quite "volatile" or apt to change from day to day or even from minute to minute. In addition, because material on the web may look highly polished, it is sometimes hard for users to distinguish reliable information from unreliable information. Metadata—data about data—is one of the schemes proposed to reduce confusion. Metadata tags are similar to subject, author, and title entries in a library catalog, and can be written at the top of a web document.
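
As a rough illustration, such tags can be written into the head of a web page. The Python sketch below builds one as a text string; the Dublin Core field names are a widely used convention for web metadata, and the values are invented:

```python
# A hedged sketch of metadata tags written at the top of a web document,
# built here as a Python string. The Dublin Core field names are a real
# convention for web metadata; the values are invented for the example.
page_head = """<head>
  <title>Annual Rainfall Statistics</title>
  <meta name="DC.Title" content="Annual Rainfall Statistics">
  <meta name="DC.Creator" content="A. Author">
  <meta name="DC.Subject" content="rainfall; climate; statistics">
  <meta name="DC.Date" content="2001">
</head>"""
print(page_head)
```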

Increasingly, the computer network is the medium through which scientists assemble and exchange knowledge from many sources and train future generations. The Human Genome Project and simulations that train surgeons or aircraft pilots are examples. Many scholars publish directly to the Internet by posting their discoveries to the World Wide Web, newsgroups, or mailing lists. This speeds the process of information exchange, but since such works are not examined by editors, it also increases the chances of error and makes it harder for readers to determine whether the information is reliable. The need to index and describe web pages has led to the development of metadata as a way of categorizing electronic documents. However, with millions of authors publishing to the web, the task of indexing and describing their work is staggering.

Computers continue to become smaller, less expensive, more powerful, and more essential to society. So far, dire predictions of de-skilled workers or massive unemployment due to an increased use of computers in the workplace have yet to materialize. In the future, computers will be still smaller and many times more powerful as engineers find ways to use nanotechnology to build microscopic machines. Some people predict that computers will eventually use individual molecules, or even subatomic particles, to store and manipulate the ones and zeros that make up digital information.

By building microprocessors into cars, aircraft, and even household devices such as microwave ovens, designers have produced a raft of "smart" devices. Steve Mann and his colleagues at MIT and the University of Toronto have even developed smart clothing, which can detect signs of sudden illness in the wearer. Increasingly, computers will be able to assist people with disabilities. Smart cars and smart houses have obvious social benefits. However, the same technologies can be used to produce smart weapons. Sensors in a smart office can prevent burglaries or announce guests. They can also monitor employees, minute by minute. Will ubiquitous computers have positive or negative effects on society? This is a question for which only the future can provide an answer.

Bibliography

Berners-Lee, Tim. (1998). "The World Wide Web: A Very Short Personal History."

Bertalanffy, Ludwig von. (1976). General System Theory: Foundations, Development, Applications. New York: G. Braziller.

Biermann, Alan W. (1997). Great Ideas in Computer Science: A Gentle Introduction, 2nd edition. Cambridge, MA: MIT Press.

Brookshear, J. Glenn. (1999). Computer Science: An Overview. New York: Addison-Wesley.

Carlson, Tom. (2001). "The Obsolete Computer Museum."

Gardner, Martin. (1982). Logic Machines and Diagrams. Chicago: University of Chicago Press.

Hiltz, Starr Roxanne, and Turoff, Murray. (1993). The Network Nation: Human Communication Via Computer, revised edition. Cambridge, MA: MIT Press.

Kidder, Tracy. (1997). The Soul of a New Machine. New York: Modern Library.

Negroponte, Nicholas. (1995). Being Digital. New York: Vintage Books.

Raymond, Eric S. (1998). The New Hacker's Dictionary. Cambridge, MA: MIT Press.

Shannon, Claude, and Weaver, Warren. (1949). The Mathematical Theory of Communication. Urbana: University of Illinois Press.

Sudkamp, Thomas A. (1996). Languages and Machines: An Introduction to Theory of Computer Science. New York: Addison-Wesley.

Turing, Alan M. (1936). "On Computable Numbers: With an Application to the Entscheidungsproblem." Proceedings of the London Mathematical Society (2nd series) 42:230-265.

Valovic, Thomas. (2000). Digital Mythologies: The Hidden Complexities of the Internet. New Brunswick, NJ: Rutgers University Press.

Wiener, Norbert. (1965). Cybernetics; Or, Control and Communication in the Animal and the Machine, 2nd edition. Cambridge, MA: MIT Press.

CHRISTOPHER BROWN-SYED

TERRI L. LYONS
