Before the mainframe became the guiding paradigm for computing, computers took a variety of forms. By the 1970s, however, the mainframe dominated the industry's thinking. Intel had developed a microprocessor that made microcomputers possible, but the company viewed the chip primarily as a general-purpose component to be embedded in a range of different products. Prototypes for smaller computers existed, but on the whole the industry saw no market for less powerful machines.

Digital Equipment Corporation (DEC) was one of the first companies to consider producing and marketing a microcomputer, but its marketing staff plainly doubted that such machines could be sold as consumer products. The mainframe paradigm drove the industry: the prevailing vision was that computer-facilitated services would be offered by mainframes working in conjunction with telecommunications systems. Microcomputers could not match the power of mainframes, and no one believed there would be consumer demand for less powerful machines. DEC eventually dropped its microcomputer project.

By the mid-1970s, electronics enthusiasts had begun to experiment with microprocessors. Because microchips were not easily acquired by the wider public, most of those with access had some affiliation with the computer industry (Steve Jobs, for example, initially worked at Atari). Given these links between the hobbyists and the organizations they worked for, it would be inaccurate to say that microcomputers were developed entirely by grassroots users. By 1975, popular electronics magazines were fostering broader interest in computers. Computers were presented as the next major technological challenge, leading hobbyists (especially in the United States) to begin experimenting with the technology. Before long, communities of hobbyists began developing.

In the late 1970s, US companies selling microcomputers emerged from this community; among the first was Apple Computer. These early machines were aimed mainly at educational and business uses, and the computer industry still harbored lingering doubts about whether computers could ever reach a mass audience. IBM entered the market in 1981, dubbing the microcomputer the "personal computer."

Following the micro's success among hobbyists, opinions varied widely over what form of the technology would best suit the domestic setting. The predominant view was that the micro should be a machine for running software, but the nature of that software was debated (professional applications versus games). Eventually, most companies came to emphasize the micro as an educational machine, although the concept of "edutainment" soon entered the equation.

The vision of the computer as a "player" for different types of media further shaped the early form of the artifact in the late 1970s and early 1980s, reinforcing the idea of the computer as a delivery system.

In 1983, home computers began selling to mass markets. The boom in sales was accompanied by growth in the industry surrounding home computers (magazines, software houses). By the mid-1980s, however, the boom had stalled, and claims that the home computer was a "fad" began to spread. Through these early years, computer games were one of the factors that sustained steady growth in the home computer market; it was not until much later that the home computer came to be identified as a tool for word processing.

In the mid-1990s, a second boom in home computers occurred, coinciding with the dot-com boom and a proliferation of home computer multimedia equipment (such as audio speakers).

References

Haddon, L. (1988) 'The Home Computer: The Making of a Consumer Electronic', Science as Culture, no. 2, pp. 7–51.