[Andrew Ross, "Hacking Away at the Counter-culture," part 2, continued from ROSS-1 990. Distributed by _Postmodern Culture_ in vol. 1, no. 1 (Sep. 1990); copyright (c) 1990 by Andrew Ross, all rights reserved]

_The Culture and Technology Question_

[31] Faced with these proliferating practices in the workplace, on the teenage cult fringe, and increasingly in mainstream entertainment, where, over the last five years, the cyberpunk sensibility in popular fiction, film, and television has caught the romance of the popular taste for the outlaw technology of human/machine interfaces, we are obliged, I think, to ask old kinds of questions about the new silicon order which the evangelists of information technology have been deliriously proclaiming for more than twenty years. The postindustrialists' picture of a world of freedom and abundance projects a sunny millenarian future devoid of work drudgery and ecological degradation. This social order, cybernetically wired up, is presented as an advanced evolutionary phase of society in accord with Enlightenment ideals of progress and rationality. By contrast, critics of this idealism see only a frightening advance in the technologies of social control, whose owners and sponsors are efficiently shaping a society, as Kevin Robins and Frank Webster put it, of "slaves without Athens" that is actually the inverse of the "Athens without slaves" promised by the silicon positivists.^23^

[32] It is clear that one of the political features of the new post-Fordist order--economically marked by short-run production, diverse taste markets, flexible specialization, and product differentiation--is that the New Right has managed to appropriate not only the utopian language and values of the alternative technology movements but also the marxist discourse of the "withering away of the state" and the more compassionate vision of local, decentralized communications first espoused by the libertarian left. It must be recognized that these are very popular themes and visions (advanced most famously by Alvin Toffler and the neoliberal Atari Democrats, though also by leftist thinkers such as Andre Gorz, Rudolf Bahro, and Alain Touraine)--much more popular, for example, than the tradition of centralized technocratic planning espoused by the left under the Fordist model of mass production and consumption.^24^ Against the postindustrialists' millenarian picture of a postscarcity harmony, in which citizens enjoy decentralized access to free-flowing information, it is necessary, however, to emphasize how and where actually existing cybernetic capitalism presents a gross caricature of such a postscarcity society.

[33] One of the stories told by the critical left about new cultural technologies is that of monolithic, panoptical social control, effortlessly achieved through a smooth, endlessly interlocking system of networks of surveillance.
In this narrative, information technology is seen as the most despotic mode of domination yet, generating not just a revolution in capitalist production but also a revolution in living--"social Taylorism"--that touches all cultural and social spheres in the home and in the workplace.^25^ Through routine gathering of information about transactions, consumer preferences, and creditworthiness, a harvest of information about any individual's whereabouts and movements, tastes, desires, contacts, friends, associates, and patterns of work and recreation becomes available in the form of dossiers sold on the tradable information market, or is endlessly convertible into other forms of intelligence through computer matching. Advanced pattern recognition technologies facilitate the process of surveillance, while data encryption protects it from public accountability.^26^

[34] While the debate about privacy has triggered public consciousness about these excesses, the liberal discourse about ethics and damage control in which that debate has been conducted falls short of the more comprehensive analysis of social control and social management offered by left political economists. According to one marxist analysis, information is seen as a new kind of commodity resource that marks a break with past modes of production and is becoming the essential site of capital accumulation in the world economy. What happens, then, in the process by which information, gathered up by data scavenging in the transactional sphere, is systematically converted into intelligence? A surplus value is created for use elsewhere. This surplus information value is more than is needed for public surveillance; it is often information, or intelligence, culled from consumer polling or statistical analysis of transactional behavior, that has no immediate use in the process of routine public surveillance. Indeed, it is this surplus bureaucratic capital that is used for the purpose of forecasting social futures, and consequently applied to the task of managing the behavior of mass or aggregate units within those social futures. This surplus intelligence becomes the basis of a whole new industry of futures research which relies upon computer technology to simulate and forecast the shape, activity, and behavior of complex social systems. The result is a possible system of social management that far transcends the questions about surveillance that have been at the discursive center of the privacy debate.^27^

[35] To further challenge the idealists' vision of postindustrial light and magic, we need only look inside the semiconductor workplace itself, which is home to the most toxic chemicals known to man (and woman, especially since women of color often make up the majority of the microelectronics labor force), and where worker illness is measured not in quantities of blood spilled on the shop floor but in the less visible forms of chromosome damage, shrunken testicles, miscarriages, premature deliveries, and severe birth defects. In addition to the extraordinarily high stress patterns of VDT operators, semiconductor workers exhibit an occupational illness rate that even by the late seventies was three times higher than that of manufacturing workers, at least until the federal rules for recognizing and defining levels of injury were changed under the Reagan administration. Protective gear is designed to protect the product and the clean room from the workers, and not vice versa.
Recently, immunological health problems have begun to appear that can be described only as a kind of chemically induced AIDS, rendering the T-cells dysfunctional rather than depleting them like virally induced AIDS.^28^ In corporate offices, the use of keystroke software to monitor and pace office workers has become a routine part of job performance evaluation programs. Some 70 percent of corporations use electronic surveillance or other forms of quantitative monitoring on their workers. Every bodily movement can be checked and measured, especially trips to the toilet. Federal deregulation has meant that the limits of employee work space have shrunk, in some government offices, below that required by law for a two-hundred-pound laboratory pig.^29^ Critics of the labor process seem to have sound reasons to believe that rationalization and quantification are at last entering their most primitive phase.

[36] These, then, are some of the features of the critical left position--or what is sometimes referred to as the "paranoid" position--on information technology, which imagines or constructs a totalizing, monolithic picture of systematic domination. While this story is often characterized as conspiracy theory, its targets--technorationality, bureaucratic capitalism--are usually too abstract to fit the picture of a social order planned and shaped by a small, conspiring group of centralized power elites. Although I believe that this story, when told inside and outside the classroom, for example, is an indispensable form of "consciousness-raising," it is not always the best story to tell.

[37] While I am not comfortable with the "paranoid" labeling, I would argue that such narratives do little to discourage paranoia. The critical habit of finding unrelieved domination everywhere has certain consequences, one of which is to create a siege mentality, reinforcing the inertia, helplessness, and despair that such critiques set out to oppose in the first place. What follows is a politics that can speak only from a victim's position. And when knowledge about surveillance is presented as systematic and infallible, self-censoring is sure to follow. In the psychosocial climate of fear and phobia aroused by the virus scare, there is a responsibility not to be alarmist or scared, especially when, as I have argued, such moments are profitably seized upon by the sponsors of control technology. In short, the picture of a seamlessly panoptical network of surveillance may be the result of a rather undemocratic, not to mention unsocialistic, way of thinking, predicated upon the recognition of people solely as victims. It is redolent of the old sociological models of mass society and mass culture, which cast the majority of society as passive and lobotomized in the face of the cultural patterns of modernization. To emphasize, as Robins and Webster and others have done, the power of the new technologies to despotically transform the "rhythm, texture, and experience" of everyday life, and meet with no resistance in doing so, is not only to cleave, finally, to an epistemology of technological determinism, but also to dismiss the capacity of people to make their own uses of new technologies.^30^

[38] The seamless "interlocking" of public and private networks of information and intelligence is not as smooth and even as the critical school of hard domination would suggest.
In any case, compulsive gathering of information is no _guarantee_ that any interpretive sense will be made of the files or dossiers, while some would argue that the increasingly covert nature of surveillance is a sign that the "campaign" for social control is not going well. One of the most pervasive popular arguments against the panoptical intentions of the masters of technology is that their systems do not work. Every successful hack or computer crime in some way reinforces the popular perception that information systems are not infallible. And the announcements of military-industrial spokespersons that the fully automated battlefield is on its way run up against an accumulated stock of popular skepticism about the operative capacity of weapons systems. These misgivings are born of decades of distrust for the plans and intentions of the military-industrial complex, and were quite evident in the widespread cynicism about the Strategic Defense Initiative. Just to take one empirical example of unreliability, the military communications system worked so poorly and so farcically during the U.S. invasion of Grenada that commanders had to call each other on pay phones; ever since then, the command-and-control code of Arpanet technocrats has been C5--Command, Control, Communication, Computers, and Confusion.^31^ It could be said, of course, that the invasion of Grenada did, after all, succeed, but the more complex and inefficiency-prone such high-tech invasions become (Vietnam is still the best example), the less likely they are to be undertaken with any guarantee of success.

[39] I am not suggesting that alternatives can be forged simply by encouraging disbelief in the infallibility of existing technologies (pointing to examples of the appropriation of technologies for radical uses, of course, always provides more visibly satisfying evidence of empowerment), but technoskepticism, while not a _sufficient_ condition of social change, is a _necessary_ condition. Stocks of popular technoskepticism are crucial to the task of eroding the legitimacy of those cultural values that prepare the way for new technological developments: values and principles such as the inevitability of material progress, the "emancipatory" domination of nature, the innovative autonomy of machines, the efficiency codes of pragmatism, and the linear juggernaut of liberal Enlightenment rationality--all increasingly under close critical scrutiny as a wave of environmental consciousness sweeps through the electorates of the West. Technologies do not shape or determine such values. These values already exist before the technologies, and the fact that they have become deeply embodied in the structure of popular needs and desires then provides the green light for the acceptance of certain kinds of technology. The principal rationale for introducing new technologies is that they answer to already existing intentions and demands that may be perceived as "subjective" but that are never actually within the control of any single set of conspiring individuals.
As Marike Finlay has argued, just as technology is only possible in given discursive situations, one of which is the desire of people to have it for reasons of empowerment, so capitalism is merely the site, and not the source, of the power that is often autonomously attributed to the owners and sponsors of technology.^32^

[40] In fact, there is no frame of technological inevitability that has not already interacted with popular needs and desires, no introduction of new machineries of control that has not already been negotiated to some degree in the arena of popular consent. Thus the power to design architecture that incorporates different values must arise from the popular perception that existing technologies are not the only ones, nor are they the best when it comes to individual and collective empowerment. It was this kind of perception--formed around the distrust of big, impersonal, "closed" hardware systems, and the desire for small, decentralized, interactive machines to facilitate interpersonal communication--that "built" the PC out of hacking expertise in the early seventies. These were as much the partial "intentions" behind the development of microcomputing technology as deskilling, monitoring, and information gathering are the intentions behind the corporate use of that technology today. The growth of public data networks, bulletin board systems, alternative information and media links, and the increasing cheapness of desktop publishing, satellite equipment, and international data bases are as much the result of local political "intentions" as the fortified net of globally linked, restricted-access information systems is the intentional fantasy of those who seek to profit from centralized control. The picture that emerges from this mapping of intentions is not an inevitably technofascist one, but rather the uneven result of cultural struggles over values and meanings.

[41] It is in this respect--in the struggle over values and meanings--that the work of cultural criticism takes on its special significance as a full participant in the debate about technology. In fact, cultural criticism is already fully implicated in that debate, if only because the culture and education industries are rapidly becoming integrated within the vast information service conglomerates. The media we study, the media we publish in, and the media we teach within are increasingly part of the same tradable information sector. So, too, our common intellectual discourse has been significantly affected by the recent debates about postmodernism (or culture in a postindustrial world) in which the euphoric, addictive thrill of the technological sublime has figured quite prominently. The high-speed technological fascination that is characteristic of the postmodern condition can be read, on the one hand, as a celebratory capitulation on the part of intellectuals to the new information technocultures. On the other hand, this celebratory strain attests to the persuasive affect associated with the new cultural technologies, to their capacity (more powerful than that of their sponsors and promoters) to generate pleasure and gratification and to win the struggle for intellectual as well as popular consent.

[42] Another reason for the involvement of cultural critics in the technology debates has to do with our special critical knowledge of the way in which cultural meanings are produced--our knowledge about the politics of consumption and what is often called the politics of representation.
This is the knowledge which demonstrates that there are limits to the capacity of productive forces to shape and determine consciousness. It is a knowledge that insists on the ideological or interpretive dimension of technology as a culture which can and must be used and consumed in a variety of ways that are not reducible to the intentions of any single source or producer, and whose meanings cannot simply be read off as evidence of faultless social reproduction. It is a knowledge, in short, which refuses to add to the "hard domination" picture of disenfranchised individuals watched over by some scheming panoptical intelligence. Far from being understood solely as the concrete hardware of electronically sophisticated objects, technology must be seen as a lived, interpretive practice for people in their everyday lives. To redefine the shape and form of that practice is to help create the need for new kinds of hardware and software.

[43] One of the latter aims of this essay has been to describe and suggest a wider set of activities and social locations than is normally associated with the practice of hacking. If there is a challenge here for cultural critics, then it might be presented as the challenge to make our knowledge about technoculture into something like a hacker's knowledge, capable of penetrating existing systems of rationality that might otherwise be seen as infallible; a hacker's knowledge, capable of reskilling, and therefore of rewriting the cultural programs and reprogramming the social values that make room for new technologies; a hacker's knowledge, capable also of generating new popular romances around the alternative uses of human ingenuity. If we are to take up that challenge, we cannot afford to give up what technoliteracy we have acquired in deference to the vulgar faith that tells us it is always acquired in complicity, and is thus contaminated by the poison of instrumental rationality, or because we hear, often from the same quarters, that acquired technological competence simply glorifies the inhuman work ethic. Technoliteracy, for us, is the challenge to make a historical opportunity out of a historical necessity.

_______________________________________________________

NOTES

1. Bryan Kocher, "A Hygiene Lesson," _Communications of the ACM_, 32.1 (January 1989): 3.

2. Jon A. Rochlis and Mark W. Eichen, "With Microscope and Tweezers: The Worm from MIT's Perspective," _Communications of the ACM_, 32.6 (June 1989): 697.

3. Philip Elmer-DeWitt, "Invasion of the Body Snatchers," _Time_ (26 September 1988): 62-67.

4. Judith Williamson, "Every Virus Tells a Story: The Meaning of HIV and AIDS," _Taking Liberties: AIDS and Cultural Politics_, ed. Erica Carter and Simon Watney (London: Serpent's Tail/ICA, 1989): 69.

5. "Pulsing the system" is a well-known intelligence process in which, for example, planes deliberately fly over enemy radar installations in order to determine what frequencies they use and how they are arranged. It has been suggested that Morris Sr. and Morris Jr. worked in collusion as part of an NSA operation to pulse the Internet system, and to generate public support for a legal clampdown on hacking. See Allan Lundell, _Virus! The Secret World of Computer Invaders That Breed and Destroy_ (Chicago: Contemporary Books, 1989), 12-18. As is the case with all such conspiracy theories, no actual conspiracy need have existed for the consequences--in this case, the benefits for the intelligence community--to have been more or less the same.
6. For details of these raids, see _2600: The Hacker's Quarterly_, 7.1 (Spring 1990): 7.

7. "Hackers in Jail," _2600: The Hacker's Quarterly_, 6.1 (Spring 1989): 22-23. The recent Secret Service action that shut down _Phrack_, an electronic newsletter operating out of St. Louis, confirms _2600_'s thesis: a nonelectronic publication would not be censored in the same way.

8. This is not to say that the new laws cannot themselves be used to protect hacker institutions, however. _2600_ has advised operators of bulletin boards to declare them private property, thereby guaranteeing protection under the Electronic Privacy Act against unauthorized entry by the FBI.

9. Hugo Cornwall, _The Hacker's Handbook_, 3rd ed. (London: Century, 1988), 181, 2-6. In Britain, for the most part, hacking is still looked upon as a matter for the civil, rather than the criminal, courts.

10. Discussions about civil liberties and property rights, for example, tend to preoccupy most of the participants in the electronic forum published as "Is Computer Hacking a Crime?" in _Harper's_, 280.1678 (March 1990): 45-57.

11. See Hugo Cornwall, _Data Theft_ (London: Heinemann, 1987).

12. Bill Landreth, _Out of the Inner Circle: The True Story of a Computer Intruder Capable of Cracking the Nation's Most Secure Computer Systems_ (Redmond, Wash.: Tempus, Microsoft, 1989), 10.

13. _The Computer Worm: A Report to the Provost of Cornell University on an Investigation Conducted by the Commission of Preliminary Enquiry_ (Ithaca, N.Y.: Cornell University, 1989).

14. _The Computer Worm: A Report to the Provost_, 8.

15. A. K. Dewdney, the "computer recreations" columnist at _Scientific American_, was the first to publicize the details of this game of battle programs in an article in the May 1984 issue of the magazine. In a follow-up article in March 1985, "A Core War Bestiary of Viruses, Worms, and Other Threats to Computer Memories," Dewdney described the wide range of "software creatures" which readers' responses had brought to light. A third column, in March 1989, was written, in an exculpatory mode, to refute any connection between his original advertisement of the Core War program and the spate of recent viruses.

16. Andrew Ross, _No Respect: Intellectuals and Popular Culture_ (New York: Routledge, 1989), 212. Some would argue, however, that the ideas and values of the sixties counterculture only found their full culmination in groups like the People's Computer Company, which ran Community Memory in Berkeley, or the Homebrew Computer Club, which pioneered personal microcomputing. So, too, the Yippies had seen the need to form YIPL, the Youth International Party Line, devoted to "anarcho-technological" projects, which put out a newsletter called _TAP_ (alternately the Technological American Party and the Technological Assistance Program). In its depoliticized form, which eschewed the kind of destructive "dark-side" hacking advocated in its earlier incarnation, _TAP_ was eventually the progenitor of _2600_. A significant turning point, for example, was _TAP_'s decision not to publish plans for the hydrogen bomb (which the _Progressive_ did)--bombs would destroy the phone system, which the _TAP_ phone phreaks had an enthusiastic interest in maintaining.

17. See Alice Bach's _Phreakers_ series, in which two teenage girls enjoy adventures through the use of computer technology: _The Bully of Library Place_, _Parrot Woman_, _Double Bucky Shanghai_, and _Ragwars_ (all published by Dell, 1987-88).
18. John Markoff, "Cyberpunks Seek Thrills in Computerized Mischief," _New York Times_, November 26, 1988.

19. Dennis Hayes, _Behind the Silicon Curtain: The Seductions of Work in a Lonely Era_ (Boston: South End Press, 1989), 93. One striking historical precedent for the hacking subculture, suggested to me by Carolyn Marvin, was the widespread activity of amateur or "ham" wireless operators in the first two decades of the century. Initially lionized in the press as boy-inventor heroes for their technical ingenuity and daring adventures with the ether, this white middle-class subculture was increasingly demonized by the U.S. Navy (whose signals the amateurs prankishly interfered with), which was crusading for complete military control of the airwaves in the name of national security. The amateurs lobbied with democratic rhetoric for the public's right to access the airwaves, and although partially successful in their case against the Navy, lost out ultimately to big commercial interests when Congress approved the creation of a broadcasting monopoly after World War I in the form of RCA. See Susan J. Douglas, _Inventing American Broadcasting 1899-1922_ (Baltimore: Johns Hopkins University Press, 1987), 187-291.

20. "Sabotage," _Processed World_, 11 (Summer 1984): 37-38.

21. Hayes, _Behind the Silicon Curtain_, 99.

22. _The Amateur Computerist_, available from R. Hauben, P.O. Box 4344, Dearborn, MI 48126.

23. Kevin Robins and Frank Webster, "Athens Without Slaves...Or Slaves Without Athens? The Neurosis of Technology," _Science as Culture_, 3 (1988): 7-53.

24. See Boris Frankel, _The Post-Industrial Utopians_ (Oxford: Basil Blackwell, 1987).

25. See, for example, the collection of essays edited by Vincent Mosco and Janet Wasko, _The Political Economy of Information_ (Madison: University of Wisconsin Press, 1988), and Dan Schiller, _The Information Commodity_ (Oxford: Oxford University Press, forthcoming).

26. Tom Athanasiou and Staff, "Encryption and the Dossier Society," _Processed World_, 16 (1986): 12-17.

27. Kevin Wilson, _Technologies of Control: The New Interactive Media for the Home_ (Madison: University of Wisconsin Press, 1988), 121-25.

28. Hayes, _Behind the Silicon Curtain_, 63-80.

29. "Our Friend the VDT," _Processed World_, 22 (Summer 1988): 24-25.

30. See Kevin Robins and Frank Webster, "Cybernetic Capitalism," in Mosco and Wasko, 44-75.

31. Barbara Garson, _The Electronic Sweatshop_ (New York: Simon & Schuster, 1988), 244-45.

32. See Marike Finlay's Foucauldian analysis, _Powermatics: A Discursive Critique of New Technology_ (London: Routledge & Kegan Paul, 1987). A more conventional culturalist argument can be found in Stephen Hill, _The Tragedy of Technology_ (London: Pluto Press, 1988).