Networks and Layers: Technocultural Encodings of the World Wide Web

Ganaele Langlois (York University)

Abstract: This paper calls for a cultural analysis of the World Wide Web through a focus on the technical layers that shape the Web as a medium. Web technologies need to be acknowledged as not only shaped by social processes, but also deployed, through standardization and automation, as agents that render representation possible by following specific technocultural logics. Thus, the focus is on the relationship between hardware, software, and culture through an analysis of the layers of encoding that translate data into discourse, and vice versa.

Résumé : Cet article appelle au développement d'une analyse culturelle du Web axée sur les couches technologiques qui transforment le Web en média. Les technologies Web ne doivent pas seulement être reconnues comme étant façonnées par des processus sociaux, mais aussi comme étant déployées, au travers de phénomènes de standardisation et d'automatisation, comme agents qui rendent toute représentation possible en suivant des logiques technoculturelles spécifiques. Ainsi, la priorité est donnée aux relations entre le matériel informatique, les logiciels et les processus culturels au travers d'une analyse des couches de codage qui traduisent les données informatiques en discours, et vice-versa.

Introduction

The rise of the World Wide Web in the 1990s marked a significant shift in the use of the Internet. It could be said that the ease of use of this service, which combines the communication infrastructure of the Internet with hypertext principles (Berners-Lee, 1999), has played a central role in this development.1 After all, surfing the Web to find information, to entertain oneself, or to buy products does not require an extensive knowledge of the inner workings of computers, as Web browsers use everyday language, not computer language. However, if we stop for a moment and think about the technological apparatus needed for the Web to exist, we end up with a paradox. There is a vast range of technological tools that need to be harmonized to create what we understand as the Web: hardware, protocols for transmission, and the many kinds of software needed on both the production and distribution sides to encode and decode Web languages. These are designed to be hidden, to give users a sense of "instantaneous" communication. The technical assemblage that forms the Web is only felt when communication breaks down, when the server is not working or the browser is not capable of decoding a new language.

This paradox between seeming simplicity and technological complexity forces us to acknowledge the many layers that are required to produce representations on the Web. It forces us to see the Web not as a singular object, but as an assemblage of technical tools, from the hardware, electric signals, algorithms, and protocols for communication to the software, interface, and sites of representation they offer. The theoretical challenge thus consists of analyzing the Web as a medium through an acknowledgment of its construction as a layered technocultural entity. To assess the cultural importance of the Web as a medium, it is necessary to examine what kind of actions and cultural possibilities are embedded in those layers, how they influence each other, and how, especially with the rise of automated, dynamic, and adaptive forms of communication, they give rise to agents that regulate the discursive possibilities of the Web.

How should we consider the layers that constitute the Web? This article argues for the need to go beyond categorizing those layers as hardware and software that are shaped by users. It answers the question of how those layers are made to act as a medium by rejecting a treatment of the layers as separate, hierarchized entities. It calls for a consideration of the links between the layers through an analysis of the junctures where technocultural agencies are negotiated and mediated.

This article uses actor-network theory (ANT) and discusses its links with cultural studies, medium theory, and new media and Internet studies. Actor-network theory, which has its roots in the works of Bruno Latour, Michel Callon, and John Law, focuses on the relationships between technical objects and humans in order to examine how socio-technical networks come into being. Actor-network theory argues that technical objects have a space of agency and power and, as such, should be considered as actors alongside human actors. Moreover, actor-network theory complements medium theory approaches to computer communication, particularly the one offered by Friedrich Kittler (1995). Medium theory primarily considers the characteristics of media technologies and the ways in which they shape information flows and, by extension, knowledge and social relationships (Meyrowitz, 1994). Finally, ANT's focus on the relationships and links between actors complements analyses of new media and underlines the cultural importance of media technologies, such as the ones developed by Lev Manovich (2001), Alexander Galloway (2004), Lawrence Lessig (1999), and Matthew Fuller (2003).

Locating the Web: Media, technology, culture

The main challenge in studying the Web is best problematized in Lev Manovich's statement that new media appear when the "computer is not just a calculator, control mechanism or communication device, but becomes a media processor, a media synthesizer and manipulator" (2001, pp. 25-26). There have been numerous studies of the Web as a medium. Political economy approaches have been useful in demonstrating the shaping of the Internet and the World Wide Web by the market and the state (Lessig, 1999; McChesney, 2000; Mosco, 2004). At the level of content, methodologies such as content analysis and discourse analysis have been adapted to examine the meanings propagated through websites and Web spheres, and new methodologies such as hyperlink analysis (Garrido & Halavais, 2003) have been developed to examine Web-specific textual characteristics. Methodologies drawing on ethnography have been reworked to analyze the social consequences and uses of those textual characteristics (Hine, 2000; Schneider & Foot, 2004) - for instance, in political struggles (Kahn & Kellner, 2004; Meikle, 2002).

This non-comprehensive list of the types of research being undertaken on the World Wide Web has managed to partly adapt Stuart Hall's classic definition of a cultural studies approach to communication (1980). Indeed, Hall's focus on the encoding and decoding of messages invites us to explore the relationships between frameworks of knowledge, relations of production, and the technical infrastructures that shape media messages. Cultural studies approaches to technology have successfully demonstrated that the study of content cannot be separated from the social, political, and economic context of communication. In turn, it is necessary to acknowledge that technologies of communication are not simply carriers of content, but are parts of complex technocultural entities that participate in the material constitution of discourse.

As is suggested by Jennifer Slack's invitation to focus on "the interrelated conditions within which technologies exist" (1989, p. 329), the analysis of the Web as a space of representation and discourse requires an examination of its material and technological basis. Picturing the Web not only as a hypermediated network, but also, and more importantly, as a site of interrelated social and technical processes, opens the way not only for an exploration of the surface of the Web - what appears on our computer screens - but also for a deeper multi-level analysis. In that sense, and to borrow from Latour, we can understand the Web as a "black box." The image of the black box is used to represent a "piece of machinery or a set of commands [that is] too complex" (1987, pp. 2-3). Furthermore, the black box represents the process by which "many elements are made to act as one" (Latour, 1987, p. 131).

On many levels, the World Wide Web is like a black box. We all know what it is, although attempting to describe its characteristics reveals its complexity as a set of technologically mediated cultural relationships: the Web is at the same time a computer network, a cyberspace of virtual relationships, and a giant shopping mall; it represents the end of privacy; the rise of new forms of control, surveillance, and propaganda; the re-fashioning of social and cultural inequalities; as well as the birth of more democratic forms of communication. The call to go beyond and below the surface of the Web, then, allows us to examine the multiple actors, as well as the relationships between those actors, that form the social networks within which the Web is embedded.

Actor-network theory and cultural studies

As its name indicates, actor-network theory examines the relationships among the actors that form socio-technical networks.2 The term "actor" designates not only the human actors in charge of implementing those systems, but also non-human actors, such as technologies, institutions, and discourses (Latour, 1999, pp. 174-215).3 ANT was developed as a form of ethnomethodology within the field of science and technology studies, and it has been mainly used to describe scientific processes (Latour, 1999) and the implementation of technologies, from transportation systems (Latour, 1996) to information systems (Avgerou, Ciborra, & Land, 2005). Cultural studies approaches to technology have engaged with ANT, especially in the works of John MacGregor Wise and Jennifer Daryl Slack (Wise, 1997; Slack, 1989; Slack & Wise, 2002). These types of cultural studies of technology and ANT share a common set of key theoretical inspirations, among which is the rejection of the modernist paradigm that establishes hermetic and essentialist categories - that is, technology versus society, nature versus culture (Latour, 1993; Wise, 1997). The ANT framework recognizes the complexity of assemblages of human and non-human actors through an acknowledgment of the limits of modernist categories. Latour (1993) uses the concept of hybridity to show the impossibility of separating technology, science, nature, culture, and society. To use the example of the Web, the concept of hybridity underlines that the Web is not simply a technology, but is also a cultural artifact and a political and economic entity. The usefulness of ANT within a cultural studies framework lies in the development of analytical tools to account for the multiple facets of this socio-technical network.

Furthermore, ANT's insistence that technological entities should be considered as actors alongside human and other non-human actors leads to a critical assessment of the concept of causality (Wise, 1997). One of the most provocative examples in the examination of human and non-human actors is Latour's Aramis, or the Love of Technology (1996), which not only describes the relationship between the different institutional bodies and human actors that were in charge of implementing a failed transportation system, but also gives voice to the technology itself. What Latour suggests is that the relationship between social agents and technical actors is not mono-causal, but reciprocal and multicausal, thus echoing the concept of articulation as developed by cultural studies.4 What we see as mere technological objects offer constraints and possibilities, and as such, they are best defined as actors that develop spaces of agency. For ANT, the risk in focusing solely on the social agents and cultural processes that shape technologies is that of falling into some form of social determinism, where technology is seen as a "receptacle for social processes" (Latour, 1993, p. 55).

The kind of analysis that ANT promotes seeks to open the black box - that is, the Web as a singular object - in order to examine the network of the many actors that constitute it. ANT invites us to see the Web not as a computer network, but as a socio-technical network that assembles human and non-human actors: computer developers, hardware, technical standards, and protocols, as well as institutional bodies that regulate the architecture of the Web, software, and software developers, users, and so on. As Callon and Latour argue, the concept of actor encompasses "any element which bends space around itself, makes other elements dependent upon itself and translate their will into a language of its own" (Slack & Wise, 2002, p. 489). In tracing the flows of agency that define the actors and their space of agency within a network (Slack & Wise, 2002), the approach developed by ANT is reminiscent of the cultural studies concept of articulation as the "non-necessary connections of different elements that, when connected in a particular way, form a specific unity" (Slack, 1989, p. 331).

For ANT, the description of the connections among actors leads to a critical analysis of the ways in which technological actors express cultural and social concerns. Web hypertextuality offers an illustration of this conception of technology as a site of cultural processes. Tim Berners-Lee, the principal developer of the Web, acknowledges a theoretical and conceptual affiliation with Vannevar Bush and Ted Nelson, who first defined the concept of hypertext as trails of association and non-sequential collaborative writing (Berners-Lee, 1999). The Web as a hypertextual system was first envisioned as an embodiment of this concept of hypertext, where the boundaries of print and the separation between authors and readers would be abolished. However, the hypertextual system that we know on the Web today has failed to fully implement this concept of hypertext. Berners-Lee's explanation for the failure of the Web as a collaborative system is not that it was technically impossible to develop such a system, but that it required "much more of a social change in how people worked" (1999, p. 57). Thus, the link between an ideal of communication (hypertext) and its technical actualization (the HTML and HTTP standards that allow for the creation of hyperlinks on the Web) is negotiated by computer developers, who in turn incorporate broader cultural and institutional processes (the traditional separation between authors and readers and the traditional one-way system of mass communication). ANT's concept of "translation" as the process through which different sets of practices give rise to hybrids that are not only technological, but also social (Latour, 1993) is useful here, as it underscores the ways in which HTML and HTTP express the negotiations between an ideal of communication and the cultural processes that limit this ideal. Latour (1999) also describes this process as one of mediation, where the original meaning - non-sequential collaborative writing - is changed through its material implementation, such that, in this example, the use of hyperlinking on the Web is mostly non-collaborative and sequential.

The conjunction of ANT and a cultural studies approach to technology is thus beneficial for a study of the Web: ANT offers a way of seeing technological layers as technocultural layers (that is, as sites of cultural negotiation), while at the same time, cultural studies can complement ANT by re-incorporating the question of power into ANT's analytical framework (Wise, 1997). In regard to the brief analysis of Web hypertextuality mentioned above, a cultural studies approach that incorporates ANT could examine the broader ideological, economic, and political forces that stall the development of two-way, collaborative communication (for instance, problems raised by copyright and the notion of authorship).

The analysis of the layers that form the Web, however, cannot limit itself to the developmental stages of the Web. Indeed, one of the central questions that remains to be examined is what happens once Web technologies, which are extremely standardized and automated, are deployed throughout society, so that they do not solely belong to their creators, but materialize in specific cultural processes. How are those technological layers made to act as a specific medium with "fixed characteristics" that "make it physically, psychologically and sociologically different from other media" (Meyrowitz, 1993, p. 61)? The problem with ANT is that it focuses on communication technologies as technological actors, not as media actors promoting not only specific rules of transmission but also specific representational - that is, semiotic - systems. In order to address this issue, I look at what medium theory has to say about Web-mediated communication.

Actor-network theory and medium theory

Medium theory has its roots in the works of the Toronto School, particularly those of Harold Innis and Marshall McLuhan, who "developed a way of writing about western civilization by focusing on the media not only as technical appurtenances to society but as crucial determinants of the social fabric" (Carey, 1968, p. 271). Innis (1951) focused on media technologies as producing social biases that favour specific dynamics and specific groups and shape specific monopolies of knowledge, while McLuhan's work (1995) focused on the physiological and psychological effects of media. Innis (1951) argued that writing and print create the possibility for greater territorial extension and control, thus allowing for the creation of political and economic empires that control vast expanses of space. This space bias of writing and print also has consequences for the ways in which knowledge is defined in terms of abstraction and universality. McLuhan (1995) argued that the sensory imbalance towards sight that is created by writing produces a cultural definition of knowledge based on continuity, sequence, and rationalization.

The works of Paul Virilio (2000), Arthur Kroker and Michael Weinstein (1994), and Friedrich Kittler (1990; 1999) can be considered extensions of medium theory. Virilio's and Kroker's analyses of communication networks and of the proliferation of media highlight the rise of an ideology of speed and virtualization, where the boundary between reality and virtual reality is constantly blurred. While their analyses offer numerous insights into some of the cultural consequences of the information age, their tendency is to theorize about a broad configuration of both older media, such as television, and new media rather than to examine the properties of one particular media technology, which is what Kittler does in his examination of the computer as a mode of communication. The approach put forward here with regard to medium theory and the World Wide Web rests on ANT's recommendation that analysis be grounded within a specific technocultural setting.

There is a similarity between ANT and medium theory in their acknowledgment that technologies are neither neutral nor passive, but rather, that they are active in promoting change and establishing new social relationships. However, there are, nonetheless, strong differences between ANT and medium theory. Medium theory has been focused on large-scale social change related to the deployment of different media technologies, while ANT, an ethnomethodology (Latour, 1997), has traditionally been focused on more localized phenomena. Rather than attempting to establish a broad picture of the social impact of the Internet, research that uses ANT has focused on the development of particular information systems within specific organizations (see, for instance, Avgerou, Ciborra, & Land, 2005). Furthermore, ANT is also characterized by its rejection of pre-existing frameworks and categories in favour of learning "from the actors without imposing on them an a priori definition of their world-building capacities" (Latour, 1997, p. 20).

One of the fundamental differences between ANT and medium theory lies in the problem of technological determinism. While Innis' work has been critically assessed as focusing on the cultural consequences of the conjunctures of media technologies and social, political, and economic processes (Buxton, 1998; Wernick, 1999), medium theory, particularly the work of McLuhan, has been criticized for the ways in which it ignores the institutional and cultural context that fosters the development of specific media forms to the detriment of others (Meyrowitz, 1994; Williams, 1975). In the case of a medium theory approach to the Internet, charges of technological determinism have surfaced against the idea that computer networks have ushered in a new cultural order. McLuhan's concept of the global village, for instance, has been revived to express some of the potentialities of information technologies in terms of re-organizing not only modes of communication, but also social relationships and knowledge.

Those types of utopian and dystopian discourses have been rightly criticized for their failure to take into account the context within which new technologies are developed (Mosco, 2004). The critics, however, do not so much deny that communication technologies have an impact, but rather show that there is a need to distinguish between the ideological discourses that are constructed around technologies, the ways in which technologies are appropriated by social and economic forces, and the ways in which technologies sometimes resist those appropriations and create new possibilities. ANT's invitation to examine in detail and without a priori knowledge the relationships that form the networks within which new information technologies are located opens the way for recognition of the complex and paradoxical effects of media technologies. For instance, the Web might be seen as yet another outlet for media giants (McChesney, 2000), at the same time as it offers the possibility for people to express themselves and reach audiences through online forums and blogging. Through the mapping of the flows of agency that circulate between human and technological actors within specific contexts, ANT helps us recognize that there might not be a simple pattern to the relationships between media and culture.

It seems difficult, then, to establish any links between ANT and medium theory, which can be characterized as an attempt to find the essential features of media technologies regardless of their contexts of use and deployment. However, going back to the limits of ANT with regard to the acknowledgment of broader structures of power, there is a need to recognize that although there are problems with essentializing approaches to media, there are some stable features that are established over time. For instance, the uses of the Web might be paradoxical, but representational forms on the Web are fairly harmonized through Web protocols and design conventions. This leads us back to the question of the ways in which the technological layers of the Web offer a certain range of possibilities and delineate the fields of agency within which they are articulated.

A medium theory approach to the World Wide Web calls for an examination of the technical characteristics of the World Wide Web and the ways in which those characteristics offer new cultural possibilities. Medium theory invites us to explore not only what is beyond the surface of Web representations - the social context - but also what is below - the hardware and software that shape what we see on our computer screens as mediated messages. For instance, Kittler's concept of discourse networks as the "networks of technologies and institutions that allow a given culture to select, store and process relevant data" (1990, p. 369) invites us to consider the technical processes of specific forms of communication in order to uncover the ways in which the machinery of computer networks shapes knowledge according to specific technocultural rules. Adapting such an analytical framework to the Web, then, demands an acknowledgment of the processes of computing in terms of their ability to create specific possibilities of communication.

The examination of those possibilities as they are expressed on and through the Web raises the question of how we should apprehend the problem of technological layers. The Web is, after all, only a service allowing for hypertextual communication that makes use of the communication infrastructure offered by the Internet. The Internet, in turn, allows for communication between computers through the definition of specific protocols. At its basis, the Internet makes use of common computing principles that transform electric signals into binary code, which is then processed through algorithms and Boolean algebra. In that sense, it is possible to see the Web as the extension of basic computing principles at a global level. A medium theory approach to the World Wide Web can then be defined as focusing on the cultural and philosophical values embedded in those basic computing principles. For instance, Kien (2002) underlines that the philosophical works on logic by Leibniz and Boole are the basis of computer engineering. The computer as the embodiment of Leibniz's and Boole's scientific methods for assessing truth through the use of monist rationality propagates a new way of seeing the world by transforming everything into arbitrary symbolic signs that can be subjected to a series of calculations. As Bolter (1984) argues, the translation of information into digital code that can be manipulated through algorithms tends to erase the "dividing line between nature and the artificial" (p. 218). Computing principles, then, invite us to conjure the world - and ourselves - as data that can be logically analyzed.
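
To make this point concrete, the short sketch below (written in Python, an illustrative choice not discussed in the sources cited here) shows Boolean algebra at work: propositions and binary code alike are reduced to arbitrary symbols whose "truth" is settled by calculation rather than judgment.

    # Leibniz's and Boole's programme: reduce reasoning to calculation.
    # A state of affairs becomes an arbitrary symbol, and "truth" becomes
    # the output of algebraic operations on those symbols.
    raining = True
    umbrella = False
    stay_dry = (not raining) or umbrella    # truth assessed by calculation

    # Applied to binary code rather than propositions, the same algebra is
    # what the hardware performs on every signal it processes.
    a, b = 0b1010, 0b0110
    print(stay_dry)                  # False
    print(bin(a & b), bin(a | b))    # AND and OR as operations on signs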

As Lister, Dovey, Giddings, Grant, and Kelly (2003) argue, the principle of digitization is "important since it allows us to understand how the multiple operations involved in the production of media texts are released from existing only in the material realm of physics, chemistry and engineering and shift into a symbolic computational realm" (p. 16). This can be seen as an extension of Bolter's argument that computing fosters a blurring of the natural and the artificial. The idea of dematerialization, which is often referred to in characterizing digitization, offers an illustration of this. Dematerialization can be taken as problematic and paradoxical, in that it does not mean the absence of material supports for representation, but rather points to the new material relationships that are established between content and hardware. Computing dematerializes representations through a series of calculations in order to make them readable on a range of devices (PCs, Macs, PDAs, et cetera). A digital picture, then, is not something produced by a camera through chemical processes, but a representation that samples the "real" and that can be easily manipulated.
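
A minimal sketch of this sampling process may help; the signal, sample rate, and quantization levels below are invented for illustration and do not correspond to any particular device.

    import math

    # Digitization as sampling: a continuous signal is measured at discrete
    # moments and quantized into a finite range of integer values.
    SAMPLE_RATE = 8    # samples per "second" (illustrative)
    LEVELS = 16        # quantization levels (4 bits per sample)

    def analog(t):
        """A stand-in for a continuous, material signal (light, sound, etc.)."""
        return (math.sin(2 * math.pi * t) + 1) / 2    # values between 0 and 1

    samples = [round(analog(n / SAMPLE_RATE) * (LEVELS - 1))
               for n in range(SAMPLE_RATE)]

    # The result is no longer tied to any chemical or physical support: the
    # same list of numbers can be copied, displayed, or manipulated anywhere.
    print(samples)    # [8, 13, 15, 13, 8, 2, 0, 2]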

What the concept of dematerialization highlights is that the status of images - and similarly of videos, audio pieces, and texts - is different when mediated by computers. Manovich (2001) offers an illustration of the importance of the binary code when he defines the five principles of new media as numerical representation, modularity, automation, variability, and transcoding. What those principles suggest is that the production of representations through computers makes representations malleable through the artifice of the binary code.

Consequently, a question that is raised relates to the status of ordinary language itself as it is processed through and mediated by computer languages. An illustration of the issues surrounding the encoding of language is the new problematic raised by the production of signs through computing. Following Saussure (1983), a sign is made up of a signifier (i.e., a string of letters) and a signified, the concept that is attached to that specific string of letters. Processing signs through computers requires another layer of mediation, in that the signifier itself needs to be mediated. A word processor, for instance, allows users to create signifiers by recognizing that different keys on the keyboard are associated with specific letters. This operation requires that the act of typing be converted into binary code that is then processed in order for the output - the word on the screen - to appear. In that sense, the seemingly instantaneous act of typing - and, by extension, the seemingly instantaneous act of recording sound with an MP3 recorder or having a picture displayed on the screen of a digital camera - is actually a complex process that requires a mediation of the signifiers. As Kittler's Discourse Networks (1990) suggests, the area of exploration that emerges from this focuses on the ways in which changes in the material basis for representation transform the cultural concept of representation itself, and, by extension, relations of power and what we understand as knowledge.5
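
The following sketch traces this chain of mediation for a single typed letter; the use of Python and of the UTF-8 encoding are assumptions made for the sake of illustration, not details drawn from Kittler or Saussure.

    # The act of typing the letter "a" is mediated several times before a
    # signifier appears on screen: a character code, a byte sequence, a
    # string of bits, and finally a glyph rendered by yet more software.
    typed_key = "a"

    code_point = ord(typed_key)             # 97: the character as a number
    as_bytes = typed_key.encode("utf-8")    # b'a': the number as stored bytes
    as_bits = format(code_point, "08b")     # '01100001': the signifier in binary

    # Only after decoding do the bits reappear as an ordinary-language signifier.
    on_screen = as_bytes.decode("utf-8")
    print(code_point, as_bits, on_screen)   # 97 01100001 a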

It thus becomes necessary to explore the ways in which the World Wide Web extends those principles of malleability, artificiality, and mediation through binary code. While regular users never see the strings of zeros and ones that are processed by the computer, those operations are essential, in that they shape the representations that appear on the computer screen. In that sense, the mediated texts that circulate on the Web can be seen as extensions of monist rationality, Boolean logic, and algorithmic formulas, mixed with the electric signals of the hardware. This allows us to reconsider Manovich's remark about the transition from calculation to representation in a new way. Whereas Manovich (2001) considers this transition in historical and genealogical terms, it also appears that this problematic bridging of mathematics and culture is one of the omnipresent (that is, always necessary) characteristics of new media, including Web communication. In particular, the necessary involvement of mathematical formulas in the production of cultural representations raises questions as to the relationships between information and its material support, as discussed previously with the question of dematerialization. More generally, in the computing process, the mathematical layer becomes a new mediator that encodes physical input and decodes it as a string of signifiers. This mathematical inclusion within the semiotic process was absent in pre-computer forms of printing and writing.

Kittler (1997) goes a step further in examining the relationship between code and culture by declaring that "there is no software" (p. 150). Kittler focuses our attention onto the hardware of the computer, in that "all code operations . . . come down to absolutely local string manipulations and that is, I am afraid, to signifiers of voltage differences" (p. 150). In particular, Kittler usefully points out the ways in which the software layer is constructed so that computer language can appear as everyday language. This prevents us from focusing our attention on the effects of the computer as a medium. As Kittler declares: "What remains a problem is only the realization of these layers which, just as modern media technologies in general, have been explicitly contrived in order to evade all perception. We simply do not know what our writing does" (p. 148).

Kittler's conclusion regarding the unique characteristics of computer communication presents us with several insights, as well as unresolved questions. By focusing on the technical conventions of computer communication, Kittler usefully points out that the study of computer-mediated texts does not consist simply of studying the interface, but more importantly of re-discovering the hidden processes that make the existence of text and discourse possible. As Hayles (2004) argues, we need to recognize that whereas print is flat, code is deep. However, the one limit of Kittler's approach, which is by extension a problem in finding out the unique characteristics of a medium, is a tendency to reduce a complex technical system to one essential operation - that is, the production of electric signals. This is where ANT can be used to investigate the relationships between the elements that form the technical materiality of the Web.

Actor-network theory and new media studies

While the layers of hardware and software that encode knowledge as electric signals and data are invisible to the user, they actually promote specific ways of using and interacting with messages and offer new cultural definitions of knowledge and discourse - as malleable representations, for instance - that are medium-specific. However, it must also be acknowledged that the algorithmic processing of electric signals is only one of the elements that construct a medium such as the World Wide Web. That is, if the World Wide Web establishes rules to transmit and represent data, it should then be looked at in terms of the kinds of principles it propagates through these rules. The process, then, is not one of peeling back the layers to get at some core essential feature, but one of studying their interactions. As Galloway (2004) describes in his analysis of protocol, the exploration of technical standards must take into account the multiple layers of technical encoding that are used to build the World Wide Web:

[T]he content of every protocol is always another protocol. Take, for example, a typical transaction on the World Wide Web. A Web page containing text and graphics (themselves protocological artifacts) is marked up in the HTML protocol. The protocol known as Hypertext Transfer Protocol (HTTP) encapsulates this HTML object and allows it to be served by an Internet host. However, both client and host must abide by the TCP protocol to ensure that the HTTP object arrives in one piece. Finally, TCP is itself nested within the Internet Protocol, a protocol that is in charge of actually moving data packets from one machine to another. Ultimately the entire bundle (the primary data object encapsulated within each successive protocol) is transported according to the rules of the only "privileged" protocol, that of the physical media itself (fibre-optic cables, telephone lines, air waves, etc.). (pp. 10-11)

What Galloway suggests is that an examination of the physical and mathematical structure of the World Wide Web is not enough. A representation on the World Wide Web is produced through the interaction between different layers of codes and protocols, and different layers of hardware and software. The question, then, is not so much one of finding the fundamental technical characteristic of a medium so as to draw some essential cultural characteristic, but one of examining its technical formation in genealogical terms. As the "network" approach developed by ANT suggests, there is a need to problematize the description of the Web as layers of technical processes. That is, it is necessary to investigate not only what forms those layers of hardware and software, but also how they are related to each other and how they potentially influence each other. ANT's invitation to treat technological objects as actors becomes all the more relevant when dealing with a complex automated system such as the Web, and examining the relationships, articulations, and translations among those actors could lead to a better understanding of the characteristics of Web communication.
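
The nesting that Galloway describes can be made visible in a few lines of code. The sketch below is a simplified illustration (in Python; "example.com" is a placeholder host, and error handling is omitted): it hand-writes an HTTP request and sends it over a TCP connection, which the operating system in turn carries over IP, so that each layer wraps, and remains indifferent to, the one above it.

    import socket

    host = "example.com"    # placeholder host, assumed reachable

    # TCP over IP: the operating system negotiates the connection.
    sock = socket.create_connection((host, 80))

    # HTTP: a plain-text protocol that will encapsulate the HTML object
    # served in response.
    request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))    # HTTP reduced to bytes for TCP/IP

    # The response interleaves the layers: HTTP headers first, then HTML -
    # a protocol whose content is, once again, another protocol.
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk
    sock.close()

    headers, _, html = response.partition(b"\r\n\r\n")
    print(headers.decode("ascii", errors="replace"))
    print(html[:200].decode("utf-8", errors="replace"))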

A starting point for examining the layers that constitute the Web is the analysis of the different cultural values that are encoded within the technical objects and processes that form the Web. In that regard, electric signals, algorithms, binary representations, and the Leibnizian and Boolean logic they embody are but one part of the problem. What also needs to be considered, as mentioned earlier, is hypertext as a connection device: it was supposed to be an extension of the mind, but was also re-shaped according to specific power relations (Moulthrop, 1994). Also of importance is the conception of the Internet as a distributed network, an anti-hierarchical structure that seems to embody Deleuze and Guattari's (1987) concept of the rhizome. In taking this approach, the goal is to examine the conjuncture of different technocultural processes and the hybrids they produce.

This cultural level of analysis, however, is not enough by itself. Another way of analyzing the layers that form the Web is to consider the rules of transmission they propagate. At the level of transmission over the Internet, the works of Lessig and Galloway offer a first foray into the space of agency of computer networks defined as networks of struggles and power relationships. In particular, Galloway critically assesses distributed networks such as the Internet by examining the ways in which protocols - the sets of "recommendations and rules that outlines specific standards" (2004, p. 6) - have become sites of struggle. For Galloway, the equation between distributed networks, in which there are "no chains of command, only autonomous agents" (p. 38) operating according to pre-agreed rules, or protocols, and the concept of a free, uncontrolled, and uncontrollable network does not hold. While the concept of protocol is anti-hierarchical, it is still a "massive control apparatus" (p. 243). Because protocol defines the rules of data exchange, its potential can be dangerous. For instance, the protocols that make the Internet an open network are also the ones which allow for something like surveillance to exist. Furthermore, the actors in charge of defining the protocols and rules of Internet communication can also be criticized for representing specific interests. The Internet Corporation for Assigned Names and Numbers (ICANN), for instance, has come under fire for privileging U.S. interests (Wikipedia, 2005a; 2005c). Thus, regulatory bodies can serve specific interests and can re-introduce hierarchical and centralized relationships within distributed networks (Galloway, 2004; Lessig, 1999). The fact that protocol is not only a set of technical standards, but also the site of power struggles thus illustrates the ways in which a study of the underlying technical structure of the World Wide Web is important.

Such approaches to the rules of transmission over the Internet need to be extended to the rules of transmission of the Web and other processes that make computer communication possible. To expand on Galloway's comments on the layers that form the Internet, it is not really a question of "privileging" one protocol over another, but rather of examining the ways in which physical signals are encoded and translated through different protocols. ANT's concept of mediation as the examination of the ways in which an original meaning, goal, or intent is changed and distorted through the process of technical translation is important here (Latour, 1999). That is, there might be a need to use ANT's concept of mediation not only with regard to the relationships between the human and the non-human, but also with regard to the relationships and negotiations between a set of automated non-human actors. In the final instance, the examination of those relationships should operate not only in terms of transmission, but also in terms of representation.

Technocultural layers and the question of representation

At the beginning of this article, it was pointed out that one of the reasons why computer communication is important for media studies is that the computer is not simply a transmission device, but also a device for representation. The question of layers, then, not only concerns the protocols that are used for ensuring communication between computers, but also requires a consideration of the ways in which technical elements participate in the construction of representation. There is a need to understand the relationships between the layer of transmission and the layer of representation. The layer of representation brings us back to the most visible layer of the Web - the interface. While the interface is designed to be manipulated by users, I would like to focus the last part of this article on treating the technological elements that form this layer as actors, and not simply as tools to be used. This is intended to highlight the space of agency of those software actors in order to examine their deployment as communicative agents.

Web standards should not only be analyzed as transmission devices, but also as representational devices. In order to operate this shift, it is also important to consider technical standards and computer processes not only in terms of the control and limits they express, but also in regard to the cultural environments they create. An illustration of this is the Amazon website (www.amazon.com). The Amazon website demonstrates the ways in which technical tools, which automatically process users' surfing and reading preferences, aim to create a qualitative environment through quantitative analysis. The automation and quantification of the traditionally qualitative process of recommending books highlights the different technocultural layers that are examined in this article. At the social level, the experience is both commercial (buying books) and cultural (a search for meaningful artifacts). The hypertextual characteristic of the website adds a multi-level experience that is specific to the Web: the user can search by using trails of association that can follow a specific theme, author, Amazon's recommendations, or other users' recommendations. The technical layer that registers the user's clicks enables this entire cultural experience to be increasingly customized the longer the user surfs on the website. In the end, the user is interacting only with a set of machines that process both personal and social behaviours so as to produce something culturally relevant. The software processes surfing behaviour in order to define the correct cultural answer. In that sense, the software processes users in order to represent them within the cultural space of the website.
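
Amazon's actual recommendation system is proprietary; the sketch below is only a hypothetical illustration of the general logic described above, in which invented click histories are counted so that a "culturally relevant" suggestion can be computed rather than judged.

    from collections import Counter
    from itertools import combinations

    # Invented click histories: which book pages each (anonymous) user visited.
    sessions = [
        ["latour_aramis", "galloway_protocol", "manovich_language"],
        ["galloway_protocol", "kittler_gramophone"],
        ["latour_aramis", "galloway_protocol"],
    ]

    # Quantify association: count how often two titles co-occur in a session.
    co_occurrence = Counter()
    for session in sessions:
        for pair in combinations(sorted(set(session)), 2):
            co_occurrence[pair] += 1

    def recommend(title, k=2):
        """Return the k titles most often viewed alongside the given title."""
        scores = Counter()
        for (a, b), n in co_occurrence.items():
            if a == title:
                scores[b] += n
            elif b == title:
                scores[a] += n
        return [t for t, _ in scores.most_common(k)]

    # The "correct cultural answer" is produced by counting, not by judgment.
    print(recommend("galloway_protocol"))    # e.g., ['latour_aramis', ...]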

The example of Amazon illustrates some of the relationships between the technical apparatus of the Web and the cultural experiences that are propagated through interactions with the medium. The agency of software, then, needs to be fully acknowledged, as software becomes not only the actor with which users have to interact, but also the mediator that defines the conditions and cultural richness of those interactions. This re-casts the analysis of computer code in terms of exploring the ways in which cultural experiences are constructed through a series of encodings that "encapsulate information inside various wrappers, while remaining relatively indifferent to the content of information contained within" (Galloway, 2004, p. xiii). This opens the way for a re-assessment of the relationship between meaning and computing and between media representation and artificial calculation. The position developed by Kittler (1997) when he declares that "there is no software" (p. 150) is that software is just an illusion masking specific technical processes that need to be critically assessed. If the software layer is that which creates the connection between the ordinary languages that are used to articulate representations and the hardware layer of electric connections, its role as yet another mediator needs to be taken into account. If we start from the premise that computer networks such as the World Wide Web become important only when they develop the capacity to encourage the production of meaning, we have to focus on software as being that which fabricates those cultural opportunities out of the hardware.

It is first necessary to further define the differences and relationships between software and hardware. While the "hardware" refers to the physical parts of the computer - from the motherboard to the modem - the "software" refers to the computer programs that are stored in a computer. The system software is defined as the set of computer programs that help run the hardware (for instance, operating systems such as Windows). Earlier parts of this article reviewed the role of system software in terms of its implementation of a specific form of logic, but it is also necessary to focus on software in terms of its signifying function. In that regard, it is important to look at the application software, which is developed so that users can accomplish several tasks (Wikipedia, 2005b).

There are thus several layers of software, and each of those computer programs has specific goals. For instance, the range of software needed to produce a website includes programs to create and edit graphics; an editor that can add the HTML, XML, or JavaScript descriptors; and, sometimes, a program such as Flash to create animation elements. The kind of application software needed by the Web user is a Web browser, which is capable of translating data, codes, and protocols into a graphical user interface that uses ordinary language and thus allows the user to draw meanings from what appears on the screen.
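
As a rough illustration of this translation, the sketch below uses Python's standard html.parser module to turn an invented fragment of markup into a crude outline of nested elements and text - a drastically simplified stand-in for what a browser's rendering engine does.

    from html.parser import HTMLParser

    # An invented fragment of the kind of markup a Web editor produces.
    markup = '<p>Read <a href="http://example.com">this page</a> next.</p>'

    class Outliner(HTMLParser):
        """Translate HTML descriptors into a crude outline of the page."""

        def __init__(self):
            super().__init__()
            self.depth = 0

        def handle_starttag(self, tag, attrs):
            print("  " * self.depth + "<" + tag + "> " + str(dict(attrs)))
            self.depth += 1

        def handle_endtag(self, tag):
            self.depth -= 1

        def handle_data(self, data):
            if data.strip():
                print("  " * self.depth + repr(data))

    # Descriptors become structure; structure, in a real browser, becomes display.
    Outliner().feed(markup)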

Software is important in the study of the World Wide Web, as it "forges modalities of experience - sensoriums through which the world is made and known" (Fuller, 2003, p. 63). There has been a great interest in software in terms of the legal issues that are raised through the copyrighting of software, and the alternatives offered by the open source software movement. Those issues illustrate that software does not simply raise commercial issues; it also raises cultural ones. As software is a technical means through which one can express oneself, it is in some ways akin to the alphabet (Mezrich, 1998). However, software is not simply a means to an end; it is also a computer program that defines the ways in which users can interact with texts. To re-formulate the agency of software in Foucauldian terms, software is part of the assemblage that defines the rules of discourse, and thus delineates a specific range of activities for users. In that sense, it is interesting to note that the field of software analysis seems to have been mostly ignored by cultural studies.

Web design programs such as Dreamweaver would be interesting objects for cultural studies in that they embed some of the conventions of Web presentation by giving the user a determined range of choices for organizing information in Web format: pre-designed pages, framesets, CSS styles, and so on. Design conventions are embedded in the software, propagating specific ways in which information should be packaged. As such, Web design software participates in the development of specific rhetorical strategies for Web texts.

The aesthetic examination of software reveals some of the specific cultural representations that surface on the World Wide Web. Manovich (1999), for instance, places software in a historical perspective by arguing for an understanding of the similarities between the avant-garde movement and the representations that are made through computer software. The presentation of hyperlinked and coexisting multimedia elements, for instance, is reminiscent of "visual atomism," and the windowed presentation of the graphical user interface can be traced back to the montage experiments carried out by Dziga Vertov, among others. In some ways, then, there is an intriguing evolution of the concept of the artifice of the virtual as it changes from the artificial coding of information to the creation of representations that acknowledge their artificial combining of disparate elements in order to foster meanings that can be communicated to users.

Consequently, the form of the Web - its technical structure - influences the content that is being transmitted. The software layer allows for the representation of information and data through metaphors that, as Fuller explains, generate "a way for users to imaginarily map out in advance what functional capacity a device has by reference to a pre-existing apparatus" (2003, p. 100). In the case of the World Wide Web, it is interesting to notice that the experience of surfing is actually an act of "invocation" (Chesher, 1997) or of "calling up HTML files on a computer" (Fuller, 2003, p. 87). The spatial imagery of surfing, and of cyberspace in general, is but "an effect of the bricolage of digital images, text, and other elements linked together by hypertext references" (Shields, 2000, p. 145). The "bricolage" is important here as the process whereby the technocultural assemblages that form the World Wide Web act to represent data and in that sense establish the rules of discourse - "not only the expressive value and formal transformation of discourse, but its mode of existence" (Foucault, 1977, p. 137).

Conclusion

Part of the challenge of attempting a cultural analysis of the World Wide Web lies in examining the paradox of the Web as both socially shaped and culturally distinct through a renewed focus on its technological characteristics. The goal of this article was to analyze the Web in depth, not only by considering it as an assemblage of technocultural layers, but also by examining how, through the conjuncture of those layers, it is made to act as a medium. The use of actor-network theory, in conjunction with cultural studies, medium theory, and new media studies, aimed to bring Web technologies to life so as to acknowledge their social construction as well as their agencies. In so doing, this article presented initial approaches to recognizing the importance of studying the Web as a complex technocultural entity. This complexity comes not only from the contexts within which the Web is deployed, but also from inside, from the many layers of standards and protocols that form the rules of transmission and representation on the Web. Acknowledging the multiplicity of technical processes and examining their relationships appears, in the end, to be a useful area of research for exploring the language of the Web.

Acknowledgments

This paper has benefited from comments by two anonymous reviewers. The author would also like to thank Kim Sawchuk, Steve Bailey, and Barbara Crow for their generous help at various stages of the paper.

Notes

  1. For more information on the history of the development of the Internet and the World Wide Web, see: Net History URL: http://www.nethistory.info/index.html; Hobbes' Internet Timeline URL: http://www.zakon.org/robert/internet/timeline/#Growth; and "Internet Statistics: Growth and Usage of the Web and the Internet" URL: http://www.mit.edu/people/mkgray/net/.

  2. The term "actant" is sometimes used in ANT literature instead of "actor." As Latour (1999) states: "since in English 'actor' is often limited to humans, the word 'actant,' borrowed from semiotics, is sometimes used to include nonhumans in the definition" (p. 303). "Actant" is more precise; however, the majority of the texts cited in this article use the term "actor." Thus, for reasons of consistency, I use the term "actor" in the remainder of this article.

  3. For a comprehensive bibliography on ANT, see the "Actor Network Resource". URL: http://www.lancs.ac.uk/fss/sociology/css/antres/antres.htm.

  4. For a more detailed discussion of the concept of articulation, see Slack (1996); Grossberg (1996); and Grossberg (1987).

  5. I am aware that some authors and works that would be useful for further exploration are not included in this discussion. In particular, Hayles' (2004) work offers a rich analysis of the need to adopt a medium-specific approach within the field of literary analysis. As well, Aarseth (1997) offers an analytical framework for examining hypertext. Finally, the question of materialization/dematerialization has not included a discussion of the changes in perception and the more general question of the relationship between technology and the body, as developed by, for instance, Hansen (2000).

References

Aarseth, Espen. (1997). Cybertext: Perspectives on ergodic literature. Baltimore, MD: Johns Hopkins University Press.

Avgerou, Chrisanti, Ciborra, Claudio, & Land, Frank. (Eds.). (2005). The social study of information and communication technology: Innovation, actors, and contexts. Oxford, U.K.: Oxford University Press.

Berners-Lee, Tim. (1999). Weaving the web: The original design and ultimate destiny of the World Wide Web by its inventor. San Francisco, CA: Harper San Francisco.

Bolter, Jay David. (1984). Turing's man: Western culture in the computer age. Chapel Hill, NC: University of North Carolina Press.

Buxton, William. (1998). Harold Innis' excavation of modernity: The newspaper industry, communications, and the decline of public life. Canadian Journal of Communication, 23(3), 321-339.

Callon, Michel & Latour, Bruno. (1981). Unscrewing the big Leviathan: How actors macro-structure reality and how sociologists help them do so. In Karin D. Knorr-Cetina & Aaron V. Cicourel (Eds.), Advances in social theory and methodology: Toward an integration of micro- and macro-sociologies (pp. 277-303). Boston, MA: Routledge & Kegan Paul.

Carey, James. (1968). Harold Innis and Marshall McLuhan. In Raymond Rosenthal (Ed.), McLuhan: Pro & con (pp. 270-308). New York, NY: Funk and Wagnalls.

Chesher, Chris. (1997). The ontology of digital domains. In David Holmes (Ed.), Virtual politics (pp. 79-92). London, U.K.: Sage.

Deleuze, Gilles & Guattari, Felix. (1987). A thousand plateaus: Capitalism and schizophrenia (Brian Massumi, Trans.). Minneapolis, MN: University of Minnesota Press.

Foucault, Michel. (1977). What is an author? In D. Bouchard & S. Simon (Trans.), Language, counter-memory, practice (pp. 113-138). Ithaca, NY: Cornell University Press.

Fuller, Matthew. (2003). Behind the blip: Essays on the culture of software. Brooklyn, NY: Autonomedia.

Galloway, Alexander. (2004). Protocol: How control exists after decentralization. Cambridge, MA: MIT Press.

Garrido, Maria, & Halavais, Alexander. (2003). Mapping networks of support for the Zapatista movement: Applying social network analysis to study contemporary social movements. In Martha McCaughey & Michael D. Ayers (Eds.), Cyberactivism: Online activism in theory and practice (pp. 165-184). New York, NY: Routledge.

Gray, Matthew. (1996). Internet statistics: Growth and usage of the Web and the Internet. URL: http://www.mit.edu/people/mkgray/net/ [September 12, 2005].

Grossberg, Lawrence. (1987). We gotta get out of this place: Popular conservatism and postmodern culture (pp. 113-130). New York: Routledge.

Grossberg, Lawrence. (1996). On postmodernism and articulation: An interview with Stuart Hall. In Kuan-Hsing Chen (Ed.), Stuart Hall: Critical dialogues in cultural studies (pp. 131-150). London: Routledge.

Hall, Stuart. (1980). Encoding/decoding. In Stuart Hall (Ed.), Culture, media, language: Working papers in cultural studies (pp. 128-138). London, U.K.: Hutchinson.

Hansen, Mark. (2000). Embodying technesis: Technology beyond writing. Ann Arbor, MI: University of Michigan Press.

Hayles, N. Katherine. (2004). Print is flat, code is deep: The importance of media-specific analysis. Poetics Today, 25(1), 67-90.

Hine, Christine. (2000). Virtual ethnography. London, U.K.: Sage.

Innis, Harold. (1951). The bias of communication. Toronto, ON: University of Toronto Press.

Kahn, Richard & Kellner, Douglas. (2004). New media and Internet activism: From the "Battle of Seattle" to blogging. New Media & Society, 6(1), 87-95.

Kien, Grant. (2002). The digital story: Binary code as a cultural text (Unpublished master's thesis). Toronto, ON: York University.

Kittler, Friedrich. (1990). Discourse networks, 1800/1900. (Michael Metteer & Chris Cullen, Trans.). Palo Alto, CA: Stanford University Press.

Kittler, Friedrich. (1997). There is no software. In John Johnston (Ed.), Literature, media, information systems: Essays (pp. 147-155). Amsterdam: G+B International. URL: http://www.ctheory.net/text_file.asp?pick=74 [January 14, 2005].

Kittler, Friedrich. (1999). Gramophone, film, typewriter (Geoffrey Winthrop-Young & Michael Wutz, Trans.). Palo Alto, CA: Stanford University Press.

Kroker, Arthur, & Weinstein, Michael A. (1994). Data trash: The theory of the virtual class. Montréal, PQ: New World Perspectives.

Latour, Bruno. (1987). Science in action: How to follow scientists and engineers through society. Cambridge, MA: Harvard University Press.

Latour, Bruno. (1993). We have never been modern. Cambridge, MA: Harvard University Press.

Latour, Bruno. (1996). Aramis, or the love of technology. (Catherine Porter, Trans.). Cambridge, MA: Harvard University Press.

Latour, Bruno. (1997). On recalling ANT. URL: http://www.lancs.ac.uk/fss/sociology/papers/latour-recalling-ant.pdf [January 14, 2005].

Latour, Bruno. (1999). Pandora's hope: Essays on the reality of science studies. Cambridge, MA: Harvard University Press.

Law, John. (2004). The Actor Network Resource. URL: http://www.lancs.ac.uk/fss/sociology/css/antres/antres.htm [October 12, 2004].

Lessig, Lawrence. (1999). Code and other laws of cyberspace. New York, NY: Basic Books.

Lister, Martin, Dovey, Jon, Giddings, Seth, Grant, Iain, & Kelly, Kieran. (2003). New media: A critical introduction. London, U.K.: Routledge.

Manovich, Lev. (1999). Avant-Garde as software. URL: http://www.manovich.net/ [February 19, 2005].

Manovich, Lev. (2001). The language of new media. Cambridge, MA: MIT Press.

McChesney, Robert. (2000). So much for the magic of technology and the free market. In Andrew Herman & Thomas Swiss (Eds.), The World Wide Web and contemporary cultural theory (pp. 5-36). New York, NY: Routledge.

McLuhan, Marshall. (1995). The Playboy interview. In Eric McLuhan & Frank Zingrone (Eds.), Essential McLuhan (pp. 233-269). Toronto, ON: House of Anansi.

Meikle, Graham. (2002). Future active: Media activism and the Internet. Annandale, NSW: Pluto Press.

Meyrowitz, Joshua. (1993). Images of media: Hidden ferment - and harmony - in the field. Journal of Communication, 43(3), 55-66.

Meyrowitz, Joshua. (1994). Medium theory. In D. Crowley & D. Mitchell (Eds.), Communication theory today (pp. 50-77). Cambridge, U.K.: Polity Press.

Mezrich, Jonathan. (1998). Extension of copyrights to fonts: Can the alphabet be far behind? Computer Law Review and Technology Journal, 4(3), 61-67.

Mosco, Vincent. (2004). The digital sublime: Myth, power, and cyberspace. Cambridge, MA: MIT Press.

Moulthrop, Stuart. (1994). Rhizome and resistance: Hypertext and the dream of a new culture. In George Landow (Ed.), Hyper/text/theory (pp. 299-319). Baltimore, MD: Johns Hopkins University Press.

Peter, Ian. (2004). History of the Internet Explained. URL: http://www.nethistory.info/index.html [September 10, 2005].

Saussure, Ferdinand de. (1983). Course in general linguistics (R. Harris, Trans.). London, U.K.: Duckworth.

Schneider, Steven, & Foot, Kirsten. (2004). The web as an object of study. New Media & Society, 6(1), 114-122.

Shields, Rob. (2000). Hypertext links: The ethic of the index and its space-time effects. In Andrew Herman & Thomas Swiss (Eds.), The World Wide Web and contemporary cultural theory (pp. 145-160). New York, NY: Routledge.

Slack, Jennifer Daryl. (1989). Contextualizing technology. In Lawrence Grossberg, Brenda Dervin, Barbara O'Keefe, & Edith Wartella (Eds.), Rethinking communication, Vol. 2: Paradigm exemplars (pp. 329-345). Thousand Oaks, CA: Sage.

Slack, Jennifer Daryl. (1996). The theory and method of articulation in cultural studies. In Kuan-Hsing Chen (Ed.), Stuart Hall: Critical dialogues in cultural studies (pp. 112-130). London: Routledge.

Slack, Jennifer Daryl & Wise, J. MacGregor. (2002). Cultural studies and technology. In Leah A. Lievrouw & Sonia Livingstone (Eds.), Handbook of new media: Social shaping and consequences of ICTs (pp. 485-501). Thousand Oaks, CA: Sage.

Virilio, Paul. (2000). The information bomb. London, U.K.: Verso.

Wernick, Andrew. (1999). No future: Innis, time, sense, and postmodernity. In Charles A. Acland & William B. Buxton (Eds.), Harold Innis in the new century: Reflections and refractions (pp. 261-280). Montréal, QC & Kingston, ON: McGill-Queen's University Press.

Wikipedia. (2005a). ICANN. URL: http://en.wikipedia.org/wiki/ICANN [April 10, 2005].

Wikipedia. (2005b). Software. URL: http://en.wikipedia.org/wiki/Software [April 10, 2005].

Wikipedia. (2005c). Working Group on Internet Governance. URL: http://en.wikipedia.org/wiki/Working_Group_on_Internet_Governance [April 10, 2005].

Williams, Raymond. (1975). Television: Technology and cultural form (pp. 9-31). New York, NY: Schocken Books.

Wise, John MacGregor. (1997). Exploring technology and social space. Thousand Oaks, CA: Sage.

Zakon, Robert Hobbes. (2005). Hobbes' Internet Timeline. URL: http://www.zakon.org/robert/internet/timeline/ [September 10, 2005].


