1.3 Definition and Development of the Internet


The purpose of this section is to provide a comprehensive introduction to the technical, systematic, and historical aspects of the Internet. As a social theoretical study, this essay cannot hope to provide the sort of technical detail available in reference manuals, but it is hoped that readers unfamiliar with these aspects will be able to envisage how the system operates and how it interfaces with public policy. Particular difficulties arise, of course, in any attempt to describe the rapidly changing technology and institutional status of the Internet. This section thus begins with a description of the current situation in terms of institutional and technical features before providing a brief history of the Internet.

In the first instance, the key technical feature of the Internet, the TCP/IP protocol, is described in some depth, along with those commonly used components of the Application Layer of the protocol, where a conceptual distinction is made between communication-orientated and information-orientated technologies. Following this, a brief description is given of the major Internet governing institutions, such as the Internet Architecture Board (IAB), the Internet Engineering Task Force (IETF), the Internet Assigned Numbers Authority (IANA) and the Internet Corporation for Assigned Names and Numbers (ICANN). With regard to the history of the Internet, three historical phases are noted, namely a Milnet phase, an ARPANET phase and an Internet phase, with distinct structural differences in the institutional authority governing the Internet, the network topology and, not surprisingly, the shared symbolic values that constitute the cultural use of the Internet and the relevant commercial and political organizations.

Before describing the TCP/IP stack, there is some need to go into the basic details of computer-mediated communication. A computer system, as any contemporary reader should know, is an electronic device capable of manipulating data in an electronic or electro-magnetic form. The typical physical components include a case or chassis, including a power supply; a motherboard, housing essential system components and connections; a processor or central processing unit; memory, in the contemporary form of integrated circuits; and storage, in the form of an electro-magnetic or magneto-optical disk. Further, a variety of input/output devices are used to aid the interface between the human user and the computer and to allow communications. The most common of these devices include monitors, keyboards and mice, printers and scanners, network interface cards and modems. It is the latter two input/output devices that are particularly relevant to this discussion and will be elaborated on further.



Described in this manner, a computer system is merely a shell of physical objects, or hardware. For operation, a computer system also requires programs, or software: sets of instructions utilizing discrete mathematics in the form of a command code. A general typology of programs includes programming languages, for the development of operating systems and applications; operating systems, for the manipulation of files and general utilities; and applications, for the reflexive and administrative work and entertainment of end-users, such as word processing, database and spreadsheet operations, games and communications. Whilst the entire TCP/IP stack will be described, there is a particular emphasis on the Application Layer experienced by most users.

A computer system may be single-user or multi-user, networked or stand-alone. Single-user, or personal, computer systems are a relatively recent invention, first publicized in Popular Electronics in 1975 and available commercially that year as the kit-form Altair 8800. They are distinguished by using one set of user input/output devices. Multi-user computer systems are usually distinguished by a number of user input/output devices connected to a central computer system. A stand-alone computer system is defined by the absence of input/output devices for connectivity and of a protocol to manage that connectivity. A networked computer system includes such hardware and software, namely connectivity devices and a communications network protocol.

A network can be defined as two or more computer systems that share data or hardware resources. It can be as small as two computers which share the same printer or as large as the Internet. To be a network, at least three hardware and software requirements need to be met: i) cables or a wireless connection between the systems and the appropriate input/output devices or ports; ii) a set of communications rules known as a network protocol; and iii) an operating system capable of supporting networking. Networks conceptually exist as a local area network (LAN), usually based on either the Ethernet or Token Ring topology; a wide area network (WAN), consisting of a number of LANs in different locations connected by high-speed fibre optic, satellite or leased 'phone lines; and the Internet. Whilst many connect to the Internet by dial-up accounts through a modem (using PPP, or Point-to-Point Protocol), Integrated Services Digital Network (ISDN), cable television “modems” or Digital Subscriber Line (xDSL), rather than a LAN or WAN, any user of the Internet through the TCP/IP protocol is part of the Internet. Leased lines currently have the most impressive Internet connection bandwidth, with 1.544Mbps for T-1 connections and about 45Mbps for T-3 connections.
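The practical meaning of these line speeds can be sketched with some simple transfer-time arithmetic. The figures below are nominal line rates from the paragraph above; real throughput is lower once protocol overheads are accounted for, and the 56k modem rate is added only for comparison.

```python
# Rough transfer-time arithmetic for nominal line rates (bits per second).
RATES_BPS = {
    "56k modem": 56_000,
    "T-1": 1_544_000,
    "T-3": 45_000_000,
}

def seconds_to_send(size_bytes: int, rate_bps: int) -> float:
    """Time to push a payload down a line: bits to send / bits per second."""
    return size_bytes * 8 / rate_bps

ten_megabytes = 10 * 1024 * 1024
for line, rate in RATES_BPS.items():
    print(f"{line:>9}: {seconds_to_send(ten_megabytes, rate):8.1f} s")
```

The same file that occupies a dial-up modem for half an hour crosses a T-3 in a couple of seconds, which is why leased lines carried the Internet's backbone traffic.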

A network protocol suite is a set of common rules for transferring data. The International Standards Organization and the International Telecommunication Union have developed a seven-layer model for network protocols called the Open Systems Interconnection (OSI) model. The dominant TCP/IP suite was already in existence when OSI was introduced and does not strictly conform to the OSI model; however, there are significant similarities between the two, as would be expected given that they have the same aim. In the TCP/IP model the Application, Presentation and Session Layers of OSI are bundled into a single Application Layer. Further, the Data Link and Physical Layers of OSI are bundled into a Network Access Layer.

TCP/IP is the most popular and common network protocol suite for both local area networks and Internet connection. Its success in both these fields is due to the fact that it was designed in a manner that suited the technical requirements of ARPANET, particularly decentralized end-node verification and dynamic routing, that it was not a proprietary standard, and that specialized gateway connections from the local area network to the Internet were not required. The Internet, as a packet-switched network, began with the Network Control Protocol (NCP), but from 1972 onwards the protocols to replace it were worked on. By 1983 the TCP/IP suite was adopted as a military standard and every host on the ARPANET was required to switch from NCP.

For the purposes of this inquiry, the “Internet” includes the use of NCP for historical purposes, which is technically inaccurate. The study also describes a social theoretical perspective that is appropriate for any type of computer-mediated communication. Indeed, the thesis title could easily be “A Social Theory of Computer-Mediated Communication”, were it not for the additional concerns beyond the communicative aspect (including the dominance of TCP/IP). Other networks such as UUCP (Unix-to-Unix Copy Program), FidoNet, BITNET, AOL (America OnLine) and Compuserve also use computer-mediated communication and indeed are often used for communications to and from the Internet proper. An attempt has been made to describe the various networks available (including the Internet) as “the Matrix”, a term coined by William Gibson. Further, there are LANs (many of which use the TCP/IP protocol) and Bulletin Board Services which do not connect with other networks, let alone the Internet. France's Teletel (commonly known as Minitel) was a very large network, but was technically not part of the Internet itself.

[

John S. Quarterman, Smoot Carl-Mitchell, What Is The Internet, Anyway?, Matrix News, 4(8), 1994, http://www.mids.org/what.html

www.i-minitel.com

]

The United States of America Federal Networking Council has defined the Internet as follows:

The Federal Networking Council (FNC) agrees that the following language reflects our definition of the term "Internet".

"Internet" refers to the global information system that --

(i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons;

(ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and

(iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein.

[FNC Resolution: Definition of "Internet" 10/24/95 <http://www.itrd.gov/fnc/Internet_res.html>]

This is a far superior definition to some that have been attempted within the judicial and legislative systems. For example, the Communications Decency Act of the U.S.A., prior to being determined unconstitutional, described the Internet as: “the international computer network of both Federal and non-Federal interoperable packet switched data networks”, whereas the Children's Online Protection Act (also declared unconstitutional) attempted to define the Internet as “the combination of computer facilities and electromagnetic transmission media, and related equipment and software, comprising the interconnected worldwide network of computer networks that employ the Transmission Control Protocol/Internet Protocol or any successor protocol to transmit the information.”

Both these definitions are flawed for different reasons. The first includes all packet-switched networks, which encompasses Asynchronous Transfer Mode (ATM), Frame Relay and the ITU's X.25, and excludes non-packet-switched networks. The second, whilst restricting the definition to IP networks, includes IP networks that are not globally linked (such as a local area network that is not linked to the Internet). Further, the phrase “any successor protocol” is meaningless – nothing can be defined in terms of what it may become.

[47 U.S.C. § 230(f)(1) (the Communications Decency Act), 47 U.S.C. § 231(e)(3) (the Children's Online Protection Act); for other “stupid” legal definitions see Robert Cannon, Will the Real Internet Please Stand Up: A Quest to Define the Internet, Presented at the Telecommunications Policy Research Conference 2002]

Returning to the comparison between OSI and TCP/IP, the following table provides an illustration of the two models:

OSI and TCP/IP

OSI Model            | TCP/IP Model
---------------------|----------------------
Application Layer    |
Presentation Layer   | Application Layer
Session Layer        |
---------------------|----------------------
Transport Layer      | Transport Layer
---------------------|----------------------
Network Layer        | Internet Layer
---------------------|----------------------
Data Link Layer      | Network Access Layer
Physical Layer       |



Each layer in the TCP/IP stack plays a part in the entire communication process, adding additional information as headers to the data from the Application Layer down the stack, which is then stripped away in reverse order by the receiving computer. A data package created at the Application Layer is referred to as a message. The package at the Transport Layer is referred to as a segment, if it was created by the Transmission Control Protocol, or a datagram, if it was created by the User Datagram Protocol. This classification is maintained at the Internet Layer, and at the Network Access Layer the package is referred to as a frame.
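This layering can be sketched as nested wrapping and unwrapping. The headers below are purely symbolic placeholders, not real wire formats; the point is only the order of encapsulation down the stack and stripping back up it.

```python
# Toy illustration of TCP/IP encapsulation: each layer wraps the data
# from the layer above with its own (here, purely symbolic) header.

def encapsulate(message: bytes) -> bytes:
    segment = b"[TCP-header]" + message          # Transport Layer: segment
    packet = b"[IP-header]" + segment            # Internet Layer: datagram
    frame = b"[Eth-header]" + packet + b"[FCS]"  # Network Access Layer: frame
    return frame

def decapsulate(frame: bytes) -> bytes:
    # The receiving host strips each header in reverse order.
    packet = frame[len(b"[Eth-header]"):-len(b"[FCS]")]
    segment = packet[len(b"[IP-header]"):]
    return segment[len(b"[TCP-header]"):]

wire = encapsulate(b"GET / HTTP/1.0")
print(wire)
print(decapsulate(wire))
```

The original message survives the round trip intact, which is precisely what the stack guarantees to the Application Layer.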

The Network Access Layer provides the interface with the physical network and network adapter, formats the data for the communications medium and provides the physical addresses as well as error-checking. The Internet Layer provides logical addressing so that data can be transmitted among subnetworks with different architectures, and relates physical addresses to these logical addresses. At the Internet Layer, routing is also performed to improve network traffic. At the Transport Layer, error checking, acknowledgement services and flow control are performed. Finally, at the Application Layer, network and file management tools are provided along with Application Programming Interfaces (APIs) that allow programs to be written for particular operating systems.

On the Network Access Layer, the protocol interfaces directly with the network adapter, formats the data received into a frame and converts that frame into the stream of electric pulses that pass through the transmission medium. The Network Access Layer also provides error checking for outgoing frames and acknowledgement of incoming frames (and rejection if the acknowledgement is not received). The Network Access Layer deals with network topologies such as Ethernet, Token Ring and the modem-based PPP.

Within the Internet Layer, a key feature of the TCP/IP protocol is the concept of logical addressing. In addition to using the unique physical address of a network adapter, networks are logically divided into subnets to enhance performance. In TCP/IP the logical address of a computer is the IP address, which is resolved to and from physical addresses through the Address Resolution Protocol and the Reverse Address Resolution Protocol (ARP and RARP). A router, principally using the Internet Control Message Protocol (ICMP), reads the logical addressing information and ensures that data addressed to the local subnet does not cross the router. The numeric IP address is simplified for human usage with a parallel Domain Name System (DNS), whose addresses are those most commonly associated with the Internet; for example, the IP address 128.250.6.182 is also www.unimelb.edu.au.
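This name-to-number resolution can be demonstrated with the operating system's own resolver through Python's standard library. The sketch below uses "localhost", which resolves everywhere without network access; a public name such as the www.unimelb.edu.au example above would be looked up in exactly the same way, with the answer depending on the current DNS records.

```python
import socket

def resolve(name: str) -> str:
    """Forward DNS lookup: hostname -> dotted-quad IP address string."""
    return socket.gethostbyname(name)

# "localhost" always maps to the IPv4 loopback address.
print(resolve("localhost"))
```

From the user's point of view the numeric address and the name are interchangeable; the DNS simply maintains the mapping between them.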

IP host addresses are differentiated according to Class A, Class B and Class C. With Class A networks, the first byte specifies the network portion and the remaining bytes the host portion. There are over 125 Class A networks, each capable of more than 16 million host values, or nodes. With Class B networks, the first two bytes specify the network portion and the last two the host portion. There are more than 16,000 Class B networks, each capable of 65,000 host values. In Class C networks, the first three bytes signify the network portion and the last byte the host portion. There are more than 2,000,000 Class C networks, each with 254 nodes. Class D addresses are not assigned to hosts and are used for multicasts, when a single message is selectively sent to a subset of computers on a network using the Internet Group Management Protocol (IGMP). Class E addresses are experimental.
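Under this classful scheme (since superseded in practice by Classless Inter-Domain Routing), the class of an address is determined entirely by the leading bits of its first octet, which can be sketched as:

```python
# Classful IPv4 addressing: the first octet determines the class.
def ip_class(address: str) -> str:
    first = int(address.split(".")[0])
    if first < 128:
        return "A"   # 0xxxxxxx: 1-byte network, 3-byte host portion
    if first < 192:
        return "B"   # 10xxxxxx: 2-byte network, 2-byte host portion
    if first < 224:
        return "C"   # 110xxxxx: 3-byte network, 1-byte host portion
    if first < 240:
        return "D"   # 1110xxxx: multicast group addresses
    return "E"       # 1111xxxx: experimental

print(ip_class("128.250.6.182"))  # the example address above is Class B
```
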

The Domain Name System provides the means by which a logical demarcation can be made in Internet addresses according to organization (*.com, *.edu, *.gov, *.mil, *.org, *.net, *.int), through the domain name and the top-level domain (TLD). Nation-states have been allocated country-code top-level domains (ccTLDs, such as *.us, *.fr, *.uk, *.ru, *.cn), derived from International Standards Organization standard 3166, which determines two-letter country codes. There is also a network infrastructure *.arpa code. Interestingly, the *.us domain is largely ignored by institutions of the United States, who prefer – in what can be interpreted as a symbolic global claim – to adopt the generic global domain names *.mil, *.edu, *.gov, etc., all of which are under the authority of the United States government and its agencies. Attempts to contact the U.S.A. government root website, www.gov, lead to a redirection to www.gov.com, “a partnership of private enterprise and public-sector government news and information bureaus”.
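The hierarchical demarcation is visible in the name itself, read right to left from the top-level domain. A minimal sketch, using a small illustrative subset of the ccTLDs mentioned above (the full ISO 3166 list is much longer):

```python
# Splitting a fully-qualified domain name into its hierarchical labels.
CC_TLDS = {"us", "fr", "uk", "ru", "cn", "au"}  # illustrative subset only

def parse_domain(fqdn: str) -> dict:
    labels = fqdn.lower().rstrip(".").split(".")
    tld = labels[-1]
    kind = "country-code" if tld in CC_TLDS else "generic"
    return {"tld": tld, "kind": kind, "labels": labels}

print(parse_domain("www.unimelb.edu.au"))
print(parse_domain("www.gov.com"))
```

The *.au example resolves through a country-code hierarchy, whereas www.gov.com sits under the generic *.com domain, illustrating the symbolic distinction discussed above.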

[cf. Jon Postel, 1994, Domain Name System Structure and Delegation, Network Working Group Request for Comments 1591, ISI, http://www.isi.edu/in-notes/rfc1591.txt]

Within the Transport Layer, two protocols are provided, TCP and UDP (User Datagram Protocol). The former is a connection-orientated service, meaning that connection, acknowledgement and disconnection are checked with data transmissions. UDP simply sends the data. This layer also performs the tasks of multiplexing and demultiplexing, whereby several input sources are combined into a single output source and vice-versa. Finally, it is at the Transport Layer that the data security of firewalls is implemented, the means by which particular ports are closed off.

The Application Layer is an assortment of network-aware software that sends information to and from the TCP and UDP ports. These applications are by no means equivalent in either their design or complexity. Applications that connect with TCP/IP include network services, such as file and print services, name resolution services and redirectors. A local operating system may have its own components that assist users with network access, the most common being NetBIOS. Beyond these operating-system and vendor-specific options, there is a range of key applications which are directly part of TCP/IP and which can be differentiated as Connectivity Utilities, File Transfer Utilities, Remote Utilities and Internet Utilities.

Connectivity utilities include IPConfig, which displays a TCP/IP configuration setting; Ping, a simple utility for testing connectivity; ARP/RARP, which allows the user to view the cache that contains the physical address of a remote or local computer; Traceroute, which traces the path of a data package; Route, which allows editing of or additions to a router table; and Netstat, which displays IP, UDP, TCP and ICMP statistics. Among the File Transfer Utilities are FTP (File Transfer Protocol), which transfers files from one computer to another using TCP; TFTP (Trivial File Transfer Protocol), which uses UDP instead; and RCP (Remote Copy Protocol). Among the Remote Utilities are Telnet, used for opening a remote terminal window; Rexec, a utility that runs commands on a remote computer; Rsh (Remote Shell), which invokes the shell on a remote computer to execute a command; and Finger, which displays user information.

It is, however, in the field of Internet Utilities that most users are familiar with Internet connectivity. Here we find the variety of web browsers (Internet Explorer, Netscape, Mozilla, NCSA Mosaic, Opera, Lynx etc.) using the HyperText Transfer Protocol (HTTP) and derivatives; usenet newsreaders (Outlook Express, Agent, Newswatcher, Threaded Internet News) using the Network News Transfer Protocol (NNTP); email clients (Outlook, Lotus Notes, PINE) and servers that operate through the Simple Mail Transfer Protocol (SMTP); directory services (whois, the American Registry for Internet Numbers, the Asia-Pacific Network Information Centre etc.); as well as the Archie and Gopher utilities, which respectively provided indexes of anonymous FTP sites and an information utility, and which, from a user's perspective, were both by-passed by the evolution of the web.
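What every browser does beneath its interface is simply text over a TCP connection. The sketch below stands up a tiny local HTTP server (so that no real website is contacted) and then plays the browser's part with a raw socket; the "hello, web" body and handler are, of course, invented for the example.

```python
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello, web"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):
        pass  # keep the demonstration quiet

server = HTTPServer(("127.0.0.1", 0), Hello)   # port 0: OS picks a port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" side: a raw TCP socket speaking the HTTP text protocol.
host, port = server.server_address
with socket.create_connection((host, port)) as s:
    s.sendall(b"GET / HTTP/1.0\r\nHost: localhost\r\n\r\n")
    reply = b""
    while chunk := s.recv(4096):
        reply += chunk

status_line = reply.split(b"\r\n", 1)[0]
print(status_line)        # the first line of the server's reply
server.shutdown()
```

Everything a graphical browser adds (rendering, caching, history) sits on top of this same request/response exchange at the Application Layer.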

Also worthy of inclusion here are Internet Relay Chat (IRC), Multi-User Domains (MUDs) and ICQ. IRC was originally written in 1988 as a replacement for the Unix “talk” program, a multi-user text message system based on public or private channels. MUDs allow multiple users to interact in a (currently) text-based virtual reality. The first MUD was designed in 1978 and there are now several hundred MUDs in existence, about 70% of them based on fantasy-mythic historical themes. ICQ (“I Seek You”) is a vendor-provided simple message service for computer-to-computer or computer-to-'phone messages. Introduced in 1996, ICQ had achieved an impressive 200,000,000 downloads from the CNET site by May 2002.

[cf. www.icq.com and excerpt from CNET newsletter]

Analytic differences between the types of Internet applications can be determined along the basic dimension of whether they are a communications application or an information application, the basic criterion being that the former allows reciprocal rights of response and input. Communications applications include electronic mail and mailing lists, usenet, IRC and MUDs. Information applications, such as the web and the file transfer protocols, do not allow this automatic reciprocity and require the respondent to establish their own information service to provide an ersatz version of communication. Note, however, that the flexibility of the web has allowed the establishment of communication services (webmail) and, through commercial vendors, the equivalent of usenet (e.g., Topica groups, Yahoo! groups etc.) and of ICQ with various instant messaging systems.

Descriptions of the governance of the Internet as “anarchist”, “decentralized” or “public” appear in a great number of post-structuralist academic and techno-utopian populist texts. For example, the course syllabus for “Anarchy and the Internet” offered through Pitzer College claims: “Although unintended, the Internet is the quintessential example of a large scale anarchist organization. There is no hierarchical authority controlling the internet, the subunits participate voluntarily, information flows freely, individuals join and exit associations at will”. The text in progress, “21C The Electronic BriefingBook” from M.I.T., claims: “The Internet's decentralization is inextricably related to, and made possible by, its architecture – a small number of key principles followed religiously in the net's technical and organizational design, and reflected in its culture”. John Perry Barlow, founder of the Electronic Frontier Foundation, stated in his “A Declaration of the Independence of Cyberspace”:

“We have no elected government, nor are we likely to have one, so I address you with no greater authority than that with which liberty itself always speaks. I declare the global social space we are building to be naturally independent of the tyrannies you seek to impose on us. You have no moral right to rule us nor do you possess any methods of enforcement we have true reason to fear.”

[Dana Ward, Anarchy and the Internet, Syllabus and Readings, 1998 at http://dwardmac.pitzer.edu/dward/classes/Anarchy/anarchyinternet98.html

Sharon Eisner Gillett, Mitchell Kapor, Thomas W. Malone, undated, The Internet: A Study In Decentralized Organization, http://ccs.mit.edu/ebb/pro/inet.html

John Perry Barlow, 1996, A Declaration of the Independence of Cyberspace, http://www.eff.org/~barlow/Declaration-Final.html

]

To be sure, these comments do contain a kernel of truth, certainly from the perspective of a new-comer to the organization and diversity that constitutes the Internet. However, such optimistic assertions about the operations of the Internet and the relationship between its technical and administrative sides are tempered by an examination of the actual governance and authoritative institutions. In terms of an authoritative hierarchy of policy and administration, the government of the United States of America is the senior decision-making apparatus of the Internet, followed by all other nation-states, international treaties, the Internet Assigned Numbers Authority, the Internet Corporation for Assigned Names and Numbers and the Regional Internet Registries, down to individual hosts. In terms of establishing technical standards, the hierarchy runs through the Internet Society, the Internet Architecture Board, the Internet Engineering Task Force, the Internet Engineering Steering Group and the Internet Research Task Force.

To ignore the authoritative rule and technical influence of the United States over the Internet is to ignore the history and the empirical facts of the network. From the origins of the network, funded by the U.S. Government through the Department of Defense, through the development of NCP and the adoption of the TCP/IP suite, to the establishment of the National Science Foundation Network and the National Research and Education Network, the US government has provided the basic funding to ensure the research and viability of a national communications system that has been adopted internationally. The policy of public funding of the network has been, of course, for the reasons of gaining steering performances from co-ordinated improvements in military and educational communications and, more recently, through private funding, in the field of commerce. In addition to financing, the United States government has through direct legislation presented significant laws concerning privacy, encryption, content (especially pornography), copyright, defamation, trademarks, taxation, telecommunications and software licensing. Finally, all major statutory and corporate authorities that manage the Internet come under the legal system of the United States government.

[ www.isoc.org/internet/law/legis.shtml ]

No one can deny the brilliance and forward thinking of a nation-state investing in such a computer network, and likewise every nation-state today, within its ability, has at least the formal sovereign rights to engage in such investment, regulation and legislative frameworks. The sheer economic and technical dominance of the Internet in the United States – which in reality means it is a semi-distributed network, rather than a fully distributed network – makes it quite obvious, however, that the standards adopted there carry far greater authority than those adopted in other nation-states, not least of all the dominance of the American Standard Code for Information Interchange (ASCII), although the adoption of Unicode, or ISO 10646, has significantly altered this. Differences in science and technology policy and public investment, and conflicts between differing legislative systems, have of course raised new issues which will be discussed in greater detail later in this study. This is not the main cause of concern here. The issue being discussed is the authoritative hierarchy of the Internet, which currently rests on the decisions of the Internet Assigned Numbers Authority (IANA).

[Request for Comment 2070; Request for Comment 2277]

In the early days of ARPANET, a single person, Jon Postel, maintained the list of host names and addresses whilst a graduate student at UCLA. As the list grew, some of the administrative work was transferred to SRI; however, Postel still headed what became the Internet Assigned Numbers Authority, which was subsidised by the US government. Over time, the United States government contracted the coordination of IP numbers and domain names through a number of organizations, including IANA. In June 1998, however, the U.S. Government published a “White Paper” on the Internet Domain Name System. Specifically, the National Telecommunications and Information Administration (NTIA) of the United States Department of Commerce issued Docket Number 980212036-8146-02 on “Management of Internet Names and Addresses”. This paper advocated partial privatization of the management of the IP and Domain Name System and increased international participation in the management of the system.

[http://www.ntia.doc.gov/ntiahome/domainname/6_5_98dns.htm

John S. Quarterman, 1998, The U.S. DNS White Paper, Matrix News, 8(6), June 1998, http://www.mids.org/mn/806/dns.html

]

The White Paper acknowledges that “IANA has functioned as a government contractor, albeit with considerable latitude, for some time now.” Part of this latitude apparently included handing over the administration of a large number of IP addresses to RIPE (Réseaux IP Européens), formed by European service providers to allow the operation of the pan-European IP network in 1989, and the Asia Pacific Network Information Centre. Interestingly, RIPE and APNIC continued funding IANA after 1997, when the funding from the US government ceased. These facts, along with the questionable ownership by the United States of ccTLDs, were subsumed with the formation of a new revenue-neutral U.S. corporation, the Internet Corporation for Assigned Names and Numbers, or ICANN.

[The White Paper does state: “Of course, national governments now have, and will continue to have, authority to manage or establish policy for their own ccTLDs.” But this fails to mention that the ultimate control of the IP address space is held by a corporation that operates under the laws of a particular nation-state.]

Today IANA is “dedicated to preserving the central coordinating functions of the global Internet for the public good”, which includes allocating IP addresses from the pool of unallocated addresses to the Regional Internet Registries (APNIC, RIPE NCC, ARIN and LACNIC) according to their needs, maintaining the database of global top-level domain names and country-code top-level domains, and providing a whois service for the domains over which IANA has authority (such as *.int). IANA also acts as the registry for the protocol parameters developed by the IETF and IRTF.

[www.iana.org]

Whilst IANA retains some important roles, most of its power today is primarily titular and marginal. The real authority of the Internet is ICANN. ICANN has responsibility for most IP address space allocation, protocol parameter assignment, domain name system management and root system management. The initial Board of Directors was partially elected by individual members of the Internet, and partially by business, technical and academic interests. ICANN also makes the rather high-sounding (and far-fetched) claim to have: “no inherent authority... no statutory or other government power; its authority is entirely a consequence of voluntary contracts with its consensus policies of the global Internet community. It has no power to force any individual or entity to do anything; its 'authority' is nothing more than the reflection of the willingness of the members of the Internet community to use ICANN as a consensus development vehicle.”

One of the first actions of ICANN was to establish competition for the prestigious *.com domain name. From 1991, a single provider, Network Solutions Incorporated, had provided, with U.S. government approval, a monopoly service for the registration of global domain names. In 1999, ICANN accredited five new competitive registrars (America Online, CORE, France Telecom, Melbourne IT and register.com). The assumption is, of course, that the steering benefits of regulation can be maintained through having a single registry, whilst the competitive production advantages can be gained by having several competing registrars.

[Constant awareness is required, in this consideration of the Internet's authoritative institutions, of the distinction between a registry, which holds the table of IP addresses and domain names, and a registrar, a body which provides the data to be placed in the registry table.]

[www.icann.org]

The Regional Internet Registries comprise the Asia-Pacific Network Information Centre (APNIC), the Réseaux IP Européens Network Coordination Centre (RIPE NCC), the American Registry for Internet Numbers (ARIN) and the Latin American and Caribbean IP address Regional Registry (LACNIC). As the engineering needs for topological IP address assignment were becoming increasingly evident, there were also political and administrative needs for decentralization, including the volume of IP numbers, the distance of the registry from the network, the lack of a global funding system and the lack of local support. Regional Internet Registries face the dual and competing goals of aggregation and conservation of IP numbers whilst maintaining their allocated registry.

The first Regional Internet Registry established was RIPE NCC in 1992, located in Amsterdam, The Netherlands, and shortly afterwards global guidelines were published as a Request For Comments. The following year, APNIC was established in Tokyo, Japan (it moved to Brisbane, Australia in 1998). In the United States, the situation was somewhat different. In 1991, the US government granted a monopoly contract to Network Solutions, Inc. which included IP address registration, domain name registration and support, and a variety of other services. In 1993, the National Science Foundation started a project called InterNIC under a cooperative agreement to provide registration and the allocation of domain names and IP addresses.

In consultation with IANA, the IETF, RIPE, APNIC, the NSF and the US Federal Networking Council, a general consensus was reached to separate the management of domain names from the management of IP numbers, based on the requirement for careful administration of limited IP numbers against the relatively unlimited supply of domain names. As a result, ARIN was established in late 1997 as a nonprofit corporation with an open membership. Whilst ARIN originally also covered a number of African and South American nations, the management of IP numbers for these locations has recently been passed to LACNIC, a recently emerged RIR established in 2000, based in Montevideo, Uruguay, with registry offices in Sao Paulo, Brazil. AfriNIC is an emergent RIR which is almost ready to begin operations.

[RFC 2050

http://www.afrinic.org/]



[Daniel Karrenberg, Gerard Ross, Paul Wilson, Leslie Nobile, 2001, Development of the Regional Internet Registry System, Cisco Internet Protocol Journal, December 2001.

http://www.ripe.net/ripencc/about/regional/rir-system.html

RFC 1366, 1466]

On the authoritative side of technical specifications, the Internet Society reigns supreme. According to its mission statement, the Society exists “to assure the beneficial, open evolution of the global Internet and its related internetworking technologies through leadership in standards, issues, and education”. It is an international, non-profit organization with a membership of 150 organizations and 11,000 individual members in over 180 nation states, who elect the Board of Trustees. The Internet Society provides co-ordination as well as financial and legal support to the Internet Architecture Board, the Internet Engineering Task Force and related groups. The Internet Society has chartered IANA as the clearinghouse to assign and coordinate the use of Internet protocols.

The Society was formed in 1992 by a number of people who had been heavily involved with the Internet Engineering Task Force, which predates the Society. The stated reason was uncertainty over long-term support for the standards-making activity of the IETF, which had previously come from research-supporting agencies of the United States government (ARPA, NSF, NASA and DOE). The Internet Society has membership relations with the International Telecommunications Union for the purposes of co-ordinated activity, especially in regard to the ITU-T (Standardization Sector), the ITU-R (Radio Sector) and the ITU Development Sector. Like other organizations, the Internet Society is an incorporated body under United States law.

[www.isoc.org]

The origin of today's Internet Architecture Board lies in the Internet Configuration Control Board (ICCB), which was created in 1979 by Vint Cerf, at that time program manager at DARPA, to advise him on technical issues. In 1984 the ICCB was disbanded and replaced by the Internet Advisory Board (IAB), concurrent with policy changes in DARPA. When the Internet Society was formed in 1992, the IAB requested that its activities fall under the auspices of the Internet Society, and thus the Internet Advisory Board became the Internet Architecture Board, with greater independence, overseeing the Internet Engineering Task Force, the Internet Research Task Force and the Internet Engineering Steering Group. The Internet Architecture Board also approves the organization that will act as the Request for Comments Editor and sets general RFC policy. The RFC Editor is funded by the Internet Society.

Today the Internet Architecture Board consists of 13 members, of whom seven are drawn from the Internet Engineering Task Force, and is endorsed by the Board of Trustees of the Internet Society. The IAB does not produce technical papers, but rather stimulates action in the IETF and IESG by noting areas of concern. The boundaries are somewhat fuzzy of course, and the IAB often finds itself dealing with the intersection of engineering, political and administrative policy.

[www.iab.org

RFC 1160, written by Vint Cerf

RFC 3160

RFC 1602, IETF and ISOC]

The related Internet Engineering Task Force (IETF) is a community open to any individual interested in the evolution of the Internet's architecture. Its first meeting, in San Diego in 1986, had a mere 21 attendees. The technical work is carried out through working groups and mostly handled through mailing lists. Administration is managed by Area Directors, who are members of the Internet Engineering Steering Group, whilst oversight is provided by the Internet Architecture Board, which also acts as an adjudicating body. As a management group for the IETF, the Internet Engineering Steering Group is responsible for the co-ordination, publishing and procedural administration of the IETF. The IESG consists of the area directors, who are selected by a nominations committee open to those involved in the IETF. The Internet Research Task Force concentrates on long-term research groups related to Internet protocols, applications and architecture. It has its own steering group and its chair is appointed by the Internet Architecture Board.

[http://www.ietf.org/

The Tao of the IETF, published as RFC 3160]

[J. Galvin, 2000, IAB and IESG Selection, Confirmation, and Recall Process: Operation of the Nominating and Recall Committees, Request for Comments 2727, The Internet Society

http://www.irtf.org

RFC 2014]

This brief definition and description of the major technical and administrative characteristics of the Internet does, of course, leave out the historical development. Being limited to authoritative policy, administrative and technical bodies means, for instance, that non-authoritative yet still influential organizations, such as the Electronic Frontier Foundation, have been ignored. Further, cultural interpretation of the “spirit” of the culture associated with computer-mediated communication is, of course, completely absent. Whilst the opening section of this study provided some historical background to the development of the Internet, it is now opportune to bring this into closer focus.

The following analytical phases are suggested for an inquiry into the history of the Internet: a Milnet phase, an ARPANET phase, and an Internet phase. The operating principle in this distinction is the locus of administration of the network, though there are also correlated technical and cultural changes. In general, the proposed historical structure is that the first phase (Milnet) emphasized a technology built on centralized computer systems and was institutionalized under the authority of the U.S. Department of Defence in co-operation with the universities. However, as the widespread existence of an anti-authoritarian “hacker” culture proved problematic and the demands of university research grew at a faster pace than the needs of the military, the two networks separated. This led to an ARPANET/BBS phase, which included the widespread use of personal computers and dial-up Bulletin Board Services, effective network control by the university system with significant internal democracy, and a politically astute cyberpunk subculture. The third, Internet, phase represents the incorporation of other networks into the Internet, and its internationalization, commercialization and widespread adoption through GUI operating systems and a netizen culture.

When the Soviet Union launched Sputnik in 1957, there was widespread fear that the United States was falling behind in the technology stakes. Thus, the U.S. government formed the Advanced Research Projects Agency (ARPA, later the Defense Advanced Research Projects Agency, DARPA). In the cold war environment the U.S. military was concerned with establishing a computer network which could withstand nuclear attack. Baran, in the RAND corporation series 'On Distributed Communications', examined how various communication systems were subject to failure. The report concluded with a system in which there was no central command, all serving points would be equal as contact points, administration was independent, and data travelled in packets. One of Baran's farsighted recommendations was for a computer data public utility: "Is it time now to start thinking about a new and possibly non-existent public utility, a common user digital data communication plant designed specifically for the transmission of digital data among a large set of subscribers?" (4)

The initial plan for ARPANET was circulated at the October 1967 meeting of the Association for Computing Machinery. The first ARPANET machines were Honeywell 516s which, with 12K of memory, were exceptional machines at the time. The initial design networked four sites, with the first installation occurring at UCLA on September 1, 1969. An additional node was added on October 1 with the Stanford Research Institute; the University of California Santa Barbara campus joined on the 1st of November and the University of Utah in December. The four sites, chosen as they were already receiving ARPA contracts, were each given specialist areas to concentrate on: UCLA was to be the Network Measurement Center; SRI, the Network Information Center; UCSB concentrated on interactive mathematics and Utah on graphics. It was hardly a stable system: the first packets, sent from UCLA over a remote connection to SRI on October 29, crashed the system as the letter G of LOGIN was entered.

[Hauben]

In the following year, ARPANET published its original protocol, the ARPANET Host-Host Protocol, which became the Network Control Protocol. In Hawaii, ALOHAnet, the first packet radio network, became operational and was connected to the ARPANET in 1972. In 1971, Project Gutenberg was started by Michael Hart with the purpose of making copyright-free works electronically available. Also in 1971, Ray Tomlinson of BBN invented an email program to send messages across a distributed network as a combination of two other programs: an inter-machine mail system called SENDMSG and an incomplete file transfer program called CPYNET. The following year this was further modified to include the now famous “@” symbol. 1972 also saw the introduction of the International Networking Group with Vinton Cerf as the first chair. This followed the First International Conference on Computer Communications (ICCC) at Washington D.C., with representatives from France, Japan, Norway, Sweden, Great Britain and the United States.

[C.S. Carr, S. Crocker, V.G. Cerf, 1970, "HOST-HOST Communication Protocol in the ARPA Network," in AFIPS Proceedings of SJCC ]

In 1973 the first international connections were established, with University College London via Norway's NORSAR. SRI began publishing ARPANET News, whose readership at that point was estimated at 2,000. However, contrary to the expected instrumental use of shared resources and programs, an ARPA study showed that communications, specifically email, composed 75% of all traffic. 1973 was also the year of the “Christmas Day Lockup”, where a hardware problem in the Harvard IMP caused it to broadcast zero-length distances to every ARPANET destination, thus causing all other IMPs to send their traffic to Harvard.

[AFIPS: "Computer Network Development to Achieve Resource Sharing" 1971]

In 1974 Vinton Cerf and Bob Kahn designed the details of a Transmission Control Protocol (TCP) and BBN started Telenet, the first commercial and public packet data service. The following year saw the management of the ARPANET transferred to the DCA (now DISA) and the formation of the first mailing lists, with the unofficial SF-Lovers list rapidly becoming the most popular. Satellite links were established to Hawaii and the UK, and the first TCP tests were run at Stanford. In 1976 the Unix-to-Unix Copy program (UUCP) was developed, and in 1978 TCP was split into TCP and IP. In 1979, USENET was established over UUCP between Duke University and the University of North Carolina by Tom Truscott, Jim Ellis and Steve Bellovin, with all groups under the net.* hierarchy. In addition, the first Multi-User Dungeon was written by Richard Bartle and Roy Trubshaw at the University of Essex, and the use of emoticons was proposed by Kevin McKenzie.

Evidently a very busy year in the network, 1979 also saw the establishment of the Internet Configuration Control Board (now the Internet Architecture Board), Packet Radio Network experiments and, in what would become a transformative event for the ARPANET, a meeting organized by Larry Landweber between the University of Wisconsin, DARPA, the National Science Foundation and a number of computer scientists to establish a computer science research network. This research network, CSNET (originally called the Computer Science Network but later the Computer and Science Network), was established in 1981, with initial finance provided by the National Science Foundation and with the involvement of the University of Delaware, Purdue University, the University of Wisconsin, the RAND Corporation and BBN. This was the same year that BITNET (“Because It's Time Network”, originally “Because It's There Network” in reference to the free NJE protocols provided with IBM systems) started as a cooperative network at the City University of New York, with connections to Yale University, providing electronic mail, mailing list services and file transfers.

The feeling of impending transformation was evident, though not without difficulties. As an item of historical note, on October 27, 1980 the entire ARPANET was halted by a status-message virus. Nevertheless, in 1981 RFC 801 was published, outlining the planned transition from the Network Control Protocol to the Transmission Control Protocol/Internet Protocol suite. Norway made the transition first, leaving the ARPANET and adopting TCP/IP, along with University College London. One could justifiably argue that Norway and the United Kingdom were therefore connected to the Internet before the United States. But not for long – in the same year the U.S. Department of Defence declared TCP/IP to be the standard.

Elsewhere in continental Europe, network activities were beginning to take hold. France had deployed Teletel (aka Minitel) in 1981 through France Telecom. In 1982, the EUnet (European Unix network) was established to provide email and USENET services with connections between the Netherlands, Denmark, Sweden and the United Kingdom. The following year saw the establishment of the European Academic and Research Network (EARN). With the development of these new networks, it seemed most appropriate that 1982 also saw the publication of RFC 827, the Exterior Gateway Protocol for connections between networks.

1983 was the year of transformation. January 1 was the cutover date from NCP to TCP/IP. A gateway between CSNET and ARPANET was established. The European Movement Information Net (MINET) connected to the Internet. The Internet Activities Board replaced the ICCB. Most tellingly, ARPANET was split into ARPANET and MILNET, with the latter becoming integrated with the Defense Data Network which had been established the previous year. Over half (68 of 113) of the existing hosts went to MILNET. As all this was happening, a protocol was established that would lead to the rapid growth of Bulletin Board Services: the development of FidoNet by Tom Jennings.

It is worthwhile pausing at this stage to consider this technical and institutional evolution with some comments of cultural interpretation. Parallel to these developments was the emergence of a computer culture, primarily among students at M.I.T. This culture, and its associated 'hacker ethic', has been explored by Levy. The key components were:

First, access to computers should be unlimited and total: "Always yield to the Hands-On Imperative!"

Second, all information should be free.

Third, mistrust authority and promote decentralization.

Fourth, hackers should be judged by their prowess as hackers rather than by formal organizational or other irrelevant criteria.

Fifth, one can create art and beauty on a computer.

Finally, computers can change lives for the better.

[Steven Levy, 1984, Hackers: Heroes of the Computer Revolution, Doubleday, p36]

The following comments from Richard Stallman, author of the popular UNIX text-editor EMACS, who was part of the M.I.T. AI Lab in 1971, concur with Levy.

"I don't know if there actually is a hacker's ethic as such, but there sure was an M.I.T. Artificial Intelligence Lab ethic. This was that bureaucracy should not be allowed to get in the way of doing anything useful. Rules did not matter - results mattered. Rules, in the form of computer security or locks on doors, were held in total, absolute disrespect. We would be proud of how quickly we would sweep away whatever little piece of bureaucracy was getting in the way, how little time it forced you to waste. Anyone who dared to lock a terminal in his office, say because he was a professor and thought he was more important than other people, would likely find his door left open the next morning. I would just climb over the ceiling or under the floor, move the terminal out, or leave the door open with a note saying what a big inconvenience it is to have to go under the floor, "so please do not inconvenience people by locking the door any longer." Even now, there is a big wrench at the AI Lab entitled "the seventh-floor master key", to be used in case anyone dares to lock up one of the more fancy terminals."

[http://project.cyberpunk.ru/idb/hacker_ethics.html]

Whilst access to the resources of the mainframe, the iconic computer paradigm of the early days of the Internet, was greatly sought after, the hacker culture emphasized the need for computers to be designed “for the people”. This meant single-user machines, beginning with the Altair 8800 kit computer in 1975, Steve Wozniak's Apple I in 1976 and the Apple II in 1977, and the TRS-80 and the Commodore PET in the same year. Steve Wozniak and Steve Jobs, the founders of Apple Computer, were both previously members of the Homebrew Computer Club and made the original “blue boxes” to hack into the telephone system, using the handles “Oak Toebark” and “Berkeley Blue” respectively. Mimicking telephone signals to obtain free calls had previously been initiated by John Draper, who used a free whistle from a cereal box, thus earning the name “Captain Crunch”. The period also saw the publication of “YIPL/TAP” (Youth International Party Line/Technical Assistance Program) and “2600”, magazines dedicated to marginal appropriation: electrical power through leakage, telephone calls through simulated dial tones, and computer access time through by-passed passwords.

The hacker culture of the early days of the Internet was either antithetical or indifferent to the institutional leadership and the cold-war ideology. It was also, however, independent of the mainstream counter-culture of the time, which emphasized naturalism, often to the point of technophobia. Hackers were unique, the first self-conscious members of the Internet community, technologically savvy and anti-establishment. It is possible to suggest that the transformation of the ARPANET from a military network to an academic one was, at least in part, due to the difficulty of regulating such a culture, particularly its emphasis on communication as the means to instrumental action rather than instrumental action being the sole orientation. With the advent of personal computers and modems, an academic-orientated Internet rather than a military one, a new protocol and transformed institutional authorities, a new era of decentralization, personalization and networking had begun.

In 1984, the Domain Name System was introduced, with the Information Sciences Institute at the University of Southern California given responsibility for its management in 1985 and DNS NIC registrations conducted by the Stanford Research Institute. Also in 1984, the Japan Unix Network (JUNET) was established through UUCP, moderated newsgroups were introduced to USENET and the Soviet Union indicated connectivity to USENET. Canada initiated a one-year plan to connect all its universities, connecting Toronto to BITNET in Ithaca; true to plan, it was finished in 1985.
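The structural change the DNS brought, replacing the single flat hosts.txt file with a hierarchy of delegated zones, can be sketched as a toy resolver. The zone data below is entirely hypothetical, and real resolution also involves record types, caching and referrals omitted here.

```python
# Toy delegation tree: each zone hands its subdomains to the table below it.
# All names and addresses are hypothetical.
ROOT = {
    "edu": {
        "ucla": {"www": "192.0.2.10"},
        "utah": {"www": "192.0.2.20"},
    },
    "com": {
        "example": {"www": "192.0.2.30"},
    },
}

def resolve(name: str) -> str:
    """Walk the tree label by label, from the rightmost (top-level) down."""
    node = ROOT
    for label in reversed(name.split(".")):
        node = node[label]            # KeyError means the name is unknown
    if not isinstance(node, str):
        raise KeyError(name)          # the name is a zone, not a host
    return node

print(resolve("www.ucla.edu"))  # 192.0.2.10
```

The point of the hierarchy is administrative: each zone's table can be maintained by a different authority, whereas hosts.txt required one maintainer, Jon Postel's role noted below, for every host on the network.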

In 1986, the National Science Foundation Network was established with a backbone speed of 56 kbps and five centers established to provide widespread computing power (Princeton, Pittsburgh, the University of California San Diego, Cornell and UIUC). This led to a rapid expansion in network connections, especially among universities. At the same time, the Internet Activities Board established the Internet Engineering Task Force and the Internet Research Task Force, dedicated to open technical development of the network. Cleveland established the first freenet, and USENET underwent the radical “Great Renaming”.

China joined the network with an email exchange with Germany on the 20th of September 1987 using CSNET protocols, and a national US research and educational network was proposed by Gordon Bell to the Office of Science and Technology in response to a congressional request by Al Gore. This network took four years to establish. The following year saw the Robert Morris Internet worm affect some 10% of the entire Internet, and the formation of the Computer Emergency Response Team (CERT) by DARPA in response. 1988 also saw the formation of the Internet Assigned Numbers Authority, with Jon Postel, who had been maintaining the hosts.txt file for years, as Director. Internet Relay Chat was also developed in 1988, and FidoNet gateway tools were established allowing the exchange of email and news.

In 1989, Réseaux IP Européens was formed for the administrative and technical coordination of the European IP network. The first relays between commercial electronic mail carriers and the Internet occurred with MCI Mail, through the Corporation for National Research Initiatives (CNRI), and CompuServe, through Ohio State University. The Corporation for Research and Educational Networking (CREN) was formed through the merger of CSNET and BITNET, and the Australian Academic and Research Network (AARNET) was established by the Australian Vice-Chancellors' Committee (AVCC) and the Commonwealth Scientific and Industrial Research Organisation (CSIRO), with links to NSFNET via Hawaii completed.

1990 was another year of technical and administrative transformation. ARPANET was decommissioned, with the majority of its work, and the network backbone itself, now carried by the NSFNET, and the first commercial dial-up provider of Internet access (world.std.com) was established. 1990 was also the year of “Operation Sundevil”, a crackdown on hackers by the United States Secret Service, which led to the formation of the Electronic Frontier Foundation (EFF), dedicated to protecting and enhancing civil rights on the Internet, by Mitch Kapor, John Perry Barlow and John Gilmore. On an amusing note, the first remotely operated machine, the Internet Toaster, was connected to the Internet by John Romkey.

Again, the opportunity is taken to reflect on the cultural behaviour that operated parallel to these technical and institutional changes. The two key elements here were the massive introduction of personal computers and Bulletin Board Services and the effective transformation of the Internet from a military network to an academic network. These changes created a more convivial environment for the previous hacker culture, which transformed itself, quite self-consciously, into a cyberpunk culture. Initially a literary movement in science fiction, cyberpunk was quickly adopted by computer-savvy youth, particularly those involved in industrial music. Cyberpunks took on board the objectives of the original hackers but with greater political astuteness. Further, because of the science fiction background, cyberpunks had a unified, if dystopic, view of the future, including environmental degradation, rampant and corrupt capitalism, and a transformation of the human through genetic engineering and cybernetics. Under this mental model, cyberpunks saw their underground activities as emancipatory.

[Michael Synergy, in The Mondo 2000 User's Guide To The New Edge, p54]

The self-styled computer underground formed an extensive social network for the exchange of resources and support. Their lack of an organised division of labour and a leadership hierarchy belied claims of modern organized criminality, although their activities did include password breaches, software cracking and piracy as well as credit card fraud. The underground was widely sensationalized by the mass media, who portrayed them with a sense of fear and awe whilst at the same time describing them as maladjusted, pathological and anti-social. The available evidence shows, however, that the culture generated its own merit-based structure of social status and its own magazines (e.g., Phrack, the Legion of Doom Technical Manual, Computer Underground Digest), where the ethics of underground activity were debated with a degree of seriousness that far surpassed anything discussed in the popular media.

[Gordon Meyer, 1989, The Social Organization of the Computer Underground, Masters Thesis, Northern Illinois University, available at: http://sun.soci.niu.edu/theses/gordon

Gordon Meyer, Jim Thomas, 1989, The Baudy World of the Byte Bandit, published in Schmalleger (ed.), Computers in Criminal Justice, Bristol (Ind.): Wyndham Hall. An earlier version of this paper was presented at the American Society of Criminology annual meetings, Reno (November 9, 1989). Available at http://sun.soci.niu.edu/~gmeyer/baudy.html]

On the Internet this cultural shift was also evident. Rather than the prankster activities and technological orientation of the early hackers, there was evidence that new structures and systems would be thoroughly discussed as social as well as technical problems. The most public example of this was the Great Renaming of USENET, from July 1986 to March 1987, and the establishment of the alternative hierarchy. In the first instance, the existing top-level groups were renamed according to the seven main hierarchies that exist today (comp., misc., news., rec., sci., soc., and talk.). A real problem arose, however, when, after the accepted voting method to establish new groups had been followed, administrators of the ARPANET backbone refused to carry the new groups rec.sex and rec.drugs (i.e., the use of sex and drugs as recreational activities). In response, Brian Reid, a member of the backbone, under prompting from Brian Kantor, established a new usenet hierarchy, 'alt' (alternative), for alt.sex, alt.drugs and (for aesthetic purposes!) alt.rock-n-roll.


Reid recounts:


From: [email protected] (Brian Reid)

Message-Id: <[email protected]>

Date: 3 Apr 1988 1754-PST (Sunday)

To: [email protected], [email protected], [email protected]

Subject: Re: soc.sex final results

In-Reply-To: Gene Spafford / Sun, 03 Apr 88 18:22:36 EST. <[email protected]>

To end the suspense, I have just created alt.sex.


That meant that the alt network now carried alt.sex and alt.drugs. It was therefore artistically necessary to create alt.rock-n-roll, which I have also done. I have no idea what sort of traffic it will carry. If the bizzarroids take it over I will rmgroup it or moderate it; otherwise I will let it be.


Brian Reid

T5 (5th thoracic)


"T5" is the name of a vertebra (the 5th thoracic vertebra). This was my attempt to remind these people that I was an official voting member of the backbone.


[Reid, 1993]


This was the breaking of the 'backbone cabal', a phrase that entered Internet history as a play on the term for the main usenet communications 'cable', the 'cabal' being the site administrators on this backbone. The routing around the ARPANET machines was successful, and with dissent growing among cable site administrators, within five months the rest of the ARPANET backbone agreed to carry the alternative hierarchy.

The ARPANET phase of the Internet's history represents an experiment in electronic democracy and collective decision-making on a large scale and through decentralised technologies. Of all the periods of the Internet's history, this was the most democratic to date. The institutional transformation from a military to an academic network was matched by the introduction of a decentralized multiplicity of networks through personal, rather than centralized, computers. The resultant computer culture represented an evolution from the first-phase hackers, with their disrespect for institutional authority and technological orientation. The cyberpunks were politically as well as technologically aware, albeit with a dystopian view, and profoundly interested in social as well as technological solutions. It was under these conditions that a new phase of the Internet was entered into – deinstitutionalized, commercialized and networked – the Internet phase.

Commercial restrictions on the Internet were completely lifted by the National Science Foundation in 1991 and the Commercial Internet Exchange Association (CIX) was formed soon afterwards. The World Wide Web was also developed this year by Tim Berners-Lee and released by the European Organisation for Nuclear Research (CERN). A public key encryption system, Pretty Good Privacy (PGP), was released by Phil Zimmermann, with a method so effective that it would be declared “munitions” by the United States government, which would seek to ban its export. Four years later, Richard White had himself declared a munition by having an RSA file encryption program tattooed on his arm.

The following year saw the reformation of the Internet Activities Board as the Internet Architecture Board and the formation of the Internet Society. In 1993, InterNIC was created by the NSF to provide directory and database services, registration services and information services. Major organizations, such as the United States White House, the World Bank and the United Nations, established a World Wide Web presence, and Internet Talk Radio began broadcasting. With easy-to-use GUI operating systems and web browsers such as Mosaic, business and the media finally started paying attention to the Internet as the World Wide Web recorded a 341,634% annual growth rate in service traffic.

In 1994 the ugly side of commercialization reared its head as the U.S. law firm Canter & Siegel “spammed” the Internet with a mass email and usenet posting campaign advertising “green card” U.S. immigration lottery services. After a little over two years in existence, the World Wide Web became the second most used service on the NSFNET (behind the File Transfer Protocol and above Telnet) based on traffic. The top ten domains by host count were com, edu, uk, gov, de, ca, mil, au, org and net. 1994 also saw the formation of the Trans-European Research and Education Network Association (TERENA) through the merger of RARE and EARN, with representatives from 38 countries.

In 1995 the NSFNET reverted to a research network, with commercial networks now able to provide effective backbone and interconnection services. A new NSFNET was, however, created as the National Science Foundation established a very high speed Backbone Network Service (vBNS) linking super-computer centers. The World Wide Web surpassed FTP as the most used service on the NSFNET, and some of the major commercial non-Internet networks (CompuServe, America Online, Prodigy) started providing Internet access. Further commercialization was evident in the charging for domain names, hitherto free, with a $50 annual fee imposed. The following year 9,272 organizations were delisted by InterNIC for not paying this fee.

With commercialization, the fetishization of domain names began: the domain name tv.com was sold to CNET for $15,000 USD in 1996; the following year business.com was sold for $150,000 USD, and resold in 1999 for $7.5 million USD. Also in 1996, the controversial Communications Decency Act, designed to prohibit the distribution of “indecent” material to minors over the Internet, became law, only to be unanimously declared unconstitutional by the Supreme Court the following year. Internet censorship became a major international issue with regard to religious, political and sexual content. In the realm of the malicious, the New York Public Access Networks Corporation (PANIX) was shut down after repeated attacks, and a cancelbot released on USENET wiped out more than 25,000 messages.

In 1997, ARIN is established to take over the administration and registration of IP numbers, previously handled by Network Solutions through InterNIC. In protest at InterNIC's monopoly, Eugene Kashpureff, owner of AlterNIC, hacks the Domain Name System so that access to www.internic.net is redirected to www.alternic.net. In a particularly bad year for Network Solutions, human error corrupts the DNS tables for .com and .net, making millions of systems unreachable. Hackers evidently direct their attention to more serious targets in 1996 and 1997, with the Indonesian government facing no less than five successful intrusions, N.A.S.A. two, and the U.S. Department of Justice, the U.S. C.I.A., the U.K. Conservative Party and the U.K. Labour Party one each.

Turkmenistan's NIC faces a flood of registrations in 1998 as companies seek to register their names under the .tm ccTLD, the English abbreviation for trademark. In a different type of commercialization, the U.S. Department of Commerce releases its plan to privatize the Domain Name System, which comes into effect with the establishment of the Internet Corporation for Assigned Names and Numbers. A second version of the Communications Decency Act and a ban on Internet taxes are legislated in the United States. In China, Lin Hai is put on trial for “inciting the overthrow of state power” for providing 30,000 email addresses to a U.S. Internet magazine and is later sentenced to two years. The top ten domains by host numbers are: com, net, edu, mil, jp, us, uk, de, ca, and au.

Public access to the Internet is established in Saudi Arabia in 1999 – previously it was restricted to universities and hospitals. ICANN accredits five test registrars to compete in the registry system (AOL, CORE, France Telecom, MelbourneIT and register.com). By the end of the year some 98 companies are involved in competition for domain name registration. A forged Web page made to look like a Bloomberg financial news story raises shares of PairGain Technology by 31% and increases their trading volume sixfold. Politically inspired hacking of websites by activists opposed to the NATO war on Yugoslavia, and by Chinese hackers after the PRC embassy in Belgrade was bombed, sees the White House website down for three days and the Departments of Energy and the Interior sites altered, along with recreation.gov and the City of Los Angeles website. In an unrelated incident, a list of the U.K.'s MI6 agents is published on the web. Activists also attempt to overload the world's financial centers at the time of the G8 summit, but with marginal success. In a less directed manner, the Melissa virus rages through the Microsoft/Outlook world. The main technical news of the year is that Abilene, the Internet2 network, crosses the Atlantic and connects to NORDUnet and the Netherlands' SURFnet.

Politically orientated hacking, hacktivism and malicious hacking continue in the year 2000 with denial of service attacks launched against Yahoo, Amazon and eBay and website hijacks against internet.com, bali.com and web.net. The websites of RSA Security, Apache, Western Union and Microsoft are hacked, and the Love Letter virus propagates in a similar manner to the Melissa virus. As the Napster file sharing site is challenged in the courts, users engage in the world's largest copyright infringement. On the technical side, Internet2 connects to Mexico and California and adopts IPv6, with its enhanced IP number range. Registration of second-level non-English domain names in Chinese, Japanese and Korean characters is conducted by VeriSign with the authorization of the IETF. ICANN allows the establishment of several new gTLDs – .aero, .biz, .coop, .info, .museum, .name and .pro – and transfers the authority of the Australian .au domain to auDA.

In 2001 VeriSign extends its use of multilingual domains to encompass a number of European languages and later the full Unicode character set, incorporating most of the world's languages. The Council of Europe finalizes the first treaty addressing criminal offenses committed over the Internet. Afghanistan's Taliban regime places a total ban on Internet access in an attempt to control content. Meanwhile, Brazil becomes connected to Internet2. The Code Red worm and Sircam virus infiltrate thousands of web servers and email accounts respectively, and in 2002 a distributed denial of service attack strikes the 13 DNS root servers. Early in 2003, the SQL Slammer worm, taking ten minutes to spread worldwide, takes out 5 of the 13 DNS root servers along with tens of thousands of other servers, and affects ATM systems, air traffic control and emergency systems. Meanwhile, the first official online election takes place in Anières, Switzerland.

Providing an account of the third phase of the Internet is made difficult by the fact that it is still occurring. The key technical and administrative changes can nevertheless be stated: the Internet has become a global network, all but replacing other networks and the decentralized, fragmented anarchy of the earlier BBS distribution. The personal computer has been transformed into the networked personal computer. The widespread adoption of the graphical user interface in personal computer operating systems, improved bandwidth and access speeds, the development of the World Wide Web and institutional commercialization have brought the Internet to hundreds of millions, where previously it was the domain of specialists and enthusiasts. As before, this has led to changes in the content and character of Internet culture. The term “netizens” is an accurate description, as it captures both the arrival of net-citizens rather than merely the technically savvy, and the political agenda of citizenship rights on the Internet. Nevertheless, the technical side of the political imperative – “hacktivism” – has become increasingly prevalent as a form of non-violent civil (and criminal) disobedience.


[Michael Hauben, Ronda Hauben, Netizens: On the History and Impact of Usenet and the Internet, IEEE Computer Society Press, 1997; also at: http://www.columbia.edu/~rh120]

The new Internet phase's fiery birth was none other than a widespread attempt to crack down on computer hackers (Operation Sundevil) – a concerted and deceptive attempt to ensure that neither the culture nor the technical behaviour that characterized the cyberpunk scene would be acceptable in the new world of the commercial Internet. Instead, this raised the stakes of civil liberties, and in subsequent years the system and the technology clashed again and again over content, encryption, standards, management and access. Whilst the Operation was mostly a legal failure, the conflict between online civil rights and democratic control has been the most significant conflict on the Internet for more than a decade. The critical questions discussed in the third section of this thesis are these very issues, which continue to strain the relationship between the potential and use of a worldwide communications technology now available with minimal computer skills, and the legal and political attempts to regulate the content, character, style and management of that technology.

No immediate attempt is made here to offer even a preliminary prediction of the outcome of the current situation – such an estimation is reserved for the final sections of this study. Barring international disaster, of course, some positive suggestions can be made with a degree of certainty: the Internet will grow, it will replace other networks, and its technological performance indicators will double every two years. It is unlikely, however, that the conflicts thus deferred, avoided or subjected to contradictory standards will diminish in intensity until lasting, rational solutions and institutional changes are implemented. The Internet is now truly international, and as such the rights of netizens are now an international issue.

- - -

11,060 words. Last update March 27
    1. SECTION BIBLIOGRAPHY

John Perry Barlow, A Not Terribly Brief History of the Electronic Frontier Foundation, http://www.eff.org, 1990


Adam Gaffin, The EFF Guide to The Internet (v3.0), Electronic Frontier Foundation, http://www.eff.org, 1996


Michael Hauben, Ronda Hauben, Netizens: On the History and Impact of Usenet and the Internet, IEEE Computer Society Press, 1997


H.E. Hardy, The History of the Net, Master's Thesis, School of Communications, Grand Valley State University, Allendale, (v8.5), 1993


Brendan Kehoe, Zen and the Art of the Internet, DEET Communications Link Project, Murdoch University, 1992


Gordon Meyer, The Social Organization of the Computer Underground, Master's Thesis, Northern Illinois University


Bruce Sterling, The Hacker Crackdown: Law and Disorder on the Electronic Frontier, Penguin, 1994 [FP 1992]




Site scripted by Lev Lafayette. Last update August 1, 2003
