Chapter 3.5 Data Integration

"If a Nation expects to be ignorant and free in a state of civilization, it expects what never was and never will be..." Thomas Jefferson

3.5.1 Definition and History

Previous subchapters were concerned with critical issues of access to the means of data production, the moral limits of data expression, and the security of data expression. These components provide the necessary and sufficient conditions for technologically mediated communication and, as the preceding subchapters make clear, raise a number of complex issues themselves. However, technologically mediated communication also requires integration into society as a whole. By 'integration' is meant the bringing together of different technological traits in a harmonious fashion (from the Latin "integratio", a renewing, restoring), with the mathematical association with the integral (Latin integer, complete; Medieval Latin integralis, making up a whole). This general issue can be further divided into four comprehensive components, namely (i) pedagogy and public opinion formation, (ii) production and exchange of goods and services, (iii) technological standards and institutional status and (iv) social adaptability and dynamism.

In terms of narrative, the question of data integration does not fit neatly into the ontological narrative of the previous subchapters. Whereas individuals must have access to the technological means of production and reception prior to expressing and receiving content, and the expression and reception of content is ontologically prior to matters of the security of information, integration is both prior to technological access, in terms of intention and motivation, and subsequent to it, particularly in terms of unexpected or undeveloped issues that arise from the mass introduction of a new communications medium. As the ontologically prior matters have already been discussed in terms of the history and culture of the Internet, this subchapter concentrates on those issues relating to data integration which have arisen subsequent to mass introduction.

Data integration is considered a critical issue due to the qualitative changes that external regulation can have on the Internet and the effects that the Internet can have on social systems. Social systems engage in pedagogy and public opinion formation through structured institutions, reflexive labour and transaction costs are normally incorporated into the cost of a final product or service, technological standards are established through a combination of political, economic and scientific institutions, and social adaptation is normally tested through political action and lobbyists. The Internet fundamentally changes the common assumptions of data integration through a massive decentralization of power, means of production and accessibility, through globalization, and through the potential and relative speed of data distribution.

As with preceding subchapters, this definition and these issues are introduced through relevant historical examples, both modern and premodern. As established previously, even non-literate societies which utilize a natural rather than technologically mediated means of communication must still address matters of data integration, as cultic mythologies are expressed through signs and foci, although the lack of a technological medium means that agent decisions are not amplified. The concerns of pedagogy and public opinion formation, for example, are universal to all societies for system and lifeworld symbolic production and reproduction and may be demarcated between reflexive and non-reflexive approaches to theoretical and practical questions.

[Jurgen Habermas, Legitimation Crisis, pp14-17, for an elaboration on these approaches to learning]

Empirical examples from pre-modern societies initially include consideration of mythic Polynesian and Melanesian society, before moving on to illustrations from traditional Middle Eastern and Mediterranean societies and the schools of thought and practice of Confucianism, Christianity and Islam. Following this is a more detailed examination of data integration in modernity. In each of these cases the four major themes for discussion in relation to the Internet are identified as noted above. Of particular interest is how qualitative changes to the technological means of information and communication production framed the social possibilities for each particular social system and cultural lifeworld.

Ethnological data from Bronislaw Malinowski in the Trobriand Islands indicates a strong tie between the teaching of children and adult contributions to public discourse, both of which are tied to mythic narrative. These are structurally correlated with the establishment of technological standards and institutional procedures (such as the circulation of gifts) and the ceremonial presentation of agricultural techniques and even social change, although, as noted in the latter case, severe qualitative changes cause catastrophic modifications to the existing social order. In a similar manner, the research of Margaret Mead in Pacific cultures also notes primary social division along age and sex lines (e.g., tautala laitia in Samoa, or "presuming above one's age"), the inclusion of cooperative and competitive economic activity, and gendered cultural dynamics. In all these cases one can note the incorporation of these elements within a mythic narrative into a functional whole.

[ Bronislaw Malinowski, The Sexual Life of Savages in North-Western Melanesia: An Ethnographic Account of Courtship, Marriage and Family Life Among the Natives of the Trobriand Islands, 1929; Coral Gardens and Their Magic: A Study of the Methods of Tilling the Soil and of Agricultural Rites in the Trobriand Islands (2 volumes), 1935; The Dynamics of Culture Change, ed. P. M. Kaberry, 1945. Margaret Mead, Coming of Age in Samoa, 1928; Sex and Temperament in Three Primitive Societies, New York: William Morrow, 1935. ]

Of course, any empirical recollection of a mythic society must be considered in light of the changing nature of social mores. Contrary to conventional wisdom, mythic societies are very dynamic in their reproduction of cultural mores. Further, the sheer diversity of cultures makes it very difficult to establish universal features. The establishment of writing as a means of communication correlates with the social formation of traditional society and a religious mode of consciousness. The generation of surplus allows for the prospect of political rank as an additional means of social differentiation to those of age and sex as established in mythic societies. At this point in time confirmation can truly be given to the claim of Egyedi that technical standards are endogenous to technology and, more to the point in this instance, that the success of standards has an explicit correlation between the reflexivity of pedagogy, the openness of public opinion formation, economic productivity and commercial success.

[ Tineke Egyedi, Shaping Standardization: A study of standards processes and standard policies in the field of telematic services, Doctoral Thesis, Delft Technical University, 1996. ]

State enforcement of knowledge standards by a religious caste of rulers is typical of traditional societies. Historically, the earliest recorded introduction of technical standards related to units of measurement. Written numbers, it seems, preceded written words; provisional notation with place values was in use in Mesopotamia by 4000 BCE and developed in the Egyptian and Mayan civilisations some hundreds of years later, indicating their importance to economic and commercial measurement, although it must be acknowledged that archaeologists also report the use of tally inscriptions on bone, in groups of five, as old as 30,000 BCE. In terms of educational and public opinion institutions, evidence exists that in Ancient Egypt these were relegated to the political ranks of the ruling class and conducted in accordance with the "Books of Instruction". The strict system of education correlated with the strict system of political rank. The public sphere was minimal, if not non-existent. Similar experiences have been noted in Babylon and Assyria. The strength of the Egyptian system was not necessarily its rigidity, but the extensiveness of its institutional and systematic production of knowledge and its specialization of labour.

[ K. Krechmer, The Fundamental Nature of Standards: Economics Perspective, (EDIT URL); Jack Goody, "Alphabets and Writing", in Raymond Williams (ed.), Contact: Human Communication and Its History, Thames and Hudson, 1981, pp105-126; A. Leo Oppenheim, Ancient Mesopotamia: Portrait of a Dead Civilization, 2nd ed., Chicago: University of Chicago Press, 1977; James Bennett Pritchard (ed.), Ancient Near Eastern Texts, Princeton University Press, 1969. ]

Again with an explicitly conservative and system-maintaining orientation, the ancient Chinese dynasties provided the new innovation of group education without class-political prohibitions, with astounding success, alongside the establishment of a centralized bureaucracy arising from a number of feudal states (c. 400 BCE). The divine right of rulers was combined with the stratification and stability orientations of Confucian thought, which nonetheless tolerated competing philosophies, providing a stronger basis for social adaptability in the form of a nascent public sphere, combined with a meritocratic system of professional employment. Commercial standards with cross-cultural codification, and technological developments pursued in a concrete, non-abstract manner, ensured a system that was stable and adaptable, yet ultimately incapable of abstract mastery.

[ Joseph Needham (ed.), Science and Civilisation in China, Volume II: History of Scientific Thought, by Joseph Needham with Wang Ling, Cambridge University Press, 1956. ]

A strong comparison can be made between these centralized states and those of Ancient Greece, albeit with recognition of the latter's influence on the formation of the vast Roman Empire. Athenian Greece operated with a private education system, although the male children of nearly all citizens attended, with female children receiving some home education. In comparison, Sparta had a state-sponsored system, albeit with a narrower approach. The Roman system was state-sponsored and provided childhood (but not young adult) education for women. Despite the comments by Plato in The Republic, both Ancient Greek and Roman education included reflexive approaches to learning, which in turn helped to constantly regenerate their public spheres and the production and commerce of knowledge, and which remained relatively free from non-reflexive religious doctrine.

[ H. I. Marrou, A History of Education in Antiquity, trans. George Lamb, New York: Sheed and Ward, 1956; Stanley F. Bonner, Education in Ancient Rome, Berkeley and Los Angeles: University of California Press, 1977. ]

The same freedom cannot be attributed to medieval Christian or Islamic societies. Despite the familiar use of the public-private contrast from Roman law in the former, the ancient Germanic distinction between common and particular had greater usage, a distinction which also seems to have applied to medieval Islam. In both cases education was mediated through religious instructors and religious institutions, with Charlemagne declaring that cathedrals and monasteries were to provide male education, and the second caliph, Umar, appointing qassin for narration of the Quran and Hadith, a practice which eventually evolved into the maktab, which taught boys and girls. Strictly theoretical research was curtailed by non-theoretical limitations, especially in Christianity. In contrast, medieval Islam, especially through the Mutazilite school and under the multicultural Abbasid Caliphs of Baghdad, applied the concept of ijtihad - struggle, or exertion - to the scientific method, resulting in equivalent benefits to trade and technology.

[ Ira M. Lapidus, A History of Islamic Societies, 2nd ed., New York: Cambridge University Press, 2002; Richard W. Southern, The Making of the Middle Ages, Yale University Press, 1953. ]

The introduction of written text correlates with the introduction of a religious mode of consciousness and equivalent ruling social classes. The capacity of such traditional social systems to survive depended mainly on their ability to maintain internal stability, both in the education system and in class relations, although economic and scientific growth depended on more reflexive outlooks; growth is inherently destabilizing. With the rapid introduction of the movable type press in Europe in the fifteenth century, the foundations were laid for the religious reformation at a time when medieval Islam and Confucian China were in a period of extreme conservative stasis. Along with new debates over scripture and metaphysical interpretations, scientific and political reasoning also arose, posing revolutionary challenges which were most clearly expressed in the American, French and Russian modernist revolutions.

In the field of education, one notes the gradual development of free, public and secular education. From Rabelais onwards, the claim that education should be conducted in a spirit of free inquiry, unrestrained by religious censorship, has remained a thematic ideal, evident in Rousseau and Pestalozzi, although tempered by the advocacy of national education by Machiavelli and Luther. Faced with the challenge of the increasing complexity of industrial production, the United States of America deserves special mention, where free inquiry, secular education and mass democracy are expressed as functions of system stability in the work of John Dewey. Despite criticisms from more radical pedagogical theorists (e.g., Bruner, Freire, Illich, Vygotsky) - whose contributions will be analyzed later in this chapter - contemporary educational theory subscribes partially to the notion of free inquiry and partially to systemic requirements which are increasingly defined by business interests rather than by the state.

[ Francois Rabelais, Gargantua and Pantagruel, trans. Burton Raffel, W.W. Norton & Company, 1991, first published c1532-1562; John Dewey, Democracy and Education: An Introduction to the Philosophy of Education, Free Press, 1997, first published 1916. ]

A similar experience may be described in the realm of the public sphere in modernity. As carefully elaborated by Jurgen Habermas, the representative publicity of traditional society developed into a public sphere (and likewise a private sphere), importantly mediated through the print medium and literate individuals who engaged in debate of public processes, laws and morals, reaching a height in the 18th century. Subsequent to that, however, the public sphere became increasingly distorted, with a press increasingly concerned with commercial profit and ruling class compliance over rational-critical discussion, and a party political system increasingly truncating independent participation (to which must be added the dramatically declining ratio of representation). The public sphere, and civil society, faces internal contradictions with the commodification of private life and therefore the commercialization of public discussion, whilst at the same time granting increasing civil liberties and private rights.

[Jurgen Habermas, The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society, trans. Thomas Burger, MIT Press, 1991, first published 1962; Edward Herman and Noam Chomsky, Manufacturing Consent: The Political Economy of the Mass Media, Pantheon Books, 1988 ]

The production of intellectual goods and services, likewise released from the bonds of religious and doctrinal requirements and with qualitatively new production possibility frontiers in replication and distribution, has also developed with evident competing interests between state functionalism, democracy and commercialization. Much of this trajectory has already been discussed in the preceding section on the history of copyright, but it must also be considered in light of changes in education and public opinion formation, where structural affinities in institutional behaviour are notable. Whilst early modernist discourse suggested utopian ideals generated from the free circulation of ideas, which came into conflict with the doctrines of the ancien regime, the result of the conflict was not a thoroughly reflexive system of intellectual production but one that was distorted by class relations. Even the relatively free university system and institutional research bodies were heavily influenced by state interests and are now influenced by transnational corporations.

[Bill Readings, The University in Ruins, Harvard University Press, 1997]

This theme of the distorted ideal of modern data integration, through state and corporate interventions and procedural imperatives, becomes most explicit through an institutional evaluation. Data integration has never been provided an institutional subsystem or a procedural orientation in its own right that is sufficiently independent from either state commands or commodification mechanisms. At best, modernity has produced supranational expert bodies such as United Nations specialized agencies, special bodies of the General Assembly, or, in the case of the International Standards Organization, a non-government organization with substantial membership from the private sector, still operating on a ratio of one member per nation, and with a system of voluntary adoption of market-derived consensus standards for international use.

[ Jack Latimer, Friendship Among Equals: Recollections from ISO's First Fifty Years, International Standards Organization, 1997; Nils Brunsson, A World of Standards, Oxford University Press, 2000 ]

Even considering the developments of linotype, monotype and offset printing, the information and communicative content of movable type remained qualitatively the same. However, coinciding with the inclusion of universal adult suffrage, regardless of sex and class, in public opinion formation was the rise of new mass information media and the subsequent displacement of critical public opinion formation. Early electronic media, such as telegraph, radio, telephone, film and television, were not able to surpass print media in either the capacity to participate in information formation or in information retrieval.

Radio, despite the domination of receivers over transmitters, included systemic conditions and structural results akin to those of the computer underground culture. The telephone, with a very high level of communicative technology, suffered the same limitation as radio - a lack of information storage and retrieval. Film and television, due to capital equipment requirements, became the mainstay of a conservative capitalist mass information culture industry, and that of the equally, if not more conservative, state socialist regimes. Far more than any other technology, they lacked a communicative aspect, and the systematic resources poured into these industries indicated a paralysis of critical reasoning.

The relationship between modern information technology and data integration largely has to do with the dominance of print media. The collapse of elite political rank through mass literacy has forced competing ideals of universalism and nationalism, with internal crises generated through class differentiation. Current trends towards neoliberalism are indicative of political paralysis rather than a resolution orientation. The following examination of critical issues that have arisen with Internet technologies serves to elucidate this paralysis in data integration, prior to proposing a new systematic means appropriate to the changes in the means of production.

3.5.2 Critical Issues on Data Integration

Pedagogy and Public Opinion Formation

Early liberal-democratic analysis of the introduction of Internet technologies to childhood and young adult education invariably amounts to a positive interpretation of the benefits of the technology and the prospects of marketability of educational products, tempered by existing inequalities of access. A similar approach has been taken to the formation of public opinion and the mass media. Since then, apparently unexpected problems have arisen which have widened the debate on these matters in a more critical manner, such as the wide prospect for plagiarism with search tools as the Internet replaces the library as the main source of information, the lack of income-earning potential of online educational material, and concerns over the debasement of standards of reporting (and public opinion formation) as the institutional mass media is replaced by non-institutional or semi-professional reporting.

["The Internet in the Classroom", pp185-201 and "The Market for Educational Applications of the Internet", pp203-218" in The Harvard Conference on The Internet and Society, O'Reilly and Associates, Harvard University Press, 1997) c.f., The Internet and Education: Findings of the Pew Internet & American Life Project, (Washington, Pew Internet & American Life Project, Sept. 2001), c.f., Mehmet Arslan Lutfi, Elevating the Standards of Journalism Through the Internet: The Impact of Online Media Watchdogs and a Case Study of Medyakronik, Master of Arts thesis, Georegtown University, 2002 ]

For the purposes of this discussion, concerns of inequality of access are largely screened out, as they have been dealt with in the subchapter entitled 'Data Access', except to recognise that the repetition of existing structural inequalities causes greater threats to system stability due to technological amplification. Rather, the main concern here is to examine the potential systemic limits to education and public opinion formation within the environment of the Internet. Specific concerns include the increasing involvement of corporations in education, including early childhood, young adult and tertiary levels and research institutes; the establishment of online public libraries; the formation of public opinion and public debate online; and institutional attempts to establish "Internet democracy" or, at the very least, consultative forums that have direct input into the political system.

The general intention of modern education has already been elaborated. As an ideal, it aims to provide free, public and secular inquiry. In reality, it has been strongly influenced by the remnants of traditional-religious orientations, class inequalities, and state functionalism. In advanced economies, commercial corporations are taking increasing roles that correlate with a proportional decrease in the application of education as a state-mediated social welfare function. In this sense, the use of commercial and proprietary computer software and hardware is hardly unusual. When Apple developed the first personal computers it was a very short space of time before they appeared in specialist computer classes in secondary schools. Such provision is highly beneficial to the commercial provider - having a workforce already developed with a particular knowledge base, to the exclusion of competitors - and is considered sufficiently advantageous that computer hardware and software providers usually offer very large discounts to educational institutions.

As computer usage becomes a necessity in advanced knowledge economies, the economic and educational stakes are substantially raised, with substantial use in childhood as well as young adult education. Grave concerns are raised over the prospect of providing educational skills that are non-generic and proprietary, even if current market conditions are favourable. For example, in 2001 Microsoft, facing 100 outstanding private antitrust cases, offered $1 billion in hardware, software and training to some 12,500 US schools. Recognising that the software valuation was vastly overstated and insignificant to Microsoft's accounts, Red Hat - a supplier and support company for the open source Linux operating system - recommended that Microsoft provide the hardware to schools while Red Hat would provide the software free of charge. It was also noted that the Microsoft proposal required schools to pay licensing fees after five years, whereas Linux has no such time limit. Ironically, the Microsoft proposal occurred in the same year that the software giant launched an investigative audit of Philadelphia's public school system for valid licenses.

[ Joe Wilcox, "Microsoft cuts another antitrust deal", CNET News.com, November 20, 2001, http://news.com.com/2100-1001_3-276058.html?tag=st_rn; Matt Loney, "Critics take aim at Microsoft schools deal", CNET, November 27, 2001, http://news.com.com/2100-1001-276216.html?legacy=cnet; Michael Kanellos and Joe Wilcox, "Deal may put Microsoft at head of the class", November 21, 2001, http://news.com.com/2100-1040_3-276109.html?tag=st_rn; Damien Cave, "Microsoft to schools: Give us your lunch money!", July 10, 2001, http://dir.salon.com/tech/feature/2001/07/10/microsoft_school/index.html ]

The privatisation and corporatisation of education is not just a factor in primary and secondary education. The most advanced computer mediated communication research consortium, Internet2, combines over 200 universities and 60 companies in a model of business welfare whereby private industry is effectively subsidised in the creation of future computer technologies. Whilst Internet2 is not designed as a separate network, and claims continuity with email and web technologies, its stated key goal is the "diffusion of advanced Internet technology, particularly in the commercial sector". The objective is to ensure the supremacy of United States capital through interconnection, collaboration and a transfer from the education community to the commercial marketplace. The main technological investment at this point in time is the development of digital video and the remote control of instruments. To its credit, however, Internet2 seems mainly orientated towards hardware (e.g., the Abilene Backbone Network) and interoperability between public standards and commercial products.

[ FAQs about Internet2 http://www.internet2.edu/html/faqs.html ]

The development of online public libraries provides a mediating point between computer-mediated pedagogy and a computer-mediated public sphere. Along with the provision of local and historical content by regional libraries on their own sites, general examples include the Internet Public Library, Project Gutenberg and Project Gutenberg (Australia), the University of Texas' eBook site (some 50,000 titles), MIT's CogNet, and the U.S. National Digital Library. These are just a sampling of the quantity of available online texts which are considered to have literary merit. The Alex Catalogue, for example, defines this as meaning "literature withstanding the test of time and found in authoritative reference works like the Oxford Companions or the Norton Anthologies".

The Internet Public Library, hosted by the School of Information and Library Studies at the University of Michigan and with international mirror sites, mainly operates as a series of links to reputable resource materials. In comparison, Project Gutenberg, initiated by Michael Hart at the University of Illinois in 1971, provides about 10,000 texts in plain ASCII format. Due to copyright restrictions, Project Gutenberg only provides those texts which are in the public domain and concentrates on literary and reference publications. Project Gutenberg (Australia) has a larger range due to a more liberal copyright regime. The U.S. National Digital Library includes a congressional database, images from the Library of Congress collections, and the Networked Digital Library of Theses and Dissertations. Finally, MIT's CogNet offers over four hundred MIT Press books and seven journals in the cognitive and brain science disciplines.

[ The Internet Public Library is located at http://ipl.sils.umich.edu/ and mirrored at http://www.ipl.org.ar/ (Universidad de Belgrano, Buenos Aires, Argentina) and http://ipl.ulis.ac.jp/ (University of Library and Information Science, Tsukuba, Ibaraki, Japan); Project Gutenberg http://www.pg.net; University of Texas eBook http://www.lib.utexas.edu/books/etext.html; The Alex Catalogue http://www.infomotions.com/alex/; The Avalon Project http://www.yale.edu/lawweb/avalon/avalon.htm; Networked Digital Library of Theses and Dissertations http://www.ndltd.org/; MIT CogNet http://cognet.mit.edu ]

If the corporatisation of education and the teaching of proprietary applications and operating systems, rather than public and general standards of the new information technologies, causes concern, then even greater concern needs to be raised over pedagogical issues. Whilst the mass introduction of computer technologies to the education sector has seen the initial emergence of computer-assisted instruction, and in particular experimentation in multimedia education and distance education, pedagogical theory and practice has not kept pace with the technology. Indeed, in many ways the application of neoliberal political and economic theory to pedagogical practices is resulting in a command-control cybernetic model which is often contrary to the emancipatory potential of the technology itself, particularly with regard to distributed learning. Instead, excessive concern is placed on breaches of copyright and plagiarism, rather than emphasizing 'fair use' strategies in the former case and alternative pedagogical strategies in the latter - a necessity, as one report notes that some 20% of physics students at the University of Virginia had plagiarised their work.

[ Sosteric, Mike. (1999). Endowing Mediocrity: Neoliberalism, Information Technology, and the Decline of Radical Pedagogy. Radical Pedagogy: 1, 1. http://radicalpedagogy.icaap.org/content/issue1_1/sosteric.html Herb Thompson (2001), Cybersystemic Learning, Radical Pedagogy 3, 2 http://radicalpedagogy.icaap.org/content/issue3_/thompson.html e.g., Jill Suarez, Allison Martin (2001). Internet plagiarism: A teacher's combat guide. Contemporary Issues in Technology and Teacher Education [Online serial], 1 (4) . Available: http://www.citejournal.org/vol1/iss4/currentpractice/article2.htm Fighting Fire with Fire: Using the Internet to Reduce Electronic Plagiarism http://www.glencoe.com/sec/teachingtoday/educationupclose.phtml/19 February 2004 ]

If the Internet is making substantial changes in the educational system, albeit led by students rather than by existing and emergent institutions, then a similar effect is also being witnessed in public media and public opinion formation. Current surveys of Internet use suggest such a change, although they are far from conclusive at this stage. The German and Austrian group Seven One Media noted a 74% increase in Internet time-use in 2001 over the previous year and, while observing that the Internet had moved to fourth place in time-use among all media (after radio, television and books), did not record any reduction in time-use by other media and indeed noted a slight increase in radio usage. A qualitative study by the US-based Lycra Research group suggests that as user familiarity with Internet technologies increases, usage of other media sources decreases. This accords with the results of the first national (US) Internet study by Scarborough Research, which suggested a drop in television viewing and magazine and newspaper readership as Internet use increased.

[ Europemedia.net, "German traditional media not hurt by increase in internet use", 28/11/2001, http://www.europemedia.net/shownews.asp?ArticleID=6976; Michael Pastore, "Internet Becoming Preferred Information Source", May 10, 2001, http://cyberatlas.internet.com/big_picture/demographics/article/0,,5901_762881,00.html ]

Whilst the initial chapter of this study emphasized the use of the network news transfer protocol (NNTP), usenet and email lists, mention must also be made of commercial and web-based mailing lists such as Yahoo! groups (some 90,000 by November 2003), Topica, and Livejournal communities, as well as online newspapers and journals which encourage discussion threads (most famously, Slashdot). There are particular concerns that the original Internet public sphere, usenet, appears to be in decline after many years of expansion, partially due to the range of alternatives and partially due to email harvesting by direct mail agents on such groups and the posting of commercial advertisements, especially those which are considered to be "off-topic". Such commercial intrusions have also arisen among other news providers. Nevertheless, such groups are still the Internet equivalent of the public sphere, "a domain of life in which such a thing as public opinion can be formed". It is open to all participants and may include discussion on all topics, including the affairs of government and the state. Whilst it is evident that this public sphere is not yet an "ideal speech situation", it is certainly more so than any other currently available medium.

[ The recent decline in usenet is evident from studies using Microsoft's Netscan http://netscan.research.microsoft.com/Static/about/; Jurgen Habermas, The Structural Transformation of the Public Sphere, p38; Mark Poster, CyberDemocracy: Internet and the Public Sphere, University of California, 1995, http://www.hnet.uci.edu/mposter/writings/democ.html. Poster makes the remarkably erroneous claim that the public sphere is necessarily (rather than preferably) an ideal speech situation: "For Habermas, the public sphere is a homogeneous space of embodied subjects in symmetrical relations, pursuing consensus through the critique of arguments and the presentation of validity claims. This model, I contend, is systematically denied in the arenas of electronic politics. We are advised then to abandon Habermas' concept of the public sphere in assessing the Internet as a political domain." ]

Pedagogy and public opinion formation are substantial characteristics of the Internet. However, they have dramatically failed to reach their potential and face the prospect of encroaching colonisation through the privatisation and commodification of knowledge, the lack of an institutional base or legislative protection, and a lack of administration and standardisation. The stubborn survival of the Internet public sphere, and in some instances its growth, is indicative of widespread popular support - people do want a public sphere. The relentless failure of the political, economic and educational subsystems to incorporate and integrate existing and potential knowledge resources and utilities from the Internet indicates a gulf between the needs and desires of the population and these subsystems. This disparity will grow whilst the institutional base and procedural orientation of these subsystems continues with an invalid telos of privatised and commodified information.

Production and Exchange of Goods and Services

In strict economic science, the purpose of the Internet is to lower costs (especially transaction costs), to reduce resource usage, to increase the efficiency of reflexive labour, to increase productivity, and to provide a data resource by which bureaucratic planning is more capable of directing production so that ex ante assessments more closely match a posteriori evaluations, along with the provision of, if not "perfect information" (an assumption of microeconomic decision making), then "improved information", in the sense of being more free, more complete and more universal than pre-Internet information. Combined, these effects should mean greater organizational profits, both individually and generally. Numerous examples have been given, however, which indicate that commercial behaviour is not always the most economically rational. Individual interests, especially with the lure of "rent-seeking" and monopolistic behaviour, often engage in activities which substantially benefit the individual organization but impoverish the general wealth and the capacity of the economic system as a whole to reduce costs. The privatisation and commodification of information production and the education system noted previously is one extremely important example. Under this theme, other critical issues for the use of the Internet in the economy are also reviewed, including the problems and costs associated with unsolicited commercial email, the Microsoft anti-monopoly case (with reference to other corporate monopolistic behaviour), and finally the economics of open source software.

Before discussing these critical issues, mention must be made of the economic benefits of the Internet, which are extremely significant, continuing in development, and organizationally transformative. These include the remote control of various elements of production, such as the networked global ordering and manufacturing system used by the apparel company Benetton. In a similar vein, systems with centralized physical storage (and invariably much reduced overheads compared to retail outlets) serve distributed consumer demand, most notably used by the book and music retailer Amazon (and now even by grocery stores). Catering for individuals and organizations, the virtual marketplace of eBay, with its associated pseudo-bank PayPal, has provided a more personal window onto the economic opportunities of globalisation. Internet banking, Internet payment of utilities, Internet purchases and sales of goods and services - all have vastly enhanced consumer opportunities, reduced transaction costs and increased the possible market distance between consumer and producer.

[ Junko Yoshida, "Clothier Benetton adopts Philips' RFID technology for 'smart' labels", EE Times, March 12, 2003, http://www.eetimes.com/sys/news/OEG20030311S0028; Andrés Guadamuz González (University of Edinburgh), "PayPal and eBay: The legal implications of the C2C electronic commerce model", 18th BILETA Conference: Controlling Information in the Online Environment, April 2003, QMW, London, www.bileta.ac.uk/03papers/Guadamuz.htm ]

Two other economically beneficial gains from the implementation of the Internet that are often criticised (but, in this inquiry, not deemed critical) are the globalisation of reflexive labour and widespread data mining by marketing firms. In the former instance, the corporation, seeking to reduce its variable capital (i.e., labour) costs, seeks the least expensive workforce (the lowest level of wages and conditions) relative to the quality of production. In the production of a variety of information services this has meant that a number of employment positions are being transferred from advanced nations to developing nations which have an educated population. Not surprisingly, such transfers threaten the job security of workers in advanced nations and, appropriately, raise the ire of labour unions who represent the sectional interests of such workers. There is a general inevitability that any corporation will always seek to reduce its labour costs, as any unionist (or any economist) should know. The technological reality is that the rate of transfer of information processing employment will tend to correspond to the extent that the Internet is distributed. The onus is on the unions to globalize as capital has globalized. Rather than seeing workers in impoverished countries as a threat, they must see them as an opportunity for growth and a necessity for international collective bargaining.

In the case of data mining by marketing firms, such bodies seek to improve the correlation between evaluations of consumer needs and wants and actual production and distribution. Carried out successfully, such data mining radically reduces the degree of resource misallocation and thus assists in the overall and general reduction of production costs. The Internet provides exciting opportunities for marketing organizations, as it is far easier to correlate the behaviour of an Internet user (web sites visited, time spent on each site, etc.) than to assess alternative media (e.g., television viewing). Concerns, however, are invariably raised by consumers who dislike their electronic "footprint" being utilised by in-house commercial marketing agents or, for that matter, sold to third parties, particularly when concerned with personal and sensitive purchases and interests. Whilst the misuse of cookies by marketing agents has already been discussed in the previous section, marketing information requires a minimum of government regulation as a backup to non-government solutions, such as the independent non-profit organizations TRUSTe (established by the Electronic Frontier Foundation and CommerceNet), the Platform for Privacy Preferences (P3P, established by the World Wide Web Consortium building on the Open Profiling Standard), and the commercial organization VeriSign. The advantage of these systems, as opposed to a heavy degree of government regulation, is that they are "transforming passive consumers into active consumers who can (through third parties) monitor vendor practises for themselves".

[ An example of market segmentation according to web use is available from Booz-Allen and Nielsen/NetRatings. Interestingly, those most susceptible to advertising are those who treat websites in a low-interaction manner (web 'surfing'). Those who are more strongly a part of the Internet culture, according to immersion and interactivity, are most resistant to advertising. http://cyberatlas.internet.com/big_picture/traffic_patterns/article/0,1323,5931_731421,00.html In 2001, a 17.1 percent fall in online advertising was recorded by the Internet research company NUA. http://www.nua.com/surveys/index.cgi?f=VS&art_id=905357527&rel=true Esther Dyson, Release 2.1: A Design for Living in the Digital Age, Penguin, 1998, p253. Dyson provides a summary description of TRUSTe, P3P and VeriSign on pages 254-264 ]
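To make the kind of behavioural correlation described above concrete, the sketch below aggregates a hypothetical clickstream log by cookie identifier to estimate time spent per site for each user. The log format, field names and figures are invented for illustration only and do not describe any particular vendor's system.

```python
from collections import defaultdict

# Hypothetical clickstream records: (cookie_id, site, seconds_on_page).
clickstream = [
    ("abc123", "news.example.com", 120),
    ("abc123", "shop.example.com", 300),
    ("xyz789", "news.example.com", 45),
    ("abc123", "shop.example.com", 60),
]

def profile_by_cookie(records):
    """Aggregate total seconds per site for each cookie identifier."""
    profiles = defaultdict(lambda: defaultdict(int))
    for cookie_id, site, seconds in records:
        profiles[cookie_id][site] += seconds
    return profiles

# Each profile is the sort of behavioural summary a marketing agent
# might correlate with purchasing data or sell to a third party.
for cookie_id, sites in profile_by_cookie(clickstream).items():
    print(cookie_id, dict(sites))
```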

One aspect of marketing that has become a critical issue is the impact of unsolicited commercial email, or spam. In Australia, according to the National Office for the Information Economy, it now takes up 20 percent of all email and can cost up to $960 per worker per year. According to an AC Nielsen computer survey, it is considered by Internet users to be the greatest Internet problem, outpolling viruses and credit card protection. Internet user groups have - since the Canter and Siegel 'Green Card' affair - long been aware of the problems of unsolicited commercial email. The Coalition Against Unsolicited Commercial Email (CAUCE), with chapters in the United States, Europe, Canada, Australia and India, has long sought a legal solution to such email. As the production cost of spam is extremely low, with most of the cost borne by the recipient (storage costs, time), it is extremely likely that the problem will increase rather than decrease.

However, targeting the senders of unsolicited commercial email has problems. The US experience of the legal path indicates the degree to which sectional business interests are able to influence legislation. One recommended solution is a sender-pays model, where the sender is charged a fee (after warning) according to pricing regimes established by the receiver (who, one would hope, would maintain a 'free list' for friends). This method also has the advantage of placing demand on marketing agencies to target potential consumer audiences appropriately. A major flaw in this "consumer empowering" model, however, is that many 'spam' emails have irrelevant return addresses, redirecting the potential consumer within the message itself. Sender-pays software would send an email to the return address with the costing offer, which would invariably bounce and thus consume more bandwidth.

[ In 1997 CAUCE proposed an amendment to the US Federal statute that makes junk facsimiles illegal to also include email. The House of Representatives took out key elements from a bill that gave consumers the right to take action against parties sending out bulk email. Members of the Judiciary Committee removed the right for consumers to sue companies that failed to take their details off bulk email lists. However, typical of contemporary politics in the United States, provisions were added to the bill to ensure that any companies advertising adult content had to clearly label email as such, and ISPs were provided the right to sue. The Age, 'Spam, the Plague giving the Net indigestion', Josh Gordon, August 2, 2002; James Evans, ITworld.com, 05/24/01 ]
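A minimal sketch of the receiver-side pricing logic in the sender-pays model described above might look as follows. The fee, the free list and the handling of known bouncing return addresses are assumptions for illustration only, not a description of any deployed system.

```python
from typing import Optional

# Hypothetical receiver-side pricing policy for a sender-pays model.
FREE_LIST = {"friend@example.org", "family@example.org"}
KNOWN_BOUNCING = {"forged@spammer.example"}  # return addresses already seen to bounce
DEFAULT_FEE = 0.05  # assumed fee per unsolicited message, in dollars

def quote_fee(sender_address: str) -> Optional[float]:
    """Return the fee to quote to a sender, or None if no quote should be sent."""
    if sender_address in FREE_LIST:
        return 0.0      # friends and known correspondents pay nothing
    if sender_address in KNOWN_BOUNCING:
        return None     # quoting a forged address only bounces and wastes bandwidth
    return DEFAULT_FEE  # everyone else is quoted the receiver's price

print(quote_fee("friend@example.org"))      # 0.0
print(quote_fee("bulk@marketing.example"))  # 0.05
print(quote_fee("forged@spammer.example"))  # None
```

The last case illustrates the flaw noted above: where the return address is forged, the model can only decline to quote, and the cost of detection still falls on the receiver.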

Microsoft, in just over twenty years, rose to become the largest corporate concentration of capital of all U.S. companies, valued at over 261 billion USD as of September 1998. Critics have claimed that this is at least in part due to unfair business practices and monopolistic behaviour, which have led to a number of U.S. state and Federal lawsuits as well as those initiated by other software companies. In a strict sense, Microsoft is not a monopoly. Other software companies exist providing similar products (e.g., operating systems, applications, etc.) and there are no obvious advantages in Microsoft's scale that lead a single firm to produce output at a lower average cost. There are, however, significant barriers to entry, given the scale of Microsoft's installation base on the client market and the profits achieved per unit sale. In the past, the United States government considered such barriers to entry and vertical integration sufficient to justify the enforced separation of AT&T into the regional Bell Operating Companies. Even if a company did not provide special information to its own developers, the suspicion that it might do so inhibits competition.

[ "Timeline: Microsoft legal wrangles", BBC News, November 5, 1999, http://news.bbc.co.uk/1/hi/special_report/1998/04/98/microsoft/506492.stm ]

In 1990, the Federal Trade Commission began an investigation into Microsoft with claims that it engaged in the provision of special information to its application developers, that Original Equipment Manufacturers (OEMs) were required to license MS applications as well as the operating system, and that Microsoft licensed its operating systems to OEMs on a per-processor basis and with long-term licensing agreements. In 1993 the Department of Justice took over the investigation, and in 1994 Netscape Communications released the Navigator web browser, which soon became market leader. In late 1995, Microsoft released Internet Explorer with an OEM agreement requiring that PC manufacturers include IE. A U.S. court order required Microsoft to lift these licensing terms and, in October 1997, the Department of Justice sought a $1 million USD per day fine for ignoring this order. In response, Microsoft released an obsolete version of MS-Windows 95 without Internet Explorer. In May 1998, the US Department of Justice and 20 US state governments launched anti-trust suits against Microsoft, which opened in October 1998. Testimony from members of Netscape, AOL, Apple Computers, Intel, IBM and Sun Microsystems suggested anticompetitive and predatory practices. In November 1999 Judge Thomas Penfield Jackson ruled that Microsoft wielded monopoly power and used it to harm consumers, rivals, and other companies. This led to government attorneys calling for a breakup of Microsoft, which Judge Jackson agreed to in June 2000. However, Microsoft lodged an appeal amidst claims that Jackson was not impartial in his ruling. This was sustained by the appeals court and, by September 2001, government attorneys were seeking a settlement with Microsoft that did not include the breakup of the company.

[ Stanley Sporkin, Ruling Decision, United States of America v. Microsoft Corporation, Civil Action No. 94-1564, decided February 14, 1995, http://www.vanderbilt.edu/Owen/froeb/antitrust/cases/microsoft/sporkin.html; US District Court, United States et al v. Microsoft Corporation, Civil Action No. 98-1233 (TPJ), http://www.usdoj.gov/atr/cases/f3800/msjudgex.htm and http://usvms.gpo.gov/ms-final2.html; Joe Wilcox and Scott Ard, "Appeals court: Don't break up Microsoft", CNET News.com, June 28, 2001, http://news.com.com/2100-1001-269179.html?legacy=cnet; http://www.lib.umich.edu/govdocs/dn00/dn00cour.html ]

Accusations of monopolistic behaviour among computer hardware and software corporations are not new. DeLamarter's allegations of IBM's monopolistic behaviour followed thirteen years of investigation (1969-1982) by the U.S. government until the case was dropped by the Reagan administration. IBM had previously been under investigation for monopolistic behaviour, which resulted in the 1956 consent decree. Likewise, the microprocessor manufacturer Intel has also been accused of monopolistic behaviour and patent infringement. Intel's total processor monopoly was only broken due to a cross-licensing agreement with Advanced Micro Devices - a matter which Intel did not allow to occur of its own volition. As a further comparison, for years Apple held its hardware and operating system as tightly held in-house secrets, fearing a wave of clones similar to those of the Apple II. This was only relaxed with the formation of the PowerPC-based Common Hardware Reference Platform and, most recently, with Darwin, the core of Apple's OS X, which is based on the open source FreeBSD and Mach 3.0 operating systems.

[ Richard Thomas DeLamarter, Big Blue: IBM's Use and Abuse of Power, Macmillan, 1986; the 1956 consent decree (Civil Action No. 72-344 in the Southern District of New York) can be found in Appendix I, pp349-341; Carl Shapiro (Haas School of Business, University of California, Berkeley), "The FTC's Challenge to Intel's Cross-Licensing Practices", Working Papers in Economics, http://econwpa.wustl.edu/eprints/dev/papers/0303/0303006.abs; Justin P. Johnson, David P. Myatt, "Multiproduct Quality Competition: Fighting Brands and Product Pruning", Oxford University Department of Economics Discussion Paper 105, Oxford University, June 2002; PowerPC Platform FAQ http://www.mug.jhmi.edu/mirrors/InfoAlley/0896/06/powerpc.html; Apple's Open Source FAQ http://developer.apple.com/darwin/ps-faq.html ]

The tendency of computer mediated communication hardware and software towards standardization conflicts with the proprietary and closed software standards of corporate commercialization. Thus, there is a natural tendency towards monopoly and monopolistic behaviour across a number of computer markets and the industry as a whole. This has led many professionals to advocate the use of open source software, of which economists have made initial studies. Stephan Kooths, from the Muenster Institute of Computational Economics, claims that open source software does not create any value-added potential and is market destructive ("no price = no market"). Free availability, to Kooths, is a threat to profits, income, jobs and taxation revenue. Other authors (Lerner and Tirole) consider altruism and status to be motivational factors, whereas some take the perspective of scientific management (Mockus et al.), or view open source as a deliberate strategy by firms to commodify complementary products and undermine monopolies. In comparison, Hawkins considers open source software as a quasi-public good. In a strict sense, whilst all players can benefit from the provision of the good, no individual entity receives sufficient benefit to produce the good. However, in software development maintenance costs are substantial; thus it is actually more efficient to open the code to public scrutiny.

[ Stephan Kooths, The Economics of Open Source Software: Prospects, Pitfalls and Prospects, Microsoft Research Cambridge lecture, January 15, 2004, http://mice.uni-muenster.de/download/mice_pr-kooths_OSS_MSR-Cambridge2004.pdf; Josh Lerner, Jean Tirole, "Some Simple Economics of Open Source", Journal of Industrial Economics, 50, pp197-234, 2002, www.people.hbs.edu/jlerner/simple.pdf; Audris Mockus, Roy T. Fielding, James Herbsleb, "A Case Study of Open Source Software Development: The Apache Server", Proceedings of the 22nd International Conference on Software Engineering, pp263-272, 2000, opensource.mit.edu/papers/mockusapache.pdf; Richard E. Hawkins, The Economics of Open Source Software, submission to Computational Economics, December 2003, slytherin.ds.psu.edu/hawk/research/opensource/opensource.pdf ]

A critical issue is the transferral of the open source model from its particular use in software to a general application for the production of information goods and services. Applied with increasing universality, the increasing practice of reducing the financial cost of complementary goods ensures an increasingly dynamic market typology. This requires substantial organizational change, not only in information technology industries, but in all production and service provision that is dependent on information. Initial research in this field strongly indicates the need for "flatter" hierarchies and high levels of self-organization, emphasizing professional associations and "club-like" models whose product is ultimately self-regulation as well. Eric Raymond's highly influential essay, "The Cathedral and the Bazaar", emphasizes this point, but also points out that open source development requires a code base (even if poorly coded and buggy) or a "plausible promise" in order to build a development community. This, of course, necessitates a different organizational model in the first instance - as is evident in the initial development of the Internet, in the connections between the Free Software Foundation and MIT, and in the role of the University of Helsinki during the initial stages of Linus Torvalds' Linux project.

[ Giampaolo Garzarelli, Open Source Software and the Economics of Organization, Universita' degli Studi di Roma "La Sapienza", Dipartimento di Teoria Economica e Metodi Quantitativi per le Scelte Politiche, 2002, opensource.mit.edu/papers/garzarelli.pdf; Eric Raymond, The Cathedral and the Bazaar, First Monday, http://www.firstmonday.dk/issues/issue3_3/raymond/; Nikolai Bezroukov, A Second Look at the Cathedral and the Bazaar, First Monday, http://www.firstmonday.dk/issues/issue4_12/bezroukov/ ]

Technological Standards and Institutional Status

The establishment of technical standards for the Internet and the institutional status of the Internet are key critical issues for the integration of data in contemporary society. These are by no means simple associations, as there are multiple standards organizations and multiple institutional bodies with highly diverse levels of authority and jurisdiction. The peak international standards body, the ISO, is a voluntary non-treaty organization which includes a range of national standards organizations (such as ANSI (US), BSI (UK), AFNOR (France), DIN (Germany), etc.). Within the telecommunications world, the UN specialised agency the International Telecommunication Union has its origins in the International Telegraph Union of 1865, the International Radiotelegraph Convention of 1906 and, finally, the International Telecommunication Union signed at Madrid in 1932. The ITU became a UN specialised agency by agreement with the Economic and Social Council, approved by the General Assembly on November 15, 1947.

As mentioned in section 1.3, Definition and Development of the Internet, the key technical standard of the Internet is the TCP/IP suite of protocols. The major governing institutions are the Internet Architecture Board (IAB), the Internet Engineering Task Force (IETF), the Internet Assigned Numbers Authority (IANA) and the Internet Corporation for Assigned Names and Numbers (ICANN). Within the domain of Internet services themselves, governance may be carried out over specific services by regional legislatures (especially that of the United States), by local systems administrators and the policies of hosting organizations, by the moderation of discussion groups, whether usenet-based or through some other service (moderated mailing lists, moderated web boards), and by voluntary standards organizations (e.g., the World Wide Web Consortium). Discussions have also recently been carried out under the suggestion that the Internet domain name system authority (ICANN) and other aspects of Internet governance be overseen by an international authority. In a press release, the Secretary General of the United Nations, Kofi Annan, said that the working group for the World Summit on the Information Society would be "developing a working definition of Internet governance, identifying the relevant public policy issues, and developing a common understanding of the respective roles and responsibilities of governments, international organizations, the private sector and civil society".

[ United Nations Press Release PI/1560, August 3, 2004 http://www.un.org/News/Press/docs/2004/pi1560.doc.htm See also; World Summit on the Information Society website: http://www.itu.int/wsis/ ]

It would be highly inaccurate to presuppose that these competing technical and institutional authorities come to equivalent or even necessarily compatible decisions. Whilst the International Standards Organization recommended the use of a seven-layer Open Systems Interconnection (OSI) model for networks, the pre-existing four-layer TCP/IP model has achieved clear dominance (albeit with a high degree of compatibility). A case study of the development of IEEE standard 802/ISO 8802 is particularly illustrative of the clash of institutionalised powers, technical needs and the role of compromise over consensus. The most widespread standard for LANs is Ethernet, a product of the research of Metcalfe and Boggs at the Xerox Corporation, and so quickly adopted by many companies that Intel built a single-chip controller. This differed, however, from a proposal by General Motors for a standard LAN based on a token-bus network. Meanwhile, IBM announced that it too had developed a LAN that ought to be considered as the standard, based on token ring technology developed at its Zurich laboratories.

The IEEE committee charged with determining LAN standards thus found itself facing three proposed standards, each with particular technical advantages, proposed uses and institutional backing. The Ethernet standard was supported by DEC, Xerox and Intel and was well suited to administrative applications. The Token Bus standard was backed by General Motors and supporters of factory automation. Finally, the de facto power of IBM, its technicians and its customer base supporting the Token Ring standard also made a potent case. Eventually, after many months of debate and with no consensus or even majority decision deemed likely, the IEEE determined to accept all three (on the reasoning that three standards are better than none), now known as IEEE 802.3 (Ethernet), 802.4 (token bus) and 802.5 (token ring).

[ Andrew S. Tanenbaum, Computer Networks (2nd edition), Prentice-Hall, 1989, pp36-38 ]

Most Internet standards have fared somewhat better, probably because of the lack of intensive commercial competition. The Internet Engineering Task Force codifies most standards and recommendations, which are considered by the Internet Engineering Steering Group, with appeals directed to the Internet Architecture Board; they are announced by the Internet Society, with the Network Working Group's Request for Comments series organizing the standards in their final form. Using IETF standards as a baseline, the Internet Mail Consortium includes additional protocols which are considered "standard" by software developers. For example, the consortium has over the past several years promoted and developed vCard and vCalendar for personal data interchange.

[ Network Working Group, RFC 2026, The Internet Standards Process - Revision 3, S. Bradner, Harvard University, October 1996 http://www.ietf.org/rfc/rfc2026.txt Internet Mail Consortium, www.imc.org ]

One particular area of concern has been the management of the Domain Name System, and in particular its management by the Internet Corporation for Assigned Names and Numbers (ICANN) following designation by the U.S. Department of Commerce in November 1998. On an institutional level, many question why a private nonprofit organization under the authority of the attorney general of a U.S. state, completely dominated by corporate interest groups (individual domain holders may not become members) and with meetings closed to the general public, should be, in practical terms, the most important institution for a technologically mediated service, allocating address space and protocol parameters and managing the root server system on a global level. Particular concerns were raised by Michael Froomkin, David Post and David Farber, who created ICANNwatch. Specific concerns, apart from the national-institutional issue, include competition between domain name registrars, ICANN's "Uniform Dispute Resolution Policy", a lack of internal democracy and a lack of institutional accountability. In a social and political theory sense, David Post's comments on ICANN's lack of the "consent of the governed" raise serious issues about the legitimacy of the organization.

[ A. Michael Froomkin, Wrong Turn in Cyberspace: Using ICANN to Route Around the APA and the Constitution, 50 Duke Law Journal 17 (2000), available as a .pdf file at http://personal.law.miami.edu/~froomkin/articles/icann.pdf and at http://personal.law.miami.edu/~froomkin/articles/icann-main.htm David Post, The "Unsettled Paradox": The Internet, the State, and the Consent of the Governed, 5 Indiana Journal of Global Legal Studies 521 (1998) http://www.temple.edu/lawschool/dpost/Sov.html ]
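The practical reach of ICANN's remit can be illustrated with a minimal sketch (an illustration only, not drawn from the cited sources; the domain below is an assumed placeholder): every ordinary name lookup ultimately depends on the root and top-level-domain delegation that ICANN administers, even though the client only ever speaks to a local resolver.

import socket

# Illustrative only: the local resolver walks the delegation hierarchy
# (root servers -> top-level-domain servers -> authoritative servers)
# on the client's behalf; ICANN coordinates the top of that hierarchy.
domain = "example.org"  # assumed placeholder domain

for family, _, _, _, sockaddr in socket.getaddrinfo(domain, None):
    # Print the address family (e.g. AF_INET, AF_INET6) and the resolved address.
    print(family.name, sockaddr[0])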

Long-term consumer-rights advocate Ralph Nader has suggested instead that ICANN's authority should be determined by a multilateral government charter, with membership open to all Internet users and freedom-of-information provisions. Country-code top-level domains would remain under the discretion of national governments, and generic top-level domains (notably not those under the jurisdiction of the U.S., e.g., .gov, .edu, .mil etc.) would be declared a public resource held in trust but without ownership. With a number of nations unhappy with the management of ICANN (such as the addition of global top-level domains, the dispute resolution policy or the at-large membership rules) or with the principle of ICANN being under U.S. control, some have suggested that a United Nations agency should take over. In contrast, Esther Dyson, as chair of ICANN, claimed that "ICANN does not aspire to address any Internet governance issues; in effect it governs the plumbing, not the people" and that "ICANN's goals and its actions are in fact the result of public debate and consensus - though not of unanimity".

[ The following is a proposal presented by Ralph Nader to Governing the Commons: The Future of Global Internet Administration, a conference organized by Computer Professionals for Social Responsibility, September 24-25, 1999, in Alexandria, Virginia. http://www.cptech.org/ecom/cpsr-icann.html International Herald Tribune, UN takeover of Internet? Some are 'not amused', Monday, December 8, 2003 http://www.iht.com/articles/120570.html Written exchange between Ralph Nader, Jamie Love and Esther Dyson http://www.icann.org/nader-questions.htm http://www.icann.org/chairman-response.htm ]

In comparison, standards for the World Wide Web have an even more problematic history. David Siegel's "The Balkanization of the Web" positively suggested that the future of web standards would be more like a federation than a global village, with the World Wide Web Consortium painted into a corner as browser companies produce proprietary features. Of course, in ironic reference to the oft-quoted title, the Balkans are anything but a federation (at the moment) and likewise neither is the world wide web. The "balkanisation" started with Netscape introducing an array of elements that only their browser supported, most notably frames, which were not accepted as a W3C standard until HTML 4.0. As Netscape 4.0 included more proprietary elements, Microsoft's Internet Explorer 4.0 offered greater CSS and HTML standard compatibility, but having gained market ascendancy, Microsoft started introducing more of its own proprietary elements (e.g., DHTML). Today numerous incompatible web standards exist, which reduce accessibility and platform independence. As the inventor of the World Wide Web and the founder of the W3C, Tim Berners-Lee, explains:

"Anyone who slaps a 'this page is best viewed with Browser X' label on a Web page appears to be yearning for the bad old days, before the Web, when you had very little chance of reading a document written on another computer, another word processor, or another network."

[ The Balkanization of the Web http://www.dsiegel.com/balkanization/ Tim Berners-Lee, Technology Review, July 1996 quoted in http://www.anybrowser.org/campaign/ ]

In part, the balkanisation is due to the institutional status of the World Wide Web Consortium, which does not have the same status as the International Standards Organization. Although SGML (Standard Generalized Markup Language), from which HTML is derived, is an ISO standard, the W3C does not issue standards in the same sense that the ISO does; rather, the consortium makes recommendations which member organizations and individuals may accept or ignore - and, as a review of member organizations' websites indicates, most choose to ignore them. None of this is to suggest that the W3C is a perfect or even perfectible institution - it has certainly engaged in its own activities contrary to public-interest recommendations, such as the proposal in August 2001 to introduce fee- and royalty-based standards through a licensing model that would include RAND ("Reasonable and Non-Discriminatory") terms applying to undefined "higher level" services. It was seriously questioned by many, particularly those from the open source community, whether, if the W3C adopted RAND, it could still be considered the legitimate organization for recommending web standards.

[ "Information Processing -- Text and Office Systems -- Standard Generalized Markup Language (SGML)", ISO 8879:1986. Please consult http://www.iso.ch/cate/d16387.html for information about the standard. A review of the following major organizations in the W3C on March 23, 2004 with the W3C validator revealed: Microsoft www.microsoft.com - fatal error no doctype specified Intel www.intel.com - fatal error no doctype specified IBM www.ibm.com - not valid (1 error) Verisign www.verisign.com - not valid (95 errors) Sun Corporation www.sun.com - not valid (8 errors) Cisco Corporation www.cisco.com - not valid (249 errors) AT&T www.att.com - not valid (8 errors) America Online www.aol.com - no character encoding label, unable to test. British Broadcasting Commission www.bbc.co.uk - not valid (25 errors) Library of Congress www.loc.gov - not valid (36 errors) Intelsat www.intelsat.int - Host not operating. Defense Information Systems Agency www.disa.mil - fatal error no doctype specified IEEE Computer Society www.computer.org - fatal error no doctype specified "RAND means that someone may or may not pay a fee, and that is at the discretion of the license holder" (http://www.w3.org/2001/08/patentnews). ]

Moving down the scope of institutional and organizational authority, the moderation of public and private electronic mailing lists and bulletin boards (such as usenet, and web-based groups such as Yahoo! and Topica) is another locus for consideration. In early usenet history the mod.* hierarchy was established due to the introduction of off-topic postings in net.sources and net.sources.d. Protection from off-topic postings remains the prime motivation for moderated public discussion lists, especially protection from "net kooks" and "spamming" - the former term normally reserved for political opinions and the latter for advertising. Whilst these may have obvious correlations in the "real world", there is additional technical leverage (and resultant cultural significance) on the Internet, where all speakers have greater equality of discourse presentation. Because moderation is invariably conducted by volunteers, problems regarding redundancy, responsibility and the transferral of moderation rights are significant. The problems regarding moderation are a critical issue for any consideration of the Internet (or, specifically, particular Internet services) being realistically considered as a public sphere with at least a workable signal-to-noise ratio.

[ Kent Landfield, Usenet Sources Newsgroup Moderation, Network Working Group, Request for Comments: XXXX (informational draft), 1995. Crank Dot Net (http://www.crank.net/index.html) maintains a compilation of "net kooks" established through either websites or usenet posting. See also the Serdar Argic Wikipedia entry at: http://en.wikipedia.org/wiki/Serdar_Argic Russ Allbery, Pitfalls of Newsgroup Moderation, version 1.5, 2004 http://www.faqs.org/faqs/usenet/moderation/pitfalls/ ]

The preceding sketch of contemporary critical issues indicates the difficult relationship that the Internet, from the global and general to the local and particular, is having with existing institutions, expectations and standard procedures. Of course, these critical issues need to be tempered by the extremely functional improvements that the Internet has contributed to institutional bodies and technical standards, of which the history of the Internet as a military and academic network and the use of the world wide web by physicists to share information stand as extremely clear examples. However, under the general theme of data integration it is clear that internationalization, decentralization of use, and the incorporation of competing institutional and personal interests are causing significant strain on traditional methods. Failure to establish a consistent working model for the technical standards and institutional status of the Internet, oriented towards the integration of data into the global community as a whole, is a matter that must be addressed in the conclusion of this subchapter.

Social Adaptability and Dynamicism

Adaptability is a recognized functional requirement for any society. In modern societies, dynamicism becomes an additional requirement, as modernity is characterised by creating normative and productive forces from itself, rather than recourse to prior models (normative) or simple artisanship (productive). As Marshall Berman correctly describes, modernity is an experience and a narrative where, as Karl Marx and Friedrich Engels once described capitalism, "all that is solid melts into air". With regard to the Internet, which is certainly dynamic, the critical question for contemporary society is its role in normative and productive dynamicism and adaptability. A society that is too adaptable and too dynamic will misallocate resources from other functional requirements (goal attainment, integration, latency), and will prove impossible to manage, let alone govern. A society with a deficit in adaptability and dynamicism runs the risk of crisis tendencies arising from contradictions and inequalities in its social formation.

[ Marshall Berman, "All That Is Solid Melts Into Air: The Experience of Modernity", Simon & Schuster, 1982 Karl Marx , Frederich Engels, Manifesto of the Communist Party, (chapter 1), Progress Publishers, 1969 (first published 1848) http://www.marxists.org/archive/marx/works/1848/communist-manifesto/ch01.htm ]

Significant literature exists concerning the role of the Internet in changing the imbalance generated by capital-intensive mass media. Graham Meikle's distinction between alternative and tactical media is useful here. In the former instance, numerous examples exist of alternative media organizations making use of the Internet even when subject to official censure. When NATO airstrikes were called against Yugoslavia, officials suspended the transmissions of the independent radio station B92 and confiscated its equipment. The station, however, continued via a RealAudio stream distributed on the Internet, with 50,000 site hits per day. Another celebrated example, that of the McSpotlight website and email list, was initiated following libel charges against two critics of the McDonalds fast food chain. Cursory analysis of web-traffic engines clearly indicates the levelling effect of a web presence for media that cannot rely on capital accumulation. Whilst fair comment can be raised about the information quality of some websites that stand as alternatives to capital-intensive media, the same accusations can also be raised against the established media, especially the tabloid press and disposable magazines. Readers and contributors are in the same situation as ever - they must make judgements based on reputations for accuracy and internal consistency in articles.

[ Graham Meikle, Future Active: Media Activism and the Internet, Routledge and Pluto Press, 2002 ibid, p59-60 and pp63-69. See also Interview with Veran Matic, Editor-in-Chief, Radio B92 Belgrade "independent media are the main lever for democratic changes" JURIST, August 16, 1999 KOSOVO & YUGOSLAVIA: LAW IN CRISIS a presentation of JURIST: The Law Professors' Network http://jurist.law.pitt.edu/matic.htm McSpotlight Frequently Asked Questions, Version 5, June 2000 http://www.mcspotlight.org/campaigns/current/mcspotlight/faq.html ]

With regards to "tactical media", particular attention is drawn to the semiotic guerreilla warfare under the endearing title of "culture jamming", the use of creative parody to undermine deceptive advertising by appropriating and remodelling advertising signs with an incisive element of truth. The most well known of these efforts on the Internet are the activities of the Vancouver-based Adbusters Media Foundation who, through their website and their magazine of the same name (estimated circulation per issue as per March 2004, 120 000) produce effective parody advertisments for political causes (but only "if the cause is right"). Their most famours campaign, "Buy Nothing Day", is held annually on November 28. A celebrated single-person activion was that of Jonah Peretti who attempted to have a customized pair of Nike shoes marked "sweatshop" following their website offer to personalize shoes on the basis of "about freedom to choose and freedom to express who you are". However, the most politically sophisticated (if not necessarily aesthetically so) are those activities by RTMark Inc who, taking advantage of free speech and limited liability attributes granted to U.S. corporations, engages in parody campaigns deliberately designed to provoke a mass media response. Two of their more succesful campaigns have been the pro-Zapatista FloodNet "virtual sit-in", GWbush.com and gatt.org, a parody of the World Trade Organization.

[ Adbusters: Culture Jammers Headquarters http://www.adbusters.org/home/ Culture Jamming, Memes, Social Networks, and the Emerging Media Ecology The "Nike Sweatshop Email" as Object-To-Think-With (work in progress) http://depts.washington.edu/ccce/polcommcampaigns/peretti.html Jonah Peretti, My Nike Adventure, The Nation, April 9, 2001 http://www.thenation.com/doc.mhtml?i=20010409&s=peretti RTMark Frequently Asked Questions, 2000 http://www.rtmark.com/faq.html The FloodNet virtual sit-in: http://www.rtmark.com/zapfloodpr.html GWBush.com: http://www.rtmark.com/bush.html gatt.org http://www.rtmark.com/yes.html ]

The most dramatic example of using the Internet in such a manner is that of the Zapatista National Liberation Army (EZLN) of Mexico in 1994. In a dramatic variation on peasant uprisings in Latin America, the EZLN used the Internet extensively to publish their demands and attract international support, especially among the disparate opponents of NAFTA (the North American Free Trade Agreement). The extent of this support, and the dramatic success of their uprising, forced the Mexican government to negotiations. As Harry Cleaver notes, the EZLN demands were less about a bigger share of national wealth or income than about autonomy from the Mexican system, especially regarding the sale of communal Indian land arising from NAFTA. Another critical issue was the organizational form - one which adheres to neither traditional Leninist nor social democratic models, but is rather a network of cooperation between disparate groups: "Connection comes with mutual recognition and the understanding that struggles can be complementary and mutually reinforcing".

[ Harry Cleaver, The Chiapas Uprising and the Future of Class Struggle in the New World Order, Studies in Political Economy, 1994, pp141-157, available at: http://www.eco.utexas.edu/Homepages/Faculty/Cleaver/chiapasuprising.html see also Harry Cleaver, "The Zapatistas and the Electronic Fabric of Struggle" in John Holloway and Eloína Peláez (eds.), Zapatista! Reinventing Revolution in Mexico, London: Pluto Press, 1998. http://www.eco.utexas.edu/Homepages/Faculty/Cleaver/zaps.html and Harry Cleaver, The Zapatista Effect: The Internet and the Rise of an Alternative Political Fabric, Journal of International Affairs, Vol. 51, No. 2, Spring 1998, pp. 621-640 http://www.eco.utexas.edu/Homepages/Faculty/Cleaver/zapeffect.html ]

Two serious criticisms of the "Zapatista model" have been forthcoming. One, by RAND corporation under the phrase "netwar", expresses concern that Mexico - and potentially any society - becomes ungovernable by the capacity of marginalized groups to engage in serious strategic disruption by forming temporary alliances. The problem according to Ronfeldt and Arquilla is that whilst such groups are capable of disrupting effective national government, none are capable of asserting governmental authority themselves, leading to a state of perpetual crisis. In contrast, Judith Hellman's criticism of "real" versus "virtual" revolutionaries accuses online activists of ignoring the "real" struggle in favour of propogating the work of "real revolutionaries". To those dedicated to fundamental social change, a political strategy that emphasizes first and foremost "online activism" is, according to Hellman, an orientation that disinvests time from participation in the struggle "at home" in favour of "thirdworldism".

[ John Arquilla, David Ronfeldt, "Cyberwar is coming!" Comparative Strategy, Vol 12, No 2, Summer 1993, pp141-165 http://www.memefest.org/shared/www/cyberwar_is_coming.html Judith Adler Hellman "Real and Virtual Chiapas: Magic Realism and the Left", Socialist Register, 2000 [FP 1999 in Este Pais in Spanish]. http://www.yorku.ca/socreg/hellman.html ]

In a less revolutionary mould, Christopher Hunter correctly describes public opinion as having a functional democratic role by serving as a guide to government institutions. Functional aggregation is normally channelled through the system of acclamation (voting), but also through opinion polls and lobby groups. In comparison to the activist orientations noted above, Internet technologies have also been utilized in a formal role in the democratic process, initially with the 1994 Minnesota E-Democracy website, where candidates for the governorship of Minnesota and for the United States Senate participated in online debate. A similar process was repeated in Canberra, Australia in 1997. In the United Kingdom, the edemocracy government website was launched on July 16, 2002 to incorporate information and communication technologies into existing democratic procedures, especially the introduction of electronic voting (perceived as a means to reverse the decline in turnouts at elections). In comparison, Singapore has decided against electronic voting due to fears over ballot secrecy. This, however, has not stopped the U.S. Democratic Party from using electronic voting for its 2004 Presidential primaries.

[ Christopher D. Hunter, The Internet and the Public Sphere: Revitalization or Decay?, University of Pennsylvania, 1998 Minnesota E-Democracy, http://www.e-democracy.org/ G. Scott Aikens, A History of Minnesota Electronic Democracy 1994, First Monday, http://www.firstmonday.dk/issues/issue5/aikens/ Karin Geiselhart, Steve Colman, Two Web-based Australian Experiments in Electronic Democracy, presentation to AusWeb99, 1999 http://ausweb.scu.edu.au/aw99/papers/geiselhart/paper.html http://www.edemocracy.gov.uk/ http://www.edemocracy.gov.uk/downloads/your_response_report.pdf Susan Tsang, E-lections not on the cards for next GE, CNET.com, October 16, 2001 http://asia.cnet.com/newstech/industry/0,39001143,38025196,00.htm (nb: in 2001 Singapore's 84-member elected parliament consisted of 82 members of the People's Action Party, 1 member of the Singapore Democratic Alliance and 1 member of the Workers' Party) Declan McCullagh, E-voting smooth on Super Tuesday, CNET News.com, March 2, 2004 http://news.com.com/2100-1028_3-5168670.html?tag=st_lh ]

Martin Hagen offers a useful typology of three versions of electronic democracy: "teledemocracy", "cyberdemocracy" and "electronic democratization". In the first, direct democracy is advocated through virtual town meetings - however, examples of actual implementation suggest that these are largely staged events and, as Scott London points out, suffer the same problems as opinion polls. In the cyberdemocracy type, political debate, activism and community building have priority: "Their prime concern is to (re-)create (virtual and non-virtual) communities as a counter-base to centralized forms of government". Finally, electronic democratization involves applying new information and communication technologies to existing democratic institutions and procedures. Electronic Democracy New Zealand, for example, in part suggests a cyberdemocracy in terms of citizen discussion, with governments providing highly informative websites, a position not dissimilar to that of Claudia Lynch in the University of North Texas publication Benchmarks. In a very innovative collection of essays (From Digital Divide to Digital Democracy), a number of authors consider that the democratic potential of the Internet is best channelled through the education system, with a specific orientation to those social groups which have been traditionally disenfranchised.

[ Martin Hagen, A Typology of Electronic Democracy, 1996. Derived from "A Road to Electronic Democracy? Politische Theorie, Politik und der Information Superhighway in den USA", in Hans J. Kleinsteuber (ed.), Der "Information Superhighway", Opladen: Westdeutscher Verlag, 1996, S. 63-85. http://www.uni-giessen.de/fb03/vinci/labore/netz/hag_en.htm Scott London, Electronic Democracy: A Literature Survey, Kettering Foundation, 1994 Paul Hughes, Electronic Democracy: An Opportunity for the Community to Improve Its Power of Governance, Electronic Democracy New Zealand, 1997 Claudia Hughes, Electronic Democracy, Benchmarks, University of North Texas, Summer 1996 http://www.unt.edu/UNT/departments/CC/Benchmarks/sum96/index.html From Digital Divide to Digital Democracy, De Los Santos and Mark David Milliron (eds.), League for Innovation in the Community College, 2003 ]

The Internet has simultaneously provided an outlet and evidence for the hypothesis that the political and economic system in advanced societies is suffering from a deficit in legitimacy and mass loyalty from the perspective of the lifeworld of its citizens. The extremely modest and curtailed attempts by the system to introduce the potentially democratizing elements of communications and information technology represent a serious problem. For if the Internet allows the generation of action-motivated meaning, but the system lacks the ability to channel this as system-supporting generalized motivations, then a conflict between the system and the lifeworld will inevitably arise. An example of such a crisis concludes this case analysis. This, however, is not the only manner by which the introduction of the Internet and other information and communications technologies can threaten system stability. They can also do so through threats to the international political balance, through threats to the political administration of financial capital, and through changes to economic productivity, reflexive labour and transaction costs.

[Jurgen Habermas, Legitimation Crisis, p44, pp49-50]

With regard to the international balance, qualitative changes in military technology are certainly more common than social revolutions in the means of production, or even political revolutions in the relations of production. In strict military terms, information dominance, divided into strategic and operational dimensions, indicates an advantageous capacity to collect, communicate and protect information relative to an adversary, thus facilitating land, sea and air operations. Recent conflicts (e.g., the Persian Gulf war of 1991) indicate the extreme importance of information dominance, although it is still a matter of debate whether this constitutes a revolution rather than an evolution of military organization, strategy and operations. John Arquilla and David Ronfeldt, for example, suggest that the improvements in the scope and speed of communications necessitate widespread and fundamental organizational changes among state and commercial bodies that correlate with the structural changes of the technology: a flattening of hierarchical structures, and decentralized networks of trust relationships capable of rapid and autonomous response without centralized command-and-control. According to Litton, the concept of massing forces has changed to massing effects (cf. terrorist attacks), the objective destruction of force and will becomes the less tangible strategic paralysis and long-term planning (e.g., computer viruses), offensive action gains priority over defensive (speed of information and effect), maximum use of force is preferred to an economy of force (rendering the concept of reserve forces suspect), the chain-of-command is altered into a network, and "surprise at will" is lost to acts of stealth and speed. Finally, Litton also adds one additional and critical principle for an information-based society - legitimacy.

[ For example, Andrew Krepinevich Jr's influential claim of 10 military revolutions in modern times, from pikemen (14th century) to the ultima ratio of the prospect of destruction of the world through thermonuclear means: Andrew Krepinevich Jr., "Cavalry to Computer: The Pattern of Military Revolutions", National Interest, Fall 1994 Jeremy Shapiro, Information and War: Is It A Revolution?, pp113-153 in Strategic Appraisal: The Changing Role of Information in Warfare, Zalmay Khalilzad, John P. White, Andrew W. Marshall (eds), RAND Corporation, 1996, available at: http://www.rand.org/publications/MR/MR1016/ John Arquilla, David F. Ronfeldt, The Advent of Netwar, RAND, 1996 and John Arquilla, David F. Ronfeldt, In Athena's Camp: Preparing for Conflict in the Information Age, RAND, 1997 Nigel Thompson, Terrorism and Internet Warfare, Australian Institute of Criminology Internet Crime Conference, February 1998, http://www.aic.gov.au/conferences/internet/thompson.pdf Maj. Leonard G. Litton, The Information Based RMA and the Principles of War, Air and Space Power Chronicles, September 2000, available at: http://www.iwar.org.uk/rma/resources/info-rma/litton.htm ]

Whilst globalization and the revolution in information and communications technology have significant effects on military organization and operations, significant problems have also arisen in the political administration of financial capital, that is, in taxation regimes. In two different manners, globalization and information technology heighten the prospect of capital flight and capital strikes. In the former, the organizational development of multinational corporations allows operations to be located among competing jurisdictions. In the latter, the technology allows the easy transfer of intangible assets. Of course, none of this suggests that capital should not also seek productive and educated workers in an environment with strong infrastructure. However, the available evidence to date is that capital is globalizing at a rate greater than other economic indicators - contrary to the view of some economists who claim that multinational corporations and foreign direct investment have little economic effect - and, more importantly, that there has been a rapid growth of international treaties and a strengthening of global capital institutions. The political reality is that such institutions have vigorously promoted a move away from state-regulated capitalism (which included market-complementing and market-replacing actions as well as social property) in favour of a neoliberal environment and a significant downgrading of the social wage.

[ Ernest J. Wilson III, Chapter Two - Strategic Restructuring: A Framework for Analysis, Center for International Development and Conflict Management, University of Maryland, 2003 http://www.cidcm.umd.edu/wilson/leadership/chapter2.pdf James Crotty, Gerald Epstein, Patricia Kelly, Multinational Corporations, Capital Mobility and the Global Neo-Liberal Regime: Effects on Northern Workers and Growth Prospects in the Developing World, Seoul Journal of Economics, Winter 1997, available at: http://www.people.umass.edu/crotty/SNJ-MNC-ULT.pdf ]

Whilst the contradictions of a global advanced capitalist order will arise through a telos of private and centralized wealth and public and mass relative impoverishment, leading to crises of output, it is no longer sufficient to assume that their resolution can be assured through quasi-political means, due to the prospects of capital flight and strike noted in the preceding paragraph. In an attempt to finance solutions to dysfunctional market behaviour, the Internet has been considered as a taxation source, and indeed one where collection could be easily automated. Nevertheless, there are substantial problems of jurisdiction and consistency which have led governments generally to avoid taxation of Internet activity independent of pre-existing sales and retail taxation systems and, in some cases, to establish the Internet as a commercial taxation-free zone. Whilst research exists showing that such activities do continue the shift of the taxation burden from capital to labour, this is usually expressed as a result of ideology rather than as a telic inclination of the technology. The fact of the matter is that physically intangible finance capital is more capable of "flight" or "strike" than the physically tangible consumer-worker. Dealing with this facticity is key to any resolution of the finance crisis facing political administrations, even if the political will exists for a more egalitarian distribution of wealth and income.

[ c.f. the Quill v. North Dakota case on state retail taxation regimes (1992). http://vls.law.vill.edu/prof/maule/cle_demo/Quill.htm Preying on the Web: Tax Collection in the Virtual World, 28 Florida State University Law Review 649-786 (2001). www.law.fsu.edu/journals/lawreview/downloads/283/vetter2.pdf "Section-by-Section Analysis of The Internet Tax Freedom Act (P.L. 105-277)", accessed 3 December 2001. "Study: States lose $13.3 billion in e-taxes", ZDNet, 2 October 2001, accessed 31 October 2001. Michael Mazerov, Iris J. Lav, A Federal Moratorium on Internet Commerce Taxes Would Erode State and Local Revenue and Shift Burden to Lower-Income Households, Center on Budget and Policy Priorities, May 1998 http://www.cbpp.org/512webtax.htm ]

A closely related issue is that of economic productivity. This is a matter of substantial and continuing criticism directed at those who claim revolutionary implications for the Internet and the information society in general. In general the claim is that, because the substantial changes in information and communications technology have not resulted in the sort of productivity increases that the agrarian and industrial revolutions produced, the information "revolution" is certainly anything but a social revolution. Indeed, significant evidence exists that the introduction of the Internet and computer technology has actually correlated with a deceleration of productivity growth in non-durables, retail and services, leading to the claim that the Internet does not constitute a "great invention" when compared to those of 1860-1900. Often cited as Solow's paradox, the Internet is analyzed as an invention which converges with or duplicates existing technologies rather than serving new needs, and which, whilst excellent at reducing transaction costs and improving the efficiency of reflexive labour, is ultimately restricted by the economics of time.

[ Robert J. Gordon, "Does the 'New Economy' Measure up to the Great Inventions of the Past?", Journal of Economic Perspectives, No 14 (4), Fall 2000, pp49-74 "We see the computer age everywhere except in productivity statistics", Robert Solow (1987). Solow claimed that the paradox was obsolete in due to a high productivity growth rates in 1998 and 1999 - just prior to the famous "dot crash" period - Uchitelle (2000) ]

Whilst substantial microeconomic returns have been noted - a return on investment of over 50 percent, compared to 15 to 20 percent on other investments - on a macroeconomic level positive change is modest at best. Gordon expresses the problem succinctly: "Internet surfing may be fun, but it represents a far smaller increment in the standard of living than achieved by the extension of day into night achieved by the electric light, the revolution in factory efficiency achieved by the electric motor, the flexibility and freedom achieved by the automobile, the saving of time and shrinking of the globe achieved by the airplane, the new materials achieved by chemical industry, the first sense of two way communication achieved by the telephone, the arrival of live news and entertainment into the family parlor achieved by radio and then television, the enormous improvements in life expectancy, health and comfort achieved by urban sanitation and indoor plumbing".

[ Erik Brynjolfsson, Lorin M. Hitt, "Paradox Lost? Firm-Level Evidence on the Returns to Information Systems", Management Science 42 (4), 1996, pp541-558 http://portal.acm.org/citation.cfm?id=250441.250455&dl=portal&dl=ACM&CFID=19764964&CFTOKEN=44038057 and Lorin M. Hitt, Erik Brynjolfsson, "Productivity, Business Profitability, and Consumer Surplus: Three Different Measures of Information Technology Value", MIS Quarterly 20(2), 1996, pp121-142 http://ccs.mit.edu/papers/CCSWP190.html Robert J. Gordon, "Does the 'New Economy' Measure up to the Great Inventions of the Past?", Journal of Economic Perspectives, Vol 14 (4), Fall 2000, pp49-74 ]

3.5.3 Recommendations and Justifications

As mentioned in the introduction, every society needs to develop procedures by which the information generated and gathered by that society is integrated effectively and efficiently. A society that does not engage in these activities risks extreme misallocation of economic resources, dysfunctional education and research, political crises of legitimation, psychological crises of motivation and, eventually, fragmentation between, and disintegration of, social institutions. The chosen issues - pedagogy and public opinion, the production of goods and services, the establishment of technological standards and institutions, and adaptability and dynamicism - are thus relevant for all societies and not just for advanced contemporary societies.

Modern society has addressed the issue of data integration through a careful combination of state regulation and institutions with varying degrees of autonomy for commercial organizations. As the industrial revolution, and especially the industrialization of agriculture, raised production possibilities to the point where the essential maintenance of life processes could be potentially assured, numerous commentators have remarked on the relative importance of deriving greater efficiency from reflexive labour, reducing transaction costs, and to a lesser extent robotics and computer-aided design and manufacturing. Whilst data integration in modern liberal-democratic industrial societies occurs through (a) secular and public education institutions, (b) mass information media, (c) scientific and technical research and development and (d) acclamation and public opinion sensitivity, the critical issues raised in this chapter suggest the need for substantial institutional and procedural change to match the equivalent changes in technology.

[ Thomas S. Ashton, The Industrial Revolution, 1760-1830, Oxford University Press, 1998 [FP 1948] Daniel Bell, Teletext: The New Networks of Information and Knowledge in Computer Society, Basic Books, 1980 Biren Prasad (Editor), Cad/Cam Robotics and Factories of the Future: Integration of Design, Analysis and Manufacturing: 3rd International Conference on Cad/Cam Robotics and Factories of the Future, Springer Verlag, 1990 ]

Perhaps the most immediate critical issue relating to data integration and the Internet is the relation of contemporary communication systems to national and international security. It is, of course, hardly surprising that the widespread use of information and communications technology and its deep integration into infrastructure has meant that such technologies have become both a target and a means to threaten state security, whether by invasion, revolution or terrorism. It is also hardly surprising that the same technology is a means to "prevent, detect and mitigate" such threats. When combined with attacks on other physical infrastructure, targeted attacks on the telecommunications network (including the Internet) and on embedded and real-time computers (especially SCADA - supervisory control and data acquisition - systems) would be devastating, especially in their ability to heighten the damage of other attacks, reduce response times, and increase fears by nullifying information services or spreading false information.

[ John L. Hennessy, David A. Patterson, and Herbert S. Lin (eds.), Information Technology for Counterterrorism: Immediate Actions and Future Possibilities, Committee on the Role of Information Technology in Responding to Terrorism, National Research Council, 2003 http://www.nap.edu/books/0309087368/html/ High-Impact Terrorism: Proceedings of a Russian-American Workshop, Committee on Confronting Terrorism in Russia, Office for Central Europe and Eurasia Development, Security, and Cooperation, National Research Council in Cooperation with the Russian Academy of Sciences, 2001 http://www.nap.edu/catalog/10301.html especially: The Role of Internal Affairs Agencies in Efforts to Fight Terrorism Under High-Technology Conditions (pp 61-68), Computer Terrorism and Internet Security Issues (pp 181-197), and Preventing and Responding to Cybercrime and Terrorism: Some International Dimensions (pp 198-206) ]

However, in establishing effective contingency plans to counter these new threats, it is rare that institutional means are considered. The reason for this is that "the Internet model" - a decentralized network of trust relationships with an equal distribution of power and autonomy - is contrary to existing distributions of power and organization and to the political orientation of the leaders of such organizations. Nevertheless, this is the very model that is being used by new criminal organizations, terrorist groups, national liberation armies and - very few these days - genuine revolutionary organizations. In a global environment with concentrated infrastructure and organization, minor organizations can have a severely disruptive effect, yet are at the same time incapable of becoming an alternative authority. Under these conditions, the most effective means of national and international security is the conversion of standing armies to a federation of regulated militia.

Whilst the idea of the militia has been - falsely - appropriated from that of a state or revolutionary organization representing the will of the people by minority extremists in the United States, the principle and the organizational model still hold the greatest level of structural correlation and effectiveness in the global and Internet age. Of course, such a change does carry the uncomfortable implication that any government must truly have the consent of the governed, and it is well known that whilst militia are an extremely effective defensive force, their capacity to invade is almost non-existent. As the Virginia delegation's recommendation for a U.S. bill of rights recognized, "standing armies, in times of peace, are dangerous to liberty, and therefore ought to be avoided". The new information and communication technologies, and the Internet in particular, make the control of centralized regimes increasingly untenable as their competitors have greater capacity for political networking, economic development and inventiveness.

[ Extremist militia organizations (invariably of the radical right) ignore the "well-regulated" component of the second amendment to the U.S. constitution. See: David C. Williams, The Militia Movement and Second Amendment Revolution: Conjuring With The People, Cornell Law Review, Cornell University, Vol 81, May 1996, pp 879-952. http://www.saf.org/journal/9_militia.html and Daniel Levitas, The Terrorist Next Door: The Militia Movement and the Radical Right, Thomas Dunne Books, 2002 Virginia declaration in: Stephen P. Halbrook, "The Right of the People or the Power of the State: Bearing Arms, Arming Militias, and the Second Amendment", originally published as 26 Val. U. L. Rev. 131-207, 1991. See also Stephen P. Halbrook, That Every Man Be Armed: The Evolution of a Constitutional Right, University of New Mexico Press, 1984. ]

The development of models based on decentralized network topologies, with ad-hoc self-organization and emergent properties, peer-to-peer networks and webs of trust, is perhaps one of the most startling contributions of the Internet outside of its own sphere as a purely technological implementation. The criticism of Robert Gordon that the Internet does not represent a "great invention" because of a lack of a positive direct economic effect can be challenged on this basis. After all, the introduction of a comparable technology - the movable type printing press - to Europe did not directly enhance living standards. However, it did prove to be a necessary condition for the breakdown of the medieval order and the rise of scientific discourse and industrial technology. Business management research and economic theory are beginning to recognize this, as the widespread introduction of the Internet in the 1990s correlated with the highest level of business failures since the 1930s.

[ See working papers from Inventing the Organizations for the 21st Century, M.I.T. Sloan School of Management, 1994-1999, http://ccs.mit.edu/21c/ Rob Holland, Failing to Plan is Planning to Fail, Center for Profitable Agriculture, University of Tennessee, 1999 http://cpa.utk.edu/level2/releases/adcreleases/aug1999.htm ]

According to Drucker, the change in organizational typology and the business of knowledge production represents a change of such magnitude that "[i]n a matter of decades society altogether rearranges itself - its world view, its basic values, its social and political structures, its arts, its key institutions", a perspective endorsed by Martin Kenney, who recommends increased worker autonomy. This has been described as a return to a guild-like structure providing a sense of community across company boundaries, through professional associations, trade unions and even staffing companies, in contrast to traditional industrial relations, where the traditional employment contract of loyal service, job security and insurance benefits has been replaced by an environment of flexible and responsive employment conditions featuring temporary employment, subcontracting and outsourcing. The new conditions clearly complicate industrial-era concepts of careers and even of bureaucratic advancement, demanding new levels of adaptability and tolerance of uncertainty.

[ Peter Drucker, Peter Drucker on the Profession of Management, Harvard Business School, 1998, p113. See also pp113-150 and Peter Drucker, Post-capitalist Society, Harper Business, 1993 Martin Kenney, The Role of Information, Knowledge and Value in the Late 20th Century, Futures 28(8), Elsevier Science, pp695-707 Robert Laubacher, Thomas W. Malone, Retreat of the Firm and Rise of the Guilds: The Employment Relationship in an Age of Virtual Business, MIT Initiative on Inventing the Organizations for the 21st Century, Working Paper #033, M.I.T., 2002 [FP 2000] http://ccs.mit.edu/papers/pdf/21CWP033.pdf Peter Cappelli, The New Deal at Work: Managing the Market-Driven Workforce, Harvard Business School, 1999 Michael B. Arthur, Denise M. Rousseau (eds.), The Boundaryless Career: A New Employment Principle for a New Organizational Era, Oxford University Press, 1996 ]

It would be wildly improper, of course, to suggest that these new conditions represent a problem-free environment for capitalists and workers. New versions of old problems remain, as enterprises engage in speculative failure and unfulfilled commitments - "vapourware" - and, of course, in the ubiquitous behaviour of business: the attempt to establish monopoly conditions. Likewise, human resource management literature is replete with comments about "empowerment" as part of TQM (Total Quality Management) or "high performance systems" in the new enterprise conditions. Such fashionable terms are essentially a cover for a lack of industrial democracy and genuine workplace participation in management, and what empirical evidence does exist suggests that claims of "empowerment" in the new work conditions are comprehensively rejected as "non-empowering" by employees; even advocates such as Adrian Wilkinson (et al) recognize problems between a common-language concept of "empowerment" and the practice of employers. Such a gulf is inevitable in a class-divided society whereby the class investing in social production also determines the management of that production. If the organizational revolution of commerce is to have real effectiveness and represent a genuine improvement in the conditions of workers, then the word "empowerment" must equate with the meaning of empowerment - democratic management. Only under such conditions can an enterprise genuinely claim to have a structural correlation with the network topology of the Internet.

[ Merrill R. Chapman, In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters, Apress, 2003 and Philip J. Kaplan, F'd Companies: Spectacular Dot-Com Flameouts, Simon & Schuster, 2002 Jay A. Conger, Rabindra N. Kanungo, The Empowerment Process: Integrating Theory and Practice, Academy of Management Review, 13(3), pp471-482 and Cynthia Hardy, Sharon Leiba-O'Sullivan, The Power Behind Empowerment: Implications for Research and Practice, Human Relations 51(4), pp451-483 and Adrian Wilkinson, Tom Redman, Ed Snape, Mick Marchington, Managing with Total Quality Management: Theory and Practice (Management, Work and Organizations), Palgrave Macmillan, 1998 John L. Mariotti, The Power of Partnerships: The Next Step Beyond TQM, Reengineering, and Lean Production, Blackwell Business, 1996 David Buchanan, James McCalman, High Performance Work Systems: The Digital Experience, Routledge, 1989 Carole Pateman, Participation and Democratic Theory, Cambridge University Press, 1970 and Michael Poole, Towards a New Industrial Democracy: Workers' Participation in Industry, Routledge and Kegan Paul, 1986 Bill Harley, The Myth of Empowerment: Work Organisation, Hierarchy and Employee Autonomy in Contemporary Australian Workplaces, Work, Employment and Society, Vol 13 (1), pp41-66, 1999 Adrian Wilkinson, Empowerment: Theory and Practice, Personnel Review 27(1), MCB University Press, 1998, pp40-56 and Adrian Wilkinson, Graham Godfrey, Mick Marchington, Bouquets, Brickbats and Blinkers: Total Quality Management and Employee Involvement in Practice, Organization Studies, 18(5), EGOS, 1997, pp799-819 ]

Changes in organizational structure, however, must also correlate with changes in organizational practice. Although transaction costs have been lowered with the widespread introduction of the Internet, equating to a more efficient use of existing resources, total global output and industrial production growth rates have by no means increased proportionally, and in places where Internet density is higher, current indications are that growth rates in industrial production have actually decreased, as the table below shows. To be sure, the nations with the highest industrial and general growth rates are those which are rising out of the most impoverished conditions and are thus undergoing a transformation similar to that which the advanced nations underwent in the period 1830-1950. But it must nevertheless be disconcerting to claims of a "new information economy" to be faced with the facticity that such an economy correlates with a reduction in industrial production and low rates of increase in gross domestic product.

Nation, Internet Connectivity, Industrial Production Growth (%), GDP Growth (%)
U.S.A., 0.591, -0.4, 2.4
Japan, 0.441, -1.4, 0.2
Germany, 0.385, -2.1, 0.2
France, 0.284, -0.3, 1.2
United Kingdom, 0.574, -0.34, 1.8
Italy, 0.334, -2.8, 0.4
Mexico, 0.034, 4.9, 0.7
Russia, 0.124, 3.7, 4.3
Brazil, 0.079, 1.5, 2.3
Turkey, 0.037, 8.5, 7.8
Thailand, 0.019, 3, 5.3
China, 0.036, 12.6, 8
Philippines, 0.053, 4, 4.4
Egypt, 0.011, 2.2, 3.2
Indonesia, 0.019, 4.9, 3.7
India, 0.007, 6, 4.3
Pakistan, 0.008, 2.4, 4.4
Vietnam, 0.005, 10.2, 7
Bangladesh, 0.001, 1.8, 4.8
Nigeria, 0.001, 0.4, 3.2
[Figures from CIA World Factbook 2003]
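The inverse relationship described above can be checked directly against these figures. The following minimal sketch (an illustration only; it assumes Python 3.10 or later for statistics.correlation and uses no data beyond the rows of the table) computes the Pearson correlation between the connectivity column and each of the two growth columns.

from statistics import correlation  # requires Python 3.10+

# Rows transcribed from the table above: (connectivity, industrial production
# growth %, GDP growth %). No additional data is assumed.
rows = {
    "U.S.A.": (0.591, -0.4, 2.4), "Japan": (0.441, -1.4, 0.2),
    "Germany": (0.385, -2.1, 0.2), "France": (0.284, -0.3, 1.2),
    "United Kingdom": (0.574, -0.34, 1.8), "Italy": (0.334, -2.8, 0.4),
    "Mexico": (0.034, 4.9, 0.7), "Russia": (0.124, 3.7, 4.3),
    "Brazil": (0.079, 1.5, 2.3), "Turkey": (0.037, 8.5, 7.8),
    "Thailand": (0.019, 3.0, 5.3), "China": (0.036, 12.6, 8.0),
    "Philippines": (0.053, 4.0, 4.4), "Egypt": (0.011, 2.2, 3.2),
    "Indonesia": (0.019, 4.9, 3.7), "India": (0.007, 6.0, 4.3),
    "Pakistan": (0.008, 2.4, 4.4), "Vietnam": (0.005, 10.2, 7.0),
    "Bangladesh": (0.001, 1.8, 4.8), "Nigeria": (0.001, 0.4, 3.2),
}

connectivity = [v[0] for v in rows.values()]
industrial = [v[1] for v in rows.values()]
gdp = [v[2] for v in rows.values()]

# A negative coefficient indicates that higher connectivity goes with lower growth.
print("connectivity vs industrial production growth:",
      round(correlation(connectivity, industrial), 2))
print("connectivity vs GDP growth:",
      round(correlation(connectivity, gdp), 2))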

Multiple reasons can be given for this correlation. In the first instance, recognition must be given to the fact that the new information economy has had insufficient time for its positive effects to be felt. After all, as mentioned, it was a substantial period of time before the movable type printing press translated into the distribution of scientific discoveries, institutional reform and industrialization. Secondly, during this period there is a necessary over-emphasis on the accumulation of information and its transference to the new medium, rather than on effective production and processing. Thirdly, this necessary over-emphasis is complemented by an unnecessary over-emphasis on reflexive labour (e.g., administration) and an under-utilisation of the medium for direct productive use (e.g., programming) and especially for the industrialization of the service economy (i.e., the direct reduction of transaction costs). Fourthly, continuing profitable yet economically counter-productive activities are allowed by major social systems (e.g., private ownership of natural resources), which further distort the allocation of computerised resources. Finally, and most remarkably, the abdication of the state system from traditional responsibilities in advanced economies, and the inability of the privatised commercial sector to take up these responsibilities in an effective manner, has further hampered the productive potential of information and communications technologies.

As Maryam Ossooli recognizes, "[t]he very foundation of law and the very core concepts of taxation may need to be changed in order to address the economic challenges posed by an exponentially growing Internet and the phenomenon of electronic commerce." The continuing viability of locally based sales taxes and the tests of taxpayer-state contact and activity-state contact are considered suspect in the new economy. It is no longer possible to determine effectively whether a business directly benefits from state infrastructure in electronic commerce, or who the consumer is and where they are located, nor is it possible to conduct effective cross-state, let alone international, taxation. These existing problems will undoubtedly increase as Internet commerce expands, stripping governmental authorities of their ability to collect transactional taxes. The question of the legitimacy of taxable activities is rarely raised beyond that espoused by John Locke's "mixing of labour" and appropriation (although many discuss what taxation should be used for, whether it satisfies horizontal or vertical equity, or whether a tax is efficient). To engage in that discourse, one needs to consider the site-rent taxation system for natural resources of Henry George - indeed, the collective rental of collective goods is increasingly becoming the only viable, just and environmentally sound form of "taxation".

[ Maryam Ossooli, The State & Local Tax Dilemma and the New Economy, Georgia State University College of Law Law & the Internet, 2001 http://gsulaw.gsu.edu/lawand/papers/fa01/ossooli/ John Locke, Two Treatises on Government, [FP 1690] especially II:26-34, http://www.gutenberg.net/etext/7370 Milton Friedman, Capitalism and Freedom, University of Chicago Press, 1963 and Robert Nozick, Anarchy, State and Utopia, Basic Books, 1974 Henry George, Progress and Poverty: An inquiry into the cause of industrial depressions and of the increase of want with increase of wealth... The Remedy, http://www.henrygeorge.org/chp1.htm ]

A related issue raised in this section was concern over unsolicited commercial or bulk email and the "taxation" imposed by senders on users, both individually and collectively. As elucidated, technical solutions, as appealing as they may be, are problematic from a bandwidth perspective, as many "spammers" utilize false or incorrect email addresses in the return header and redirect the reader within the message. Another technical innovation, "junk-mail" filters on mail programs, whilst fairly efficient, increasingly lets innocuous-sounding spam through and will occasionally "trash" a genuine email. The fact that national legislative attempts suffer the problem of simply driving spam producers outside the reach of legal authorities beckons the prospect of international legislative cooperation against this expensive and irritating practice, which will only work if it is based entirely on opt-in commercial and bulk email preferences. As recent experiences have shown, even "single-use" bulk email legislation is inefficient. A new technical alternative, the Trusted Email Open Standard (TEOS), supported by the Coalition Against Unsolicited Commercial Email, could provide valuable assistance in verifying the identity of senders, thus reducing the capacity for spamming and email fraud.

[ Trusted Email Open Standard (TEOS) http://www.eprivacygroup.net/teos/ The Australian government adopted comprehensive anti-spam legislation in 2003. http://scaleplus.law.gov.au/html/pasteact/3/3628/top.htm ]
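
By way of illustration, the following is a minimal sketch in Python of the kind of content-based "junk-mail" filtering described above. The keywords, weights and threshold are hypothetical, and it is not the algorithm of any particular mail program; real filters use richer statistical (e.g., Bayesian) models, but the same failure modes apply.

# Hypothetical keyword weights; real filters learn these statistically.
SPAM_KEYWORDS = {
    "free offer": 3.0,
    "act now": 2.5,
    "unsubscribe": 1.0,
}
THRESHOLD = 4.0  # scores at or above this are treated as spam


def spam_score(message: str) -> float:
    """Sum the weights of the spam phrases appearing in the message."""
    text = message.lower()
    return sum(weight for phrase, weight in SPAM_KEYWORDS.items() if phrase in text)


def is_spam(message: str) -> bool:
    return spam_score(message) >= THRESHOLD


# Both failure modes noted above: innocuously worded spam slips through,
# while a genuine email using the wrong phrases may be "trashed".
print(is_spam("A personal note about our meeting"))              # False
print(is_spam("FREE OFFER - act now before midnight!"))          # True
print(is_spam("Please act now on the free offer we discussed"))  # True, though possibly genuine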

Whilst the adoption of site-rental for natural resources may be sufficient to support the conduct of the majority of natural-monopoly collective activity (maintenance of militia defence, legal enforcement of internal contracts and rules, health and welfare standards), education and information production and reproduction evidently require a different method. In the later stages of liberal capitalism, and in regulated capitalism as well as mixed or socialist economies, the state replaced the highly inefficient and ineffective free-market mechanisms in order to ensure a general level of educational standards. This model is increasingly proving no longer viable for the economic and commercial conditions of the new economy. However, the current response - a downgrading of the public education system, the introduction of increasing "user-pays" education, the corporate colonialisation of teaching material and funding, and the conversion of higher education institutions into "think-tanks" for the needs of business - has increasingly led to inequalities of opportunity and a weakening of general-interest or public-good research in tertiary education. Rather than locating the causes of these problems in individual action or in vested interests (poor policy), it is important also to note the institutional weaknesses of education that make it subservient to corporate interests in the new economy.

[ For a strong advocacy of "capitalist schooling" see: Herbert J. Walberg, Joseph L. Bast, Education and Capitalism: How Overcoming Our Fear of Markets Can Improve America's Schools, The Hoover Institution, Stanford University, 2003 http://www-hoover.stanford.edu/publications/books/edcap.html#toc John Dawkins, Higher Education: A Policy Statement, Australian Government Publishing Service, 1988 ]

The problem is surprisingly well recognized. The skills representing the minimum effective level required to participate in the commercial and political system are widely considered a democratic requirement and functional necessity for contemporary society, and have been throughout most of the modern period. For that reason, the "trinity" of free, public and secular education has been strongly advocated and has commanded extraordinary success. The fact that these principles remain under the control of non-adaptive institutions, tightly regulated by outside sources, is however no justification for their conversion into competing private enterprises. Such an action, due to the peculiar economic and political significance of skill acquisition and use, would be counterproductive. Not only are basic work skills acquired by young adults at a time when they are least inclined to engage in a deferral of gratification, but firms are also less orientated to engage in skills investment due to the increased mobility of labour; from the viewpoint of an individual employer such skills represent a collective good. Whereas it is inevitable, due to the specialization of work processes, that some skill development must occur within the enterprise, it is antithetical to free-market operations for such training to be introduced into the general education system.

[ cf. the Preamble to the 1946 French Constitution: "The Nation guarantees equal access for children and adults to instruction, vocational training and culture. The provision of free, public and secular education at all levels is a duty of the State" http://www.elysee.fr/ang/instit/text2.htm Wolfgang Streeck, Skills and the Limits of Neo-Liberalism: The Enterprise of the Future as a Place of Learning, Work, Employment and Society, Vol 3, No 1, pp 89-104, March 1989 ]

The institutional change required for education is therefore no less than the systemic autonomy achieved by commercial organizations in liberal capitalism. Education, already holding a relative degree of autonomy, must become an independent subsystem in its own right - not disestablished from the state in the manner that religious institutions are separated from the state, but independent. Of course, the practical reality is that schools need to be built, teachers need to be paid, and students need to be taught vocational and general skills. As the primary beneficiaries of educational standards, commercial corporations should collectively fund the general education subsystem directly through taxation on profits, at a rate determined democratically by the political-administrative subsystem, which is presumably better placed to invest collectively in a collective good than individuals are to invest in it privately. The principles of education - free, public and secular - remain, with a fourth added - "open" - referring to the proposition that any information generated by any educational institution is "open source", freely available in the public domain. Directed commercial sponsorship of particular proprietary skills, and confidentiality clauses between educational institutions and commercial institutions, would be prohibited - as specialist education is the duty of the commercial body or individual enterprise and as such is contrary to a system dedicated to open knowledge - although direct financial sponsorship of particular educational institutions would encourage high competitive standards. A general schematic model can be expressed as follows:

Political Subsystem --(Steering Performances)--> Economic Subsystem --(Financial Support)--> Education Subsystem
Education Subsystem --(Open Source Knowledge)--> Economic Subsystem --(Natural Resource Rents)--> Political Subsystem

Further:

Political Subsystem --(Social Welfare Benefits)--> Cultural Lifeworld
Education Subsystem --(Intellectual Motivation)--> Cultural Lifeworld
Cultural Lifeworld --(Mass Loyalty)--> Political Subsystem
Cultural Lifeworld --(Meaningful Expressions)--> Education Subsystem
Cultural Lifeworld --(Labour and Consumption)--> Economic Subsystem
Economic Subsystem --(Goods and Services)--> Cultural Lifeworld
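
The schematic can also be recorded as a simple directed graph so that the exchange relations are explicit and inspectable. The following Python sketch is purely illustrative; the subsystem and exchange names are taken from the schematic above, while the identifiers are arbitrary choices.

# Directed exchanges: (provider, recipient) -> medium of exchange.
EXCHANGES = {
    ("Political Subsystem", "Economic Subsystem"): "steering performances",
    ("Economic Subsystem", "Education Subsystem"): "financial support",
    ("Economic Subsystem", "Political Subsystem"): "natural resource rents",
    ("Education Subsystem", "Economic Subsystem"): "open source knowledge",
    ("Political Subsystem", "Cultural Lifeworld"): "social welfare benefits",
    ("Education Subsystem", "Cultural Lifeworld"): "intellectual motivation",
    ("Cultural Lifeworld", "Political Subsystem"): "mass loyalty",
    ("Cultural Lifeworld", "Education Subsystem"): "meaningful expressions",
    ("Cultural Lifeworld", "Economic Subsystem"): "labour and consumption",
    ("Economic Subsystem", "Cultural Lifeworld"): "goods and services",
}


def provides(subsystem: str):
    """Return what the given subsystem provides, and to whom."""
    return [(recipient, medium)
            for (provider, recipient), medium in EXCHANGES.items()
            if provider == subsystem]


for recipient, medium in provides("Education Subsystem"):
    print(f"Education Subsystem -> {recipient}: {medium}")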

Of course, such significant changes to the institutional status of general educational bodies also suggest significant changes in the procedures of teaching, orientation and evaluation, so that they correlate more strongly with the new economic conditions, work practices and the acquisition of cognitive skills. Although the distortions of cognitive processes resulting from class divisions have been known, and shown to be resolvable, since the pioneering work of Jerome Bruner and Paulo Freire, the implementation of their insights has been less than comprehensive. In a similar spirit, the necessity of developing proximal and social intelligence over isolated and individualistic intelligence has also been known since Lev Vygotsky. As Ivan Illich emphasized, the school needs to be "disestablished" from the controls of the state and the colonialisation of corporations. Contrary to Illich, however, what is recommended is not a deinstitutionalization of education but a flattening of the hierarchy between educational institutions and the political and commercial subsystems. Rather than schools and the education subsystem existing as a function of, and under the direction of, commercial desires or the state, the generation of knowledge is a process which includes practical orientations, skill exchanges, peer-matching and reference services. The knowledge generated from these processes provides information on which the political and commercial subsystems can draw - but the public sphere, the production and reproduction of knowledge, must remain independent of their steering demands.

[ Jerome Bruner, The Process of Education, Harvard University Press, 1960 and Toward a Theory of Instruction, 1966 Paulo Freire, Pedagogy of the Oppressed [FP 1970] and Pedagogy in Process: The Letters to Guinea-Bissau [FP 1977] Lev Vygotsky, Thought and Language, 1962 [FP 1934] Ivan Illich, Deschooling Society, Harper and Row, 1971 available at: http://reactor-core.org/deschooling.html ]

An extremely important contribution that an open-source education subsystem can make to the economic subsystem is the provision of de jure standards arising from periods of commercial competition. Standards are, in effect, an anti-competitive device, but one which, when the timing is right, actually enhances new competition and productivity. David Clark's visually appealing "apocalypse of the two elephants" illustrates the point: if standards are written too soon, when the subject matter is still poorly understood by professionals, the standards will be dysfunctional; if they are written too late, significant investment income will be lost; and if standards are not written at all, connectivity between different institutions will be lost and the economy as a whole will be dysfunctional or, at the very least, prone to monopolistic behaviour. At the time of writing, the standards developed by the World Wide Web Consortium for XHTML and Cascading Style Sheets are sufficiently mature, after several incarnations of HTML, that further delay in adopting them as an ISO standard would be extremely dangerous to the functionality of the world-wide web. The same case does not hold for the Consortium's Platform for Privacy Preferences (P3P).

[ Andrew S. Tanenbaum, Computer Networks (2nd edition), Prentice-Hall, 1989 pp30-31 see also: Timothy Schoechle, The emerging role of standards bodies in the formation of public policy, Computers, Freedom and Privacy Conference, Ontario, 2000 www.cfp2000.org/papers/schoechle.pdf Charles P. Kindleberger, Standards as public, collective and private goods, Kyklos: International Review for the Social Sciences 36(3), Blackwell, 1983, pages 377-396. Brian Kahin and Janet Abbate, eds, Standards Policy for Information Infrastructure, Cambridge: MIT Press, 1995, in particular William Lehr's chapter, Compatibility standards and interoperability: Lessons from the Internet pp121-147 ]
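
As a brief illustration of why the maturity of these standards matters in practice: unlike lenient HTML, a conforming XHTML document is well-formed XML and can therefore be checked by any generic XML parser, independent of browser vendor. The sketch below, in Python with a hypothetical sample document, checks well-formedness only, not full validation against the XHTML document type definition.

import xml.etree.ElementTree as ET

# A hypothetical, minimal XHTML fragment (XML declaration omitted so that a
# plain string can be parsed directly).
SAMPLE_XHTML = """<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Well-formedness check</title></head>
  <body><p>Every element is closed and properly nested.</p></body>
</html>"""


def is_well_formed(document: str) -> bool:
    """Return True if the document parses as well-formed XML."""
    try:
        ET.fromstring(document)
        return True
    except ET.ParseError:
        return False


print(is_well_formed(SAMPLE_XHTML))             # True
print(is_well_formed("<p>unclosed paragraph"))  # False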

Open-source knowledge, especially when mediated through the education system, will provide a certain death-knell for private commercial monopolistic behaviour. Whilst it is undoubtedly true, as has been analysed above, that (for example) Microsoft Corporation has engaged in extremely anti-competitive behaviour and has largely escaped the fines and other legal sanctions that it certainly deserved, it is also true that the extremely modest degree of open-source development in computer applications and operating systems significantly aided Microsoft's rise to market dominance in a period when serious competitors (such as the Macintosh System and UNIX System V or BSD) were also proprietary systems, although there was a relatively open standard of compatibility through POSIX (Portable Operating System Interface). It has been through the relatively recent developments of FreeBSD, OpenBSD, Mac OS X and the multiplicity of distributions of Linux that all proprietary standards in software development have been seriously challenged. This process will inevitably continue, and it must be suggested at this point that through commercial competition new de facto standards will arise in the future, most immediately with interoperability as the new challenge.

[ See: IEEE POSIX Certification Authority http://standards.ieee.org/regauth/posix/ National Institute of Standards and Technology, Testing Laboratories and Validated Products for NIST POSIX (FIPS 151-2) December 31, 1997 http://standards.ieee.org/regauth/posix/finalregieee.html ]
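
The practical force of a compatibility standard such as POSIX lies in the fact that the same programming interface is exposed across otherwise competing systems (the BSDs, the Linux distributions, Mac OS X). The following minimal sketch, purely illustrative, uses Python's os module, which wraps many of these POSIX calls; the guard around os.uname() reflects the fact that it is only defined on POSIX-conforming platforms.

import os


def describe_platform():
    """Report basic process facts through portable, POSIX-style calls."""
    print("current working directory:", os.getcwd())
    print("process id:", os.getpid())
    print("PATH entries:", len(os.environ.get("PATH", "").split(os.pathsep)))
    # os.uname() exists only on POSIX-conforming systems, so guard the call.
    if hasattr(os, "uname"):
        info = os.uname()
        print("system:", info.sysname, info.release)


describe_platform()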

One increasingly critical issue noted is the institutional management of the domain-name system. Not only have there been conflicts between VeriSign, the operator of the global top-level domains .com and .net, and ICANN, but national governments have also engaged in disputes with ICANN over the management of ccTLDs. Clearly, in an institutional sense, as Internet domain-name registration tends towards monopoly - it is theoretically possible to establish a competing "internet", but it would not be viable - a multilateral governing authority is the only viable, just and legitimate solution. In the short term, a United Nations specialized agency with open individual membership is recommended to "represent the needs of all the world's people" (Kofi Annan), rather than domination by a largely unelected body from advanced economies. Serious issues that such an authority will have to consider include the development of new gTLDs, the commercial sale of ccTLDs, and the specific ownership of gTLDs by the United States government (.gov, .mil) or EDUCAUSE (.edu). Specifically, the ccTLDs should not be sold, as they represent a collective good held in trust by nation-states, and nation-specific gTLDs must be transferred to country-specific equivalents (i.e., .gov.us, .mil.us, .edu.us). In the longer term, however, a new type of index to represent domain names must be developed that more accurately reflects the institutional location and authority of a particular domain name. The Internet version of "rent-seeking" must end, for the very same reasons that it is economically destructive and unjust in the real world.

[ VeriSign sues ICANN to restore Site Finder, Declan McCullagh, CNET News.com http://news.com.com/2100-1038-5165982.html Story last modified February 26, 2004, 5:19 PM PST Thirty-eight percent of domain-name registrations occur through ccTLDs. Michael Geist, Governments and Country-Code Top Level Domains: A Global Survey, Preliminary Report, December 2003 http://www.michaelgeist.ca/geistgovernmentcctlds.pdf United Nations ponders Net's future, Declan McCullagh, CNET News.com http://news.com.com/2100-1028-5179694.html March 26, 2004, 4:00 AM PST See also Online Forum on Internet Governance Launched, United Nations Press Release, PI/1560 http://www.un.org/News/Press/docs/2004/pi1560.doc.htm Global TLD Registries are available at: http://gnso.icann.org/gtld-registries/ The only institutions that may receive a .edu domain are post-secondary institutions that are recognized by the United States government's Department of Education's Nationally Recognized Accrediting Agencies. http://www.educause.edu/edudomain/policy.asp See: Lev Lafayette, Submission to Council of Ministers on the .tp Domain Name and the Official English National Name of "Timor Leste", June 2003 http://groups.yahoo.com/group/ittimor/message/618 Note that this recommendation did not include the sale of the ccTLD, unlike the situation with .tv. ]
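
To make the proposal concrete, the sketch below (Python, purely illustrative) reads a domain name as a hierarchy of labels and rewrites the nation-specific gTLDs discussed above as second-level domains under .us. The example domains and the rewriting rule are hypothetical and do not describe any actual registry policy.

# The gTLDs identified in the text as specific to United States institutions.
US_SPECIFIC_GTLDS = {"gov", "mil", "edu"}


def split_labels(domain: str):
    """Return domain labels from most to least significant (TLD first)."""
    return list(reversed(domain.lower().strip(".").split(".")))


def relocate_under_us(domain: str) -> str:
    """Rewrite a nation-specific gTLD as a second-level domain under .us."""
    labels = split_labels(domain)
    if labels and labels[0] in US_SPECIFIC_GTLDS:
        labels.insert(0, "us")  # .gov becomes .gov.us, and so on
    return ".".join(reversed(labels))


print(relocate_under_us("whitehouse.gov"))  # whitehouse.gov.us (hypothetical)
print(relocate_under_us("example.com"))     # example.com (unchanged)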

The institutional autonomy of education as a subsystem dedicated to open-source information will also have profound effects on the formation of public opinion. Existing criticism of mass-media distortion due to the influence of centralized capital or authority can be tempered by fully accessible academic publications and contributions to a rigorously peer-reviewed Internet public library. Such open resources will also enhance the prospect of "electronic democracy". Not only should the proven technical possibility of "electronic voting" be implemented throughout the political-administrative subsystem, but that subsystem must also open forums whereby representatives would be required to justify and defend their decisions to their constituents - a genuine "Internet town hall".
