
Net Compromise in Tunis

Both sides were able to walk away from last week’s summit debate on Internet governance claiming victory.

“We reject kings, presidents, and voting. We believe in rough consensus and running code.” So goes the informal motto of the Internet Engineering Task Force – the techies who devise the standards behind the Internet and who, in one form or another, have controlled how the network evolves since its inception. Yet this anarchic ethos seems increasingly unsustainable as the Internet spreads into more spheres of society and governments want more control. On November 19, at the United Nations’ World Summit on the Information Society in Tunisia, the nation-state roared back into cyberspace.

The major issue at the three-day Tunis meeting, which was supposed to focus on bringing more technology to the Third World, turned out to be who should control the underlying address system of the Internet. Most countries resent that the United States oversees the system, viewing it as yet another example of unilateral U.S. power. Like Augustine going to Carthage, where vice and temptation waited, the nations arrived in the modern-day city with an eye toward unseating the ruling power of this era.

The United States, on the other hand, was wary of an ambush and launched a massive international lobbying campaign in the weeks before the summit. In the words of a classified diplomatic cable from U.S. Secretary of State Condoleezza Rice and Secretary of Commerce Carlos Gutierrez to the British government, sent on November 7 and obtained by this reporter: “a new intergovernmental structure would most likely become an obstacle to global Internet access for all our citizens.”

But whether public access or the power of nation-states was first in mind, the main outcome of the summit was a 20-page document called the “Tunis Agenda for the Information Society” that all countries could use to trumpet victory. The United States hailed it because there was no change to the Internet Corporation for Assigned Names and Numbers (ICANN), the private, nonprofit organization set up by the U.S. Department of Commerce in 1998 to administer the day-to-day operation of the domain name system. Meanwhile, other governments returned to their capitals content that a process had formally begun to place Internet matters on a multilateral, intergovernmental footing.

In truth, both sides are right to claim a win. The agreement calls for the creation of an “Internet Governance Forum” to be established by the United Nations before mid-2006. It will not have any binding powers but will be a way to continue the dialogue that the UN summit began. Stakeholders other than governments – such as industry and so-called “civil society” groups that advocate special causes like free speech – will be a part of the process too. The forum will not be limited to discussing the names-and-addresses issue but will examine more mainstream matters involving cyberspace, such as spam and network security, that do not fit comfortably in existing intergovernmental organizations.

How is it that an agreement was reached after 20 months of bitter debate since the first round of the UN summit in Geneva in December 2003? Mainly, because it was expedient. Even before the 12,000 delegates and 50-plus heads of state arrived for the summit’s November 17 opening, it was apparent to every country that there was no way to settle the issue. So the best any government could obtain was a way to continue the discussions in a forum all felt comfortable with.

For other countries, that meant “not ICANN.” For the United States, which stood almost entirely alone in its position on retaining oversight of ICANN, it meant any venue that had no powers and in which ICANN would not be the only issue on the table.

In the end, all sides went home with something. But the victory may ultimately prove a Carthaginian peace for the Internet itself, which risks getting scorched in the process.

Consider the negotiations themselves: the final sticking point was a single word. The nations of the world could not agree on whether a new way to manage the Internet’s underlying technology might be created “if” justified (as the United States wanted) or “when” justified (as Iran urged, supported by Saudi Arabia). After a 30-minute diplomatic debate, the ever-pragmatic British broke the impasse – and so Point 61 of the 122-point document uses the term “where” justified.

The distinction makes no practical difference, but the debate was deeply revealing. It showed the rigidity of the 19th-century nation-state system colliding with the 21st-century ethos of the amorphous, ever-evolving Internet. It was just the sort of thing that the United States wanted to avoid when it created ICANN.

Indeed, American control over the Internet’s addressing system has been useful, because the country has taken a largely hands-off approach that other governments might not follow, and because it has ensured that the Internet’s structure reflects American values – not so much those of G.I. Joe as of Woodstock-era hippie academics.

That is not to say that some sharing of power is not called for – it is. Under the current domain name system, countries do not have complete sovereignty over their two-letter country codes (like .fr for France). The United States acknowledges that these suffixes should belong to the respective countries, and the agreement reached last week in Tunis insists on it. Implementing this technically will take time, but it will eventually happen.

The domain name system is, in fact, two decades old this year – and has not evolved much in that time compared to other Internet technologies. Meanwhile, much traffic, from instant messaging to peer-to-peer networks, travels outside the ICANN-sanctioned domain name system – underscoring that the Internet is not immutable and that one day all the political bickering may not mean much.
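
For a sense of what that system actually does, here is a minimal sketch of an ordinary lookup, assuming the hypothetical host names below are registered: resolving a name under a country-code suffix such as .fr or .tn walks the same root-and-registry hierarchy that ICANN coordinates, which is why handing the country codes back to their countries is a technical exercise as much as a political one.

```python
import socket

# Minimal sketch: resolve names under country-code suffixes with an
# ordinary DNS lookup. The host names are hypothetical examples; any
# name that is not actually registered will simply fail to resolve.
for name in ("www.example.fr", "www.example.tn"):
    try:
        address = socket.gethostbyname(name)  # root -> ccTLD registry -> host
        print(f"{name} -> {address}")
    except socket.gaierror as err:
        print(f"{name} did not resolve: {err}")
```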

Yet legacy technologies can be awfully persistent. The British Admiralty used Morse code as a back-up for ship communications as late as 2000. Technologies still in widespread use are difficult to unseat, regardless of kings and presidents – or even rough consensus and running code.

Kenneth Neil Cukier covers technology and regulatory issues for The Economist in London.

The photo accompanying this article on the TechnologyReview.com home page shows President Zine El Abidine Ben Ali of Tunisia addressing delegates at the 2005 World Summit on the Information Society. It was taken by R. Guerra and is reproduced under a Creative Commons license.
