8 ideas that will revolutionize the 21st century - Ben Hammersley

Link: http://lesblogs.vpod.tv/2005/12/15h4516h45_8_id.html

Which, to put the meat before the fish, are:

  • Information wants to be free

  • Zero distance

  • Mass amateurisation

  • More is much more

  • True names

  • Viral behaviour

  • Everything is personal

  • Ubiquitous computing

His early historical examples of technologies which met a limit - horses and swords - were portrayed as self-limiting. Each Pattern sabre was better than the previous one, and the last was arguably the finest cavalry sword made. You don't see them much in wall displays, as they're not very pretty. You could make a better sword still, but someone invented trench warfare and the machine gun. So swords are now only decoration or sporting equipment, where their evolution is intentionally limited by the Fédération Internationale d'Escrime. Similarly, you could engineer better horses by genetic enhancement and composite augmentation of the skeleton, but we have cars and planes for transport, and what you can do to race horses is limited by the relevant authorities.

The history of these technologies is not that they hit an internal limit, but that they were made obsolete by other technologies, became items of nostalgic and leisure interest, and had their evolution consciously limited by aesthetic concerns or sportsmanship.

Comparing these technologies to 'information technology', Hammersley put up a log/lin exponential graph showing Moore's law, but then admitted to not knowing what 'exponential' actually means.
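For what it's worth, 'exponential' in the Moore's Law sense means growth by a constant factor over each fixed interval - roughly a doubling every two years for transistor counts - which is why it plots as a straight line on a log/lin graph. A minimal sketch of that, with an illustrative starting figure rather than a real transistor count:

```python
def transistors(years_elapsed, start=2300, doubling_period=2.0):
    """Count after `years_elapsed` years, doubling every `doubling_period` years.

    The starting figure is illustrative, not real industry data.
    """
    return start * 2 ** (years_elapsed / doubling_period)

# Each doubling period multiplies the count by 2 rather than adding a
# fixed amount - that is the difference from linear growth, and why the
# curve is a straight line on a logarithmic axis.
for years in (0, 2, 4, 20):
    print(years, round(transistors(years)))
```

Ten doubling periods gives a factor of 2^10 = 1024, which is the point the log/lin graph makes at a glance.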

You cannot compare swords to 'information technology'. Even the Wikipedia Moore's Law illustration shows five different technologies just in the computational hardware domain - a better comparison would be swords to valves, and 'war-fighting technology' to IT. People still use valves for niche applications - guitar amps, some retro kit - but the mass market moved on to a different technology. Whereas we dropped as much explosive in the first three days of Gulf War I as in World War 2, so war-fighting is still on a pretty high up-curve (though I haven't enough meaningful data points to say whether it's exponential, or even whether it matters if it is).

What does differentiate information technologies from other technologies, as Hammersley points out, is that they augment the creation of technologies. Improvements in the sword relied on improvements in metallurgy, which relied in turn on the rapid industrialisation of the late 19th century. Factories and foundries make the material for better factories and foundries; IT, and particularly media for open communication between otherwise disjoint groups, allows intellectual bootstrapping.

On the octet above, Hammersley stops his explanation after the first three.

He only illustrates the third - mass amateurisation - with a direct example, that of comparing his early experiments with cine film and the current consumer-level video camera technology. Nowadays, you can get good at video with lower personal cost. That has more to do with the market than democratising the technology - with a cine camera (or a tin-can), you own the technology, not Apple or Sony.

When considering the first, he points out that governments push back against it. This may be true, but technology streams churn faster than legislation. Censorship at a level that democracies will tolerate has to target a particular technology, and so has the effect of bringing forward that technology's obsolescence.

Zero distance matters because specialist communities can form, which then mature into thought shops as the technologies they promote become commonplace - the evolution of the xml-dev mailing list is a case in point.

When talking about blogging, Hammersley says 'Blogging is all of these things'. Blogging, podcasting and videocasting are the current technologies that illustrate them. Blogging seems to have out-competed BBSes, which out-competed email lists, for certain modes of discussion. Podcasting and video don't seem to promote discussion (this is a written reply to a video, not me on camera), so in my experience they have less of an effect on viral behaviour. That is different for non-techies - it's not unusual for something on myTube to have a video response - but here we really do have to differentiate between the creation of technologies, their adoption, and their consumerisation.

Certain video blogs - such as the videocasts on Jon's Radio - may have an accelerating effect on technology creation, but the vast majority don't. You also cannot absorb a technology any quicker - watching Jon demonstrate something doesn't mean you don't have to learn how to use it.

One group of people - the developers of Microsoft's Outlook Web Access - invented XMLHttpRequest, which allows you to develop richer web applications. Seven years later we have Ajax toolkits, many professional web application creators use it, and there are now companies offering simple Web2.0 hosting - moving the technology from adoption by specialists into amateurisation. Personally, I'm bored already.

I'm not aware of non-specialists who create technology (though not all specialists are professionals). Exponential growth in videocasts may not affect the specialists - you require higher information density, and you have to learn your stuff. Being able to put a video blog together has nothing to do with being able to squeeze more transistors onto a chip, or even with being able to create a video-blogging service.

The scaling effect of ubiquitous communication will only accelerate technological change up to the point where those people creating technology are fully in communication with everyone they need to be. I'm already limited not by my ability to find the information I need to create, or people to talk with, but by my ability to absorb the information. I don't believe IT will provide much more acceleration in the creation phase. What viral communication does do is lower the time from adoption to consumerisation.

The 19th-century engineers who pioneered the industrial revolution also believed that they 'were the flat-mate[s] of Leonardo da Vinci'; everyone is a child of their own renaissance. Since the early 20th century, it has been impossible for one individual to understand all that humans know of maths or of physics. I'm sure I understood a higher percentage of computing technology when I was a teenager than I do now - I've made a conscious decision to drop anything to do with low-level graphics hardware programming, which I did my MSc project in 14 years ago.

Our renaissance has a higher churn - Web2.0 will have a much shorter lifespan than the small-sword.

I can see two outcomes: either IT really is different, and we have a future in which more and more people rely on technologies further and further from the understanding of the average individual; or the history of the horse and sword will repeat itself, and the movement of technologies from professional into amateur use will limit their evolution to what is democratically acceptable.