Being a pro-ish singularitarian, I like to believe, as Ray Kurzweil and others have written, that we'll all be part robot at some point in the not-too-distant future. There is a following, with some very high-powered executives, futurists and scientists at its core. Bill Joy's Wired article from 2000 ("Why the Future Doesn't Need Us"), for example, conveys a genuine sense of fear and a belief that we need to take action against where this is heading. Bill Joy is not your average crazy person: he's a co-founder of Sun Microsystems, the author of the vi text editor, and, according to Wikipedia:

As a UC Berkeley grad student, Bill worked for Fabry's Computer Systems Research Group (CSRG), managing the BSD support and rollout, where many claim he was largely responsible for managing the authorship of BSD UNIX, from which sprang many modern forms of UNIX, including FreeBSD, NetBSD, and OpenBSD. Apple Computer has based much of the Mac OS X kernel and OS Services on BSD technology.

So what does it mean? Well, I'm sure there's plenty of information out there from people far more qualified to talk about it than me, so I won't go into the details of what I believe. But after watching a couple of seminars from the Computer History Museum (search the term on YouTube), I see a lot of things we have today that were pretty much working in the 60s: things we praise ourselves for and consider new and innovative today. Most of them are not new ideas, but refinements and technological advances, things that have finally become reality for the public.

So what, if any, is the significance of, say, the mobile phone, the iPod, or the personal computer? No doubt these and many other things we have come to take for granted are significant, but are they really enabling technologies? They have a huge social, commercial and environmental impact, changing the way we think about and view information, money and entertainment. We have been becoming a more connected society for quite some time, to the point where even regular members of the public (as opposed to the military, science and education), the majority in fact, are connected in some way. But will consumers actually see the singularity, or will we only witness it after the fact? In the same way that only the super rich will become space tourists in the near term, perhaps only the super rich will be able to afford to become singular part-robot super-beings, and the iCulture will be left behind.

I suppose this was in part prompted by an article about a 'portable web server, right on your iPhone', which drew comments of wow and general elation, as if this were the killer app that would change the world. But what would it change? Isn't it just a sign that we really don't understand technology en masse? Servers of various kinds are already running the back end of all sorts of services. OS X is a BSD-derived operating system, and as such uses the client/server model for many of its services in ways I'm not qualified to comment on. If the iPhone runs a cut-down OS X, then a web server on a phone is really nothing more exciting than the fact that the thing runs software at all: it's implicit in its nature! The OS already uses a client/server model in many places, so it's really nothing new. Yet still the public gets excited. We are sheep and understand nothing. OK, so it's a WEB server: what is its purpose? No signal or no battery = no website?
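To put the excitement in perspective, here's a rough sketch of what a bare-bones web server amounts to in a few lines of Python, using nothing but the standard library. This is my own illustration, not the app in question, and nothing about it is phone-specific; the client/server plumbing is already sitting there in the OS and its libraries.

```python
# A minimal HTTP "server" -- roughly the kind of thing the hyped app boils down to.
# Nothing here is phone-specific; the sockets and HTTP plumbing come with the OS
# and the standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer every request with the same static page.
        body = b"<html><body>Hello from a pocket web server.</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Listen on all interfaces, port 8080. No signal or no battery = no website.
    HTTPServer(("0.0.0.0", 8080), HelloHandler).serve_forever()
```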

I can see I leave myself wide open to criticism here: an unresearched and uncited comment making sweeping statements about the vast human population, insinuating that we're a bunch of idiots. A great quote springs to mind:

A person is intelligent, but people are stupid.

It’s pretty much why marketing works in my opinion. Hyperbole.

In relation to that, I was interested to read a review of a new Canon camera, the EOS 50D. If my memory serves me well (it's quite a long review), the sensor went from 10 megapixels in the 40D to 15 megapixels in its replacement, the 50D. Quite a jump, you would think, or is it? Ignoring the other improvements in the camera itself, the reviewers very specifically compare the two on image quality. It turns out that because the sensor is so much denser, each pixel gathers less light and the images get noisier; read the review, it's quite interesting. The point is, the sensor with more megapixels is actually no improvement on the older one. Of course Canon won't tell you this, and possibly the reviewers are wrong, but if they're right, it shows we are caught in a marketing trap. More = better, right?
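A quick back-of-the-envelope calculation makes the trade-off visible. These are my own rough numbers, not the review's, assuming the usual Canon APS-C sensor size of roughly 22.2 x 14.8 mm and the nominal 10 and 15 megapixel counts:

```python
# Rough photosite-area comparison on an APS-C sensor (~22.2 x 14.8 mm).
# Figures are approximate and for illustration only.
SENSOR_W_MM, SENSOR_H_MM = 22.2, 14.8
sensor_area_um2 = (SENSOR_W_MM * 1000) * (SENSOR_H_MM * 1000)  # square microns

for name, pixels in [("40D, ~10 MP", 10.1e6), ("50D, ~15 MP", 15.1e6)]:
    site_area = sensor_area_um2 / pixels   # area available per photosite
    pitch = site_area ** 0.5                # approximate pixel pitch
    print(f"{name}: ~{site_area:.1f} square microns per photosite, ~{pitch:.1f} micron pitch")

# The 15 MP sensor's photosites end up with roughly two-thirds the area of the
# 10 MP sensor's, so each one collects less light, which is where the extra
# per-pixel noise the reviewers describe comes from.
```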

Similar to the computer specs we drool over and compare endlessly, what is the enabler here? At what point do we have enough power to do what we want to do, so that anything beyond it is wasted? It's a complicated marketing issue again.

We are open to new threats because of this reliance on, and acceptance of, 'new technology'. Take viruses, or say a denial-of-service attack: companies can disappear off the web. If the infrastructure goes down, chaos ensues, and if a company's core income is web based, the company can die. Technology has rendered us fragile. The economy is fragile anyway, so perhaps living on the edge isn't anything new, but it's a double-edged sword these days.

Socially we feel empowered by it all, but really it's just the size and price that have shrunk, and it's the ubiquity of the technology, now that it's affordable, that makes us feel the difference. Are these really new inventions, or just old inventions tarted up and put out to the mass market? Not that there's anything wrong with that..

I recently saw an old demo of a 'webcam' kind of system from 1962: black and white, but real time. If, 45 years on, we are living in the future with machines that have built-in webcams, I feel somewhat disappointed that that's all we can do and still feel proud of ourselves. Of course the infrastructure that makes this work on the move, and makes the machines portable, is progress, but our machines, orders of magnitude more powerful, still have slow, unreliable interfaces (at least the few I own do). What have we really spent all that power on? Or is it just that my four-year-old dual G5 machine with 4.5GB of RAM can't handle a few tabs in a web browser and a few programs open in the background? What am I doing wrong? It was touted as able to handle all kinds of complicated things when I bought it; how is it not able to now? I suppose that as the processor has been phased out, the software isn't optimized for it to the same degree..

This makes me want to switch to a low-powered machine running a light flavor of Linux, one where, if I can work out how, I can make sure the UI is the priority and the programs are secondary, since the UI is my user experience. Perhaps just updating to a newer machine would solve it, but how is it that the last four years is enough to kill my machine when forty years of progress has apparently only got us this far!? OK, I'm laboring the point.

The real point this post was meant to make is that the exponential growth in technology is a slippery concept, underlying the equally slippery singularity. In terms of transistors per dollar, or operations per second, we are progressing exponentially, as the charts seem to show, but what are we doing with this new-found power? Flashier graphics, higher resolution for images and video: basically stuff we could already do, just done better.
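Just to make 'exponential' concrete, here is a toy calculation using the oft-quoted Moore's law figure of a doubling roughly every two years. It's an illustration of the compounding, not a measurement of any real chip line:

```python
# Moore's-law-style compounding: double roughly every two years (the textbook
# figure; real progress varies by metric and era, so treat this as illustration).
DOUBLING_YEARS = 2.0

for years in (10, 20, 45):
    factor = 2 ** (years / DOUBLING_YEARS)
    print(f"After {years} years: ~{factor:,.0f}x the transistors per dollar")

# 45 years at that rate works out to a factor of several million, yet most of
# that budget seems to go on doing the same things with flashier graphics and
# higher resolutions rather than on fundamentally new capabilities.
```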

By 'we' I mean 'the consumers', so I suppose this shouldn't be a surprise. In the military and HPC science markets, perhaps the progress is more tangible but less understandable to the public. Maybe I should make more of an effort to go to 2600 meetings every month, join an AI hacking group or a robotics group, or find some people to work with who are positive about progress, whether it's towards a singularity or not.

 
