Goodbye Google Reader

So Google Reader is going. It’s a shame, I’ve had so much use out of it, but in the last couple of years I’ve almost completely stopped. I just use Twitter now. To me, part of the art of Twitter is to follow enough of the right people that you probably won’t miss a story that would interest you. Groupthink determines whether like minds will bubble the important stories to my attention. It does save me a lot of time reading things I don’t want to read.

The other thing to note with Google Reader’s demise is how they are going about closing it. They have given three months’ notice, and they link to the tool they have had set up for ages that allows you to remove all your data cleanly, in a format that you can use elsewhere (for me, mainly the OPML file of all my subscriptions, should I ever set up a feed reader again). I can remember other useful services closing, including Google ones in years gone by, where this data transporting was somewhere between hard and impossible. They definitely deserve a little commendation for doing that.

2014 post update

Digg released a nice clone of Google Reader, branded as Digg Reader. If you want to carry on as before, use that!

Bowie on the Internet

Today Jack Schofield tweeted an old article from the Guardian in 1999, Tomorrow’s Rock ‘n’ Roll, where David Bowie talks to Emma Brockes about his recently-discovered love for the internet. It’s a fascinating document of the time for me, a few months before I started working “in the internet” (for want of a better description) and at a point where I was starting to create my own sites more, and think more deeply about what could be done online. I think it’s worth looking a little closer at a few lines from the interview, as there are some interesting insights into how Bowie saw things developing at the time.

Something that is very familiar to me is that he started to learn more about the internet when he created his own site. Just as early home computer gamers like myself learnt about programming by trying (and sometimes succeeding) to create our own games, and by typing in code ourselves, I and many others learnt a lot from creating our own sites. There is a bit of a disconnect now with services like Tumblr, Facebook, Twitter et al, where the creation of content is almost completely separate from the code underneath it. Such a disconnect of course brings them to a much wider audience, but often at the expense of a deeper knowledge of how things work. However, when he talks of “The communication between me and my Web audience has become more intimate than it’s ever been”, it really sounds a lot like the positive side of the experiences many creators have had with Twitter. Twitter isn’t the invention here; I simply think it’s made that communication even easier, and open to many more than ever.

What really caught my attention was his understanding then of how the internet could affect music. He talks of how he “would like to see record companies changing their delivery systems so that they could send MP3s (an instant free download) straight to the record stores via an ISDN cable. The stores could then burn the CD for them on site. It would reduce the packaging costs and they would make a fortune out of it”. It’s easy to have a bit of a giggle at this, but I think he actually got it in the main. The point to him I think was that the MP3s were already approaching instant, and also that in some ways, they were valueless, 1s and 0s. As he understood it, the difficulty and the aim for the record companies instead was to find a way of making these “free” files have a value, add packaging, add value. It almost seems weird to him that they could charge for the download alone.

Where he’s absolutely spot on is the remix culture of the web. I’d never really thought of him as part of it, but his personas and reinventions clearly were remix culture, and even his music at times, which he freely admits in the article. And he just gets the issues that came with that, and with the increased ability to simply steal content:

But on the issue of straightforward piracy, I tend to go with the flow. I am not indifferent to it, but I look on it as a lost cause. The way our society constantly breaks down parameters has led to the disintegration of intellectual property. Whether that’s a good thing or a bad thing is to an extent irrelevant; without a doubt things in the future are going to be different.

He wasn’t the first person to say these things by any stretch, but it is interesting in retrospect to see someone who mainly got it, as opposed to, say, a Metallica, or almost every record company, all of whom pretty much had to be dragged kicking and screaming into the next century (again, as he predicted they’d have to be).

My Dream Application

Write, publish and read in one place. Writing. Twitter. Email. Facebook. Blogging. MSN. IRC. etc

I am several personas. I want to write things. I then want to send them to selected audiences.

I am a reader. I want to read some or all of what my selected audiences write.

I am a person. I want to talk to people.

The things that are stopping me from doing all of these are protocols/APIs/rules/T&Cs.

I’d like one app that lets me do all these things. Please.


Tonight I have found that I have lost a scanner. Well, it is definitely in the house, but I cannot see it for all the other pieces of tech.

It is this that has led me to the realisation that I have too much tech. New Tech. Current Tech. Tech nearing the end of its life. Dead Tech. Retro Tech. Last Gen Tech. Old Media Formats. Remnants. Items copied to digital, and stored, not removed. Things of affectation. Ideas that didn’t work out. Things that might work in future. Prospects, Hopes, Dreams, Dead ideas. All in tech form. All somewhere in this house.

So the obvious next step is to remove some of them. And instantly that fills me with dread. What if I might need it? What if I might want it? What if at 3am on Christmas morning 2031 I suddenly realise I’d quite like to play that old WonderSwan game? What if I want to connect this to that? Or that to the other? What if that screw fits something, and when I finally need a screw I’d got rid of it?

It is of course a hoarding instinct. And I recognise its dangers, its inherent issues. For one, I can’t find my scanner. But look, whilst I still have devices that have SCART slots, I will need SCART leads. Okay I don’t use them now, of course, I use HDMI or component, as they also have those slots. But I MIGHT need them for something else. Something old. That I’ve bought second-hand, and want to use.

Alright, I’ll admit the Amstrad E-m@iler wasn’t necessary. TEN POUNDS in the charity shop though. It’s not got a cordless phone as part of it, so I can’t plug it into the landline. I like it though, and it has to be plugged in, otherwise it wouldn’t be serving a purpose.

I’m going to have a sort-out this weekend. Or maybe next.

All Watched Over By Machines Of Loving Grace Episode Two

Having watched the first episode of Adam Curtis’s latest documentary, I wrote how I wasn’t quite at ease with some of his proofs of how technology had influenced society. I was keen to see how he developed his ideas, which rolled on into a new direction, of “How the idea of the ecosystem was invented, how it inspired us, and how it wasn’t even true”. This exploration was preceded by the statement that:

In the mass democracies of the west, a new ideology has risen up. We have come to believe that the old hierarchies of power can be replaced by self-organising networks.

To investigate this and some of the other ideas from the episode further, I need first to go off on a Curtisian diversion.

In the 1950s the US military started to investigate how to communicate in a post-nuclear-war world. This work was led by Paul Baran of the RAND Corporation, creators of the doctrine of nuclear deterrence via mutually assured destruction, and subsequently spoofed as the Bland Corporation in Dr. Strangelove. To be clear, given the tone of the first episode, there is no connection to Ayn Rand.

Their conclusion was that you needed a system that could send packets of information over a network. The packets would search for the best route, and would be reassembled at their destination into a whole message. The aim of this isn’t stability, or to balance the network, or for all nodes of the network to be of equal value. It is instead to have a functioning system for message delivery.

The realisation of this vision was ARPANET, a network for the exchange of information between military computers, and the forerunner of the internet. ARPA achieved this in part through the funding of a group led by Douglas Engelbart.

Engelbart’s vision

Engelbart was a graduate in Electrical Engineering, and following his degree he laid out a set of career goals he wished to follow. These became his bootstrapping strategy, and he refined them into a set of principles that his laboratory’s work would follow.

  • Our world is a complex place with urgent problems of a global scale.
  • The rate, scale, and complex nature of change is unprecedented and beyond the capability of any one person, organisation, or even nation to comprehend and respond to.
  • Challenges of an exponential scale require an evolutionary coping strategy of a commensurate scale at a cooperative cross-disciplinary, international, cross-cultural level.
  • We need a new, co-evolutionary environment capable of handling simultaneous complex social, technical, and economic changes at an appropriate rate and scale.
  • The grand challenge is to boost the collective IQ of organisations and of society. A successful effort brings about an improved capacity for addressing any other grand challenge.
  • The improvements gained and applied in their own pursuit will accelerate the improvement of collective IQ. This is a bootstrapping strategy.
  • Those organisations, communities, institutions, and nations that successfully bootstrap their collective IQ will achieve the highest levels of performance and success.

His team’s work at SRI for ARPA produced early iterations of the mouse, hypertext links, tools for online collaboration and precursors to what became the GUI. Engelbart himself was granted a patent on the computer mouse in 1970. Adam Curtis showed a clip of his demonstration of several of these ideas from what is now known as “The Mother of All Demos”.

Doug Engelbart 1968 Demonstration from Nathan Garrett on Vimeo.


This is the whole demonstration, for a couple of quick highlights go to 10:00 to see him editing a shopping list, and to 26:00 for an explanation of the input systems he’s using, including his mouse.

Engelbart saw computers as a means for sharing and collaboration towards the greater good, but there is no mention of equality in his vision. Simply that man could improve his world through collaboration. Similarly, ARPANET didn’t work on the principle that all nodes should share and contribute equally, indeed, a computer network built on these principles would quickly run into bottlenecks. Instead the aim is to simply deliver in an effective manner.

Adam Curtis suggests that it was the ideas of people like Engelbart, Jay Forrester (creator of the Early Warning Network in the 1950s) and Buckminster Fuller (inventor of the Geodesic dome used to house the early warning network), along with Howard and Eugene Odum’s flawed principle of Ecosystems (based on Forrester’s Network Theory) that influenced how the hippie communes of the late 1960s were organised. Curtis explains some of this in a recent article in the Guardian.

His example was of the Synergia Commune, whose philosophies were specifically based on the theories of Buckminster Fuller, living in homes styled on geodesic domes and following his idea that the solidity of structures made of equal nodes could be reflected in a human society. However, this was true of only one of the communes shown. The hippie culture itself drew on an array of influences, from mysticism and alternative philosophies to sheer hedonism. One of the acknowledged major influences on the culture as a whole was Dr Timothy Leary, who advocated simply to “Turn on, tune in, drop out”. Again, this isn’t a principle of equilibrium and balance. In terms of the influences on the communes themselves, one can also look back generations to farming cooperatives, the kibbutz, socialism and communism, all more general and far removed from Fuller’s theories.

The final set of examples depicted were the people’s uprisings of the past ten years, which have often been characterised in the media as Internet revolutions, or in recent years as Twitter and Facebook revolutions. Again the social equality of the revolutions is shown as both the underpinning of their methodology and the crux of their failure. What most of them have in common, though, is that the initial focus has often come from a wronged opposition party. The Rose and Orange Revolutions of Georgia and the Ukraine both occurred following disputed elections, and the initial protests were organised by those wronged parties. In Iran the protests against the government similarly followed the disputed 2009 elections. And in Egypt, some observers credited years of organised protest by trade unions against the government as a major contributing factor to its eventual overthrow. As for the use of the internet in Eastern Europe, and later the use of social media in the Arab Spring for Iran, Tunisia and Egypt, it is a facile argument that they “won” the day in any of these cases. However, they certainly contributed to the organisation of the protests in every case, taking advantage of the ruling parties’ lack of knowledge of those systems to route around the more normal paths of comment and organisation that were being barred by oppressive regimes. Almost like ARPANET, they were able to find a route to deliver their message in the end.

It seems likely now that any oppressive government worth its salt will look to monitor, hack, and disable the commonly used social networks like Twitter and Facebook in future to prevent protest. It is equally likely that opposition groups will just find alternatives to route around the blockages in the system.

Adam Curtis rightly identifies the flaw with the theory of ecosystems explained early in this episode: data was flawed, misrepresented and simplified until it met the theory the Odums wanted to prove. It feels, though, that Curtis in the first two episodes has fallen into the same trap, misrepresenting his own evidence to “prove” a neatly defined argument, when in fact the evidence shows that broader and more pragmatic systems are in place.

All Watched Over By Machines Of Loving Grace Episode One

All Watched Over By Machines Of Loving Grace is a three part documentary on BBC 2 by Adam Curtis, who made the fantastic The Power of Nightmares, an exploration of how the Islamic Fundamentalists and the American Hawks rose from the same point in history, each with complementary aims, and a fundamental need of each other. This new series sets out to portray how we as humans have not been empowered by technology, but have instead been somewhat enslaved. A fascinating premise for me, so I was rather excited by the prospect of the series.

After viewing the first episode, I felt slightly hollow inside. I enjoyed it, but some of it didn’t ring quite true for me. Tim Maughan set me off a little further when he mentioned his doubt about the view of the technologists of the 90s all being fans of Ayn Rand, in particular given Bill Gates’s spectacular altruism in simply giving away vast amounts of his fortune. So I decided to watch it again and dig a little further, to see if I could figure out what was bugging me about it.

This episode sets out the idea that computers were proclaimed to be the salvation of the financial markets in the 1990s, providing logic and automation that could balance out the greed and errors of humans, and thus produce stable, productive economies that would boom constantly. This global market would cross the boundaries of governments and their influence, and was said to have risen out of the writings of Ayn Rand, with Curtis claiming that her philosophy of Objectivism was what drove the technological inventions that arose at the start of that decade.

As a quick aside here, I’ve never read an Ayn Rand novel, never felt a compulsion to, so the explanation of the plot of Atlas Shrugged was a bit of a revelation for me. It seemed instantly that the plot of Watchmen in part re-appropriates it for its own purposes, namely the concept of a community of creatives and intellectuals leaving society, waiting for its self-destruction before building a new world. Perhaps even the line “Who Watches the Watchmen?” is a reflection of the “Who is John Galt?” refrain.

In the documentary, it was particularly the brief reflections on important moments in the progress of technology that interested me. Two specific events were Loren Carpenter’s experiment with a giant game of collaborative Pong, and the publication of an essay by Humdog.

Carpenter is described as a leading computer engineer who in 1991 invited hundreds of people to a giant shed, to take part in an unexplained experiment. He talks briefly of not telling them anything about what he was doing, simply seating them all in two groups, each person with a narrow paddle with a green side and a red side, and a giant screen in front of them. Once they had figured out that there was a giant camera focused on them all, and that they could see the different sides of their paddles picked up by the camera, he started them off with a giant game of Pong. A single green paddle would push one side of the shed’s Pong bat one way, a single red the other; all of them together would be averaged out to determine how far up or down their bat would move. This was said to show the flock mentality of the network, subconsciously managing itself towards efficiency, all equal in their say.

Now, I managed to find a description of this very event by Wired magazine’s Kevin Kelly (who appears very briefly in the documentary as a talking head). He describes how it actually took place at a 1991 conference of computer graphics experts in a conference hall in Las Vegas. Carpenter himself is co-founder and chief scientist of Pixar Animation Studios, and rather than running an experiment on a test group of random individuals, he was actually demonstrating to his peers. For me this puts it in a different light: this was, at that point in time, pretty much a roomful of the people most likely to understand what was going on, to figure out how things worked, and to want to collaborate to see what he had achieved. Indeed, at the end of the demonstration as described by Kelly, they are flying and landing a plane in a flight simulator. I don’t think it’s unfair to suggest that Curtis’s portrayal is a mild distortion of the event.

As for the Humdog essay, this was written by a user named Humdog in the early 1990s, describing her dissatisfaction with cyberspace itself. She points out that “Cyberspace is not some island of the blessed”, that she had “commodified my internal thoughts which were then sold on as entertainments”, and that we were all “getting lost in the spectacle”. Again, though, I feel the purpose is being slightly misrepresented. The essay is essentially a slightly trolly farewell to the internet, mainly arising from a couple of incidents on a bulletin board called The WELL.

The WELL was an early BBS community that in part rose out of the Californian hippies of the 1960s, and is best described in Chapter 4 of Bruce Sterling’s book The Hacker Crackdown. Humdog was angry at these two incidents, and at what she felt was the rising commercialisation of The WELL, rather than, as Curtis portrays, at cyberspace itself. Indeed, as her LinkedIn profile shows, she soon got over these fears and in the mid to late 90s worked for Seagate and Sun as a systems analyst in web applications.

Finally, coming back to my initial unease, there is the suggestion that the computer industry of the time in California was bursting at the seams with Randian utopians. This is summed up as the Californian Ideology, and I’ve found a good essay on it by the Hypermedia Research Centre. It confirms my feeling that the reality was vastly more complex than this, constantly contradictory, and certainly not limited to one viewpoint.

Whilst each of these incidents is perhaps minor, and I’m not claiming to have found any smoking guns that disprove Curtis’s polemic, they were pretty much the basis of the tech side of the first episode, which to me suggests that he has either been misunderstanding or misrepresenting them, sanding off the edges to make them fit his truth. I’m looking forward to the next episode, to see if I am being way too harsh, and whether there is greater and more convincing evidence used. And also to see what is, absolutely, visually stunning television.

Setting up a robust development environment for WordPress

For a while now I’ve been wanting to create a better development environment for working on my WordPress blogs, and other PHP projects too. I thought that this would itself be a good project to blog about as I set it up: explaining and justifying what I’m doing, and seeing what other people are doing in similar areas for comparison. It might also be helpful to people wanting to set up such an environment for themselves, or who want to set up an element of it. I’m going to break it down into several posts, which I’ll gradually link to from this post.

I’ll be setting up a new blog, which with the new WordPress 3.0 I’ll be able to make a multiple-blog installation, able to run whole new blogs underneath this one. I’m also hoping to use a plugin to help me set this up with different domains for some of these blogs, so for instance there will be a blog with a different look and feel, but only one install of WordPress and its plugins to upgrade and develop. Finally, I’m hoping to then move this blog itself from WordPress to Drupal, as that’s a CMS I need to learn more about too.

So what am I looking for in this development environment?


A test site with version control

I want to be able to test changes to the site without affecting the live site. If something breaks on the live site, I’d like to be able to revert to the previous version easily, get the site back up and running, and fix the problem in the test environment. So I need a test version of the site, and I need version control, allowing me to make releases of the site and access previous versions.


Mac-based

This is my own personal preference: I work on Macs for both my day job and my own stuff, so it makes sense. However, all the tools I’ll use will either be available on Windows and Linux, or will have very close equivalents. Much of what I’m going to do will be within an IDE (Integrated Development Environment) available for all platforms, so my setup will be transferable should I need it to be (or should others wish to try it too).


A few tools

I like using only a few tools for developing. Where possible, I want to do all the work within the IDE, and then test in a browser. For instance I currently use Cyberduck for FTP, but I’d rather do this within the IDE (and I’ve got this figured out now).

So, in order that I can do all this, I’ve selected a few tools.


Netbeans

I’ve used Netbeans for a year or two now as my main IDE. It is at heart a Java IDE, but its support for other languages has improved quite a bit, and the bulk of what I’ll be working on (PHP, HTML and CSS) is well supported, with code completion and syntax highlighting for all, and the promise of good testing support for PHP in particular (this is something I want to delve into more as I go along). As a downside it can be a bit memory-hungry, but I’ve found it more stable than Eclipse, which I used before.


Subversion

Subversion has been my version control system for a few years now at work, so this is a simple choice, as I know it best. I’ve not had much experience of CVS, and there are good guides and advice available for Subversion, so as someone new to setting up their own versioning system, it makes sense to stick with what I know.


MAMP

MAMP stands for Mac, Apache, MySQL, PHP: a server stack that will allow me to run a whole site on my local machine and test it easily. It’s pretty easy to set up, and it also allows me to configure a more complex testing environment later on. I could have chosen to have a “live” test site on my webhost instead, and this may be a more suitable option for some people, obviously where there might be more than one potential user of the site itself.

So this is the basic setup I want (and have set up already, but will explain it in more detail as I get it working better). I want to use this to have a process for development where I can:

  • Work and test on my local machine.
  • Make a release version.
  • Save this in my versioning system.
  • Send to my live server.

The next post will be about setting up Subversion on the Mac. I’d be interested to hear other people’s experiences of setting up similar environments, and suggestions for different approaches to what I’m planning.

WordPress 3.0 Custom Post Headers

WordPress 3.0 is out, and is rather nice. I’ve been using the release candidate for a little while on another blog with no hassles at all, and so I was happy to upgrade this site as soon as the full version was out. The easiest way to see what is in the new version is to watch this video:

For now, I’ve moved to the new default theme, Twenty Ten. It is a nice clean theme, and has a couple of very handy features. You can add your own custom header for the blog as a whole, but the nicer new feature is the ability to define a custom header for each blog post you make. Within WordPress 3.0 this is referred to as a Featured Image.

It’s nice and simple to do: WordPress will try to convert any image you choose into a header, but for the most control over how it looks, you need an image that is 940 × 198 pixels. When you write a post, simply click on the Set featured image link at the bottom of the right-hand column. After uploading the image, you get the option to insert it into the post. Instead of doing this, look to the bottom of the dialogue box:


Click on “Set as Featured Image”, save and you are done. You can see a custom image for this post’s header as an example.
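For anyone curious how the theme does this, the feature rests on a couple of theme API calls in functions.php. The sketch below is a simplified illustration rather than Twenty Ten’s exact code, though add_theme_support, set_post_thumbnail_size, has_post_thumbnail, the_post_thumbnail and get_header_image are all real WordPress functions, and the crop size matches the 940 × 198 header above:

```php
<?php
// functions.php: turn on featured images ("post thumbnails")
// and hard-crop them to the theme's header size.
add_theme_support( 'post-thumbnails' );
set_post_thumbnail_size( 940, 198, true );

// In the header template: show the post's featured image when one
// is set, otherwise fall back to the blog-wide custom header.
if ( is_singular() && has_post_thumbnail( get_the_ID() ) ) {
    the_post_thumbnail();
} else {
    echo '<img src="' . esc_url( get_header_image() ) . '" alt="" />';
}
```

The third argument to set_post_thumbnail_size asks for a hard crop, which is why an exactly 940 × 198 image gives you the most control over the result.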

Why I won’t buy an iPad yet

Thought I’d ponder this a little while, give the product and the chatter a chance to sink in. The iPad looks gorgeous. Within a few weeks of buying my iPhone, I knew I wanted the same thing, but somewhat bigger. I’m not a genius on this front; I know many people felt exactly the same. And now it exists, it looks right, and as one would expect from Apple, there are a few little twists that make it better than I imagined. Getting properly into the ebook reader market is one: an Amazon-style bookstore that works like the iPhone’s App Store is perfect. Price is another: if, as it seems, it comes in around the £400 mark, that’s a lot cheaper than I’d have guessed.

So why not buy one? Well, for me, the iPad would be a device I’d use sat in front of the TV. I’ve got an iMac for doing my own work and for serious surfing. I’ve got a proper laptop for work. For the sofa surfing, I’ve got my trusty Acer Aspire One netbook. Now, the netbook isn’t the iPad, isn’t as lovely. But it does fulfil the same task very adequately, and I just don’t think I can justify an iPad to myself on the basis of loveliness. If the netbook breaks and can’t be fixed, then I’ll happily buy an iPad as a replacement, but there isn’t any other reason to get one yet. It doesn’t do anything for me over and above a netbook, so I’ll happily wait. By the time I need an iPad, it may be in a second generation. It may be cheaper. Most importantly of all, I’ll have the need for it.