Big Tin

Big tin: IT infrastructure used by organisations to run their businesses. And other stuff too when I feel like it…

Whatever happened to the railway?

Have you noticed how no-one talks about the railway any more? From the BBC downwards, the place where you catch a train is now a train station, not a railway station. In other words, we talk about the vehicles, not the system. And the problem with that is that we’re getting a worse service and it’s costing us a lot more.

The reason why we’ve lost the concept of a railway system is clear: there is no system any more. Since the rushed, ideological privatisation of British Rail in the dying days of the last Tory administration, the railway has been run by train operators, such as Southern or First Great Western, and an infrastructure operator, Network Rail. Within each of these two broad groups — train and infrastructure operators — there are further schisms, such as train leasing companies, train refurbishment companies, track maintenance companies, and signalling and telecommunications firms. The list goes on.

Railways are inherently complex organisations: they’re subject to disruption by all sorts of events, most of them not entirely or at all under the railway’s control. People fling themselves off platforms in front of trains, hardware wears out or fails before its time, external electricity supplies go down, weather results in key personnel — think train drivers and signalmen — being unable to get to work, and so on. You can imagine.

At the best of times, for a system such as a railway to work effectively it needs communication between the various elements. In the days of a single railway organisation, it wasn’t perfect but at least everyone was working for the same employer and could be orchestrated as such.

Today, that is no longer the case. Each organisation puts its profit motive first, which means there has to be a cash incentive to make something happen that’s out of the ordinary. Usually, that works to the disbenefit of the rest of us. For example, you want to make a train connection but your incoming train is 10 minutes late. In BR days, the connecting train might well have been held for the benefit of the arriving passengers. No longer. There’s a financial penalty for train operators if they are late, so today you can happily watch your connecting train drive away as you arrive at the station.

Another classic example is a small incident that happened in July 2011 at the entrance to Edinburgh Waverley station. A train derailed but it was a slow-speed incident, the train stayed upright, and no-one was hurt. In BR days, a crew would have been out to jack it up and get it on its way, and make overnight repairs to the track. In this incident, before the train could be moved, there had to be a full investigation to find out what had failed in order to establish who would pay for the damage. This meant that, instead of there being a delay of perhaps an hour or three, it took a day and a half before Edinburgh was fully open for trains again.

Given that, you can imagine what happens when one railway company needs to contact another in an emergency. Something has gone wrong and it needs to be sorted out, as passengers are stranded in the middle of nowhere. Since the profit motive comes first, the various parties have to talk about who will pay, who is at fault and therefore potentially liable, and whether it’s worth fixing now or later. That’s before they get around to talking about how to solve the problem. Meanwhile, passengers sit in trains for hours.

This is not a hypothetical problem: it’s happened plenty of times. Yes, we’ve had some nice new trains following privatisation. We’ve also had beyond-inflation price increases every year to pay for them — and for the huge profit margins the train companies demand before they will get involved, even though their profits are underwritten by the government — that’s you and me.

Privatisation of the railways has been a disaster overall. We’ve lost the concept of a railway system and replaced it with a patchwork of train operators’ turfs, none of which connects with the others, resulting in a blizzard of confusing ticket prices as they attempt to segment the market and screw more cash out of the customers (we’re no longer passengers). Woe betide you if you miss a train, even if it’s not your fault, as the mega-prices are backed up by penalties if you don’t get exactly the right ticket.

As my good friend John May sings: it’s time for a change.


Filed under: Current affairs, mergers & acquisitions, Railways

What’s the prognosis for true high-speed mobile data?

The mobile industry confuses its customers and doesn’t deliver what it promises.

We all talk a lot about the latest technology, and how it will transform this, that, or the other element of our personal and/or working lives.

I spent quite a bit of time yesterday talking about LTE — also known as 4G by some, but not everyone, in the mobile industry. It’s known as 4G because it succeeds 3G, today’s iteration of mobile broadband technology — even though, confusingly, some 3G technology, such as HSPA, which can give you as much as 21Mbits/sec, is known as 3.5G.

And LTE isn’t 4G technically, because it doesn’t quite meet the definition of 4G laid down by the global standards body, the ITU, according to one analyst I spoke to. So you’ll find LTE referred to as 4G, as 3.5+G, as LTE-Advanced — which does meet the 4G spec — or just plain LTE. WiMax, incidentally, is 4G according to the ITU. No wonder the mobile industry confuses its customers. There’s a pithy piece about LTE and 4G here.

But that’s all by the by in some ways. The important thing about LTE is that it promises 100Mbits/sec download and 50Mbits/sec upload speeds. If you know anything about the technology, you’ll know that in practice some 25 percent of that is likely to be eaten up by protocol and other overheads. You’ll also know that a further 25 percent is likely to be lost to distance losses, cell sharing, and clogged-up backhaul networks.
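To make those two losses concrete, here’s a back-of-the-envelope sketch. The 25 percent figures are the rough estimates quoted above, not measured values, and the function name is my own:

```python
def effective_rate(headline_mbits, protocol_loss=0.25, radio_loss=0.25):
    """Estimate real-world throughput from a headline rate.

    First strip protocol and other overheads (~25 percent),
    then distance, cell-sharing and backhaul losses (~25 percent more).
    """
    after_protocol = headline_mbits * (1 - protocol_loss)
    return after_protocol * (1 - radio_loss)

print(effective_rate(100))  # download: 56.25 Mbits/sec
print(effective_rate(50))   # upload: 28.125 Mbits/sec
```

In other words, under those assumptions the 100Mbits/sec headline shrinks to a little over half before a single extra user joins the cell.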

All this is due to arrive over the next ten years. Yes, ten years. Roll-outs are unlikely to start happening in the UK before 2012, more likely 2015.

Except that this is so much hogwash.

I was in the middle of London — yes, challenging conditions due to the concrete canyon effect, but the kind of area in which the mobile industry has to demonstrate its best technology. And the best mobile data rate I managed inside or out was a standard GSM-level 56kbits/sec. This is early 1990s technology.

So if 20 years after its invention and 15 years after its introduction, that’s the best I can get in the middle of one of the world’s leading capital cities, I suspect it’ll be 2025 before I see LTE speeds.

You know what? I’m not sure how much I’ll care by then…

Filed under: mobile

Dust to dust…or is that CPUs?

A quick follow-up from a news story I wrote for eWeek yesterday entitled ‘Moore’s Law – Still Driving Down The IT Footprint’.

The story concerned research by Stanford University professor Dr Jonathan Koomey, who found that Moore’s Law was active long before Gordon Moore coined his eponymous observation. Koomey reckons that Moore’s Law will drive huge growth in mobile devices with fixed computational needs, such as controllers.

Their requirements won’t grow, but ever smaller, more power-efficient processors will come to meet them, to the point where battery life becomes a non-issue.

Then you’ll get ‘dust’, as this fascinating paper posits. Read and enjoy!

Filed under: mobile, Wireless

New wave of real-time collaboration arrives?

When it comes to document collaboration, things can only get better. We’ve seen Google Wave make a splash (sorry) and, together with collaboration tool Google Docs, you could live and work without ever loading any software. Ever.

But this approach has limitations, not least of which is that you can only use the Google-hosted applications, you don’t get to choose your applications, and you really only get the full benefit when all your data is hosted by Google.

Scary? For some — especially financial and similar enterprises encircled by legislative constraints and cautious customers — yes. Even the rest of us might not feel entirely comfortable about it.

One new company, oneDrum, reckons it’s got an answer: keep the applications and documents you use every day and collaborate with colleagues and others in real time using its Java-based (and so platform-independent) platform.

oneDrum is a British company that’s most of the way towards launching a serious competitor to Google Wave and Google Docs. CEO Jasper Westaway welcomes Wave because — you guessed it — it helps educate the market and validate the concept. “Without them, we wouldn’t have gone to the market with this,” he told me.

Westaway says that what he likes about Wave is the way that it blurs the lines between conventional email, IM chat and so on; oneDrum does the same but you can use any application you like. oneDrum’s eponymous platform is set to launch by the end of July.

I tried it. While the software is a bit flaky — it’s still a private beta — it allows users of Microsoft Office applications to open applications and share documents. If you share a spreadsheet, for example, you can each edit the same sheet transparently, with changes appearing instantly on a cell-by-cell basis as if you had made them locally.

An example scenario could involve a sales manager who sees a presentation being opened up by a junior member of his sales team, and changes being made that aren’t relevant for the client the junior exec is about to visit. The boss can step in and make further changes in real time; he or she could also embed comments along the way, effectively using the application as a chat interface.

It works in Office via the Windows COM interface, which is redirected by the Java sandbox across the collaborative medium — in my case it was the Internet but could be a private network — so that the applications running at either end are unaware that they’re collaborating across the globe. “We use the automation API and detect changes and push them out to open copies”, says Westaway.

The underlying mechanism is a Skype-like peer-to-peer system: there’s no central repository. “The overlap with Google Wave is that we use the same operational transformation algorithms,” the benefit of which is that “the algorithm works at whatever level is appropriate for the content, whether bytes or paragraphs, for example,” says Westaway.
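For readers who haven’t met operational transformation before, here is a toy sketch of the core idea: when two sites edit concurrently, each applies the other’s operation after transforming it against its own, so both converge on the same result. This is a generic textbook illustration, not oneDrum’s or Wave’s actual implementation, and all the names are mine:

```python
def transform_insert(op_a, op_b):
    """Transform insert op_a against a concurrent insert op_b.

    Each op is (position, text). If op_b landed at or before op_a's
    position, shift op_a right by the inserted length so that both
    sites end up with identical documents. (A full implementation
    would also need a site-id tiebreak for equal positions.)"""
    pos_a, text_a = op_a
    pos_b, text_b = op_b
    if pos_b <= pos_a:
        return (pos_a + len(text_b), text_a)
    return op_a

def apply_insert(doc, op):
    pos, text = op
    return doc[:pos] + text + doc[pos:]

doc = "cell"
op1 = (0, "A1 ")    # site 1 prepends "A1 "
op2 = (4, " done")  # site 2, concurrently, appends " done"

# Each site applies its own op first, then the other's transformed op.
site1 = apply_insert(apply_insert(doc, op1), transform_insert(op2, op1))
site2 = apply_insert(apply_insert(doc, op2), transform_insert(op1, op2))
assert site1 == site2 == "A1 cell done"
```

The point Westaway makes about granularity is visible here: nothing in the transform cares whether the units are characters, spreadsheet cells or paragraphs, which is why the same machinery can drive cell-by-cell spreadsheet edits.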

And if you go offline, will any changes that happen to be stored on that machine be replicated? Westaway reckons that it depends: “As long as someone is online, you can pick up the changes, it’ll work.”

You could imagine too that there might be problems with security — in some organisations, they’ll want to know exactly where the data is going, for example. “Next year, we’ll offer enterprises the possibility of hosting a central repository for compliance reasons,” says Westaway. “We’ll roll out features that will be attractive to organisations such as banks who want compliance-ready applications. We say to them that we will be ready for you soon — but not yet.”

To start with, the service will be free, but oneDrum will eventually release a subscription model that will include features such as the ability to back up all changes.

Westaway reckons the business is easy to scale, and that it’s fairly easy to add new applications to the roster. He reckons that by the end of 2009, support for OpenOffice, Office for Mac and Google Docs will be cooked and ready to go, with more in the pipeline.

We’ll have to wait and see how this early promise holds true but, for the moment, the peer-to-peer service could form the basis of a new wave of collaboration without tears.

Filed under: Application collaboration

Google Wave: is it evil?

I just finished watching the whole hour of Google’s presentation of Wave, its new collaboration tool. It’s a fascinating idea, bringing together a multitude of communication tools, such as email and instant messaging, while making them more intuitive to use.

From the demo of the early developer code, you could see that the threading of conversations happens naturally, even when not all participants are involved from the start, while you can still keep parts of the conversation private. You can see messages in real time as you type (not good for poor typists) or you can do it in a more familiar ‘type and hit Return’ kind of way.

Content isn’t restricted to text: it can be anything the developer chooses to add, as the demo at Google I/O, the company’s recent developer conference, showed. And of course it’s all hosted in the cloud: all you need is a browser.

The idea looks great. The elimination of boundaries between IM and email is liberating because you can still use it just like email — that is, you can choose when to answer messages rather than jumping around to someone else’s set of priorities — it’s smarter, and the browser becomes the only communication tool you need.

So why is it that I found myself wondering if this is really such a good thing? Two things: first, I’m concerned about the fragmentation of email. Let’s assume Google Wave becomes insanely popular. Even so, there will always be people who aren’t using Wave but who stick to email and IM. It means you now have another communication channel to understand and manage. That’s on top of the Facebooks, the Twitters, the MySpaces and so on, all of which have private channels of communication that you need to check.

And then there’s the deeper concern: am I really comfortable trusting Google with my communications? With email I can choose from hundreds of suppliers and dozens of pieces of software. None of them want to index and understand my content even remotely on the scale that Google does. And when, due to some force majeure, it drops its ‘don’t be evil’ philosophy (assuming you buy into that idea from a Wall Street-quoted public company), what then?

Is this just me being paranoid?

Filed under: Uncategorized
