Wednesday 18 November 2009

What happens when you virtualise a data centre?

Virtualisation is still a hot topic in the parts of the IT business that I work in.

No longer is virtualisation restricted to the "low hanging fruit" - non-critical, low end servers that can easily be consolidated and at low risk. People are now looking to embark on the next stage and virtualise their business critical applications. Increasingly, a virtual machine will be the default home for a new application.

As it becomes more pervasive, virtualisation changes the dynamics of the data centre. In many large organisations, it takes many weeks or even months to get a new server installed. Even when the hardware procurement process is pretty streamlined, there are still delays in waiting for the kit to be delivered, finding a suitable change window when it can be installed, finding space for it, connecting it to the power supply, getting it hooked up to the network(s) and perhaps to storage.

By contrast, a virtual server can be created almost instantly.

Not surprisingly, this can stress some operational processes which have previously relied on the server delivery bottleneck, together with space and power constraints, to provide a natural buffer against otherwise unbounded demand for more servers.

What happens in the longer term, once these issues have been ironed out? I'm assuming that in many cases, the number of physical servers in the estate will be reduced. That's part of the justification for the virtualisation exercise in the first place. But what about the total number of servers (virtual and physical)?

I would be interested to see statistics on the number of virtual servers that enterprises end up with, some years down the line. Does this rapidly exceed the original number of physical servers? And how do the long-term growth rates compare to those for the physical estate prior to the virtualisation exercise?

Does anyone have any data they are willing to share?

Monday 26 October 2009

Replacing Oracle

At my previous employer (Tideway Systems, now part of BMC Software) we did some work to help Oracle customers identify how much Oracle they were using. More specifically, Tideway automated some of the work involved in discovering what Oracle databases existed in a customer environment. People knowledgeable in Oracle licensing, like the fine folk at Rocela, use this information to identify opportunities to optimise the customer's use of Oracle and the way it is licensed.

Big Oracle customers have hundreds of Oracle databases and spend many millions per year on "maintenance" fees, so there are significant adjustments to be made simply by changing the way in which Oracle is being used and licensed. And Oracle's licensing is sufficiently complex (no doubt intentionally so) that companies such as Rocela can earn a living helping people navigate it.

It seems to me that there is a bigger opportunity here.

While I am prepared to accept that there are probably some applications that make full use of all of the bells and whistles that Oracle provides, I strongly suspect that in many of Oracle's largest customers, there are a significant number of applications which could be moved to an open source database - postgres, for example - without suffering an unacceptable hit in terms of performance or availability.

Simply in terms of avoiding the recurrent Oracle maintenance costs, this could be a huge cost saving.

Is anyone doing this? It looks like a good business opportunity to me. A services-led engagement to identify candidate applications, migrate them from Oracle to a lower cost alternative, then provide ongoing support would be a starting point. Longer term, how about developing tools to assist in the migration, or to remediate those applications that rely on Oracle-specific features?

If you know of examples of people doing this, I'd love to know about it.

Cut bankers' bonuses

It is intriguing to see the various arguments being deployed around bankers' bonuses. The apologists for the bankers, such as Dominic Lawson in the Sunday Times, argue that cutting the bonuses would be a Bad Thing.

I think that Mr Lawson makes some good points, particularly in highlighting that the reasons for the financial collapse go further than the way in which bankers were paid. But I suspect he underestimates the way in which the incentive structures in banks encouraged behaviour that was dangerous and destructive.

By contrast with Mr Lawson, the leader writer of The Economist is much more critical of the banks and their remuneration regimes.

The Economist points out a number of things that Mr Lawson ignores. Perhaps the most important is that all the banks (yes, even Goldman Sachs) have been the recipients of state aid of one kind or another. All the banks are benefiting from the extraordinarily low interest rates set by all the major central banks. Many have been bailed out explicitly at huge expense. Others benefit because their counterparties were bailed out. Governments should attempt to claw this subsidy back on behalf of the tax payers that provided it.

Underlying the arguments of many who wish to leave the bankers to get on with it is a fear that the bankers will up sticks and move elsewhere if governments restrict their ability to earn huge bonuses. But perhaps we don't want the kind of banks that pay these bonuses. In the light of events of the last couple of years, it is tempting to conclude that a highly profitable investment bank is simply an accident waiting to happen, and that we should seek to minimise our exposure to them rather than compete to attract them to or retain them in London.

Ban introductory deals

Why not ban "introductory deals"?

I'm thinking of the kind of sweeteners that encourage people to switch credit card providers by offering an initial period at a low or even zero rate of interest.

Such deals are a loss leader for the companies that offer them. Clearly they hope to entice new customers to join them and then either lock them in to more profitable rates later, or to remain with them through inertia.

Is this a good thing?

Reading a couple of blog posts about the demise of Washington Mutual, a big US financial firm - and one of the biggest to collapse in the recent financial disaster - makes me wonder whether such deals are a symptom of a kind of banking that is in nobody's interest.

Read more here and here.

Detecting hard-wired IP addresses

A few times recently, I've spoken to people that are looking for a way to spot instances where their applications are using hard-wired IP addresses rather than DNS.

This can be a problem if the IP addresses are expected to change. For example, if you have an application that expects to be able to connect to a database on IP address 10.1.1.1 and the database server moves to 11.2.2.2 then the application will break unless you are able to identify the places where the application uses "10.1.1.1" and change them to "11.2.2.2".

Well-written applications will use names such as "dbserver.foo.com", which get looked up in the DNS, so when the server's IP address is changed, the application will continue to work.

The extreme case is where there is a big change coming such as a data centre move. This could involve hundreds or even thousands of servers. If the move will result in changes to the IP addresses of servers, how can you be sure that nothing will break?

One solution is to inspect all of the source code of the applications. If there are hundreds of applications, this is not going to be a trivial task. Needless to say, this is not an appealing approach for many.

What else can you do? I'm open to suggestions.

One possibility is to use "tcpdump" or something similar to do packet traces of activity from each server looking for instances where a connection is opened to an address that has not previously been returned as a result of a DNS lookup. This should identify cases where an application is using an IP address rather than a DNS name. This might help, but it would still fall short of an ideal solution since it would not identify the process responsible for the connection. Better than nothing, though.
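The correlation step described above can be sketched in a few lines. This is just the logic, not the capture: in practice the events would be extracted from a tcpdump/libpcap trace, and this sketch ignores real-world wrinkles like resolver caching, DNS answers observed before the capture started, and TTL expiry.

```python
# Flag outbound connections to IP addresses that were never seen in a
# DNS answer. Events are fed in as simple tuples so the logic is clear:
#   ("dns", ip)  - an A record answer observed on the wire
#   ("conn", ip) - an outbound TCP connection to ip

def find_hardwired(events):
    """Return the connection targets never seen in any DNS answer."""
    resolved = set()   # addresses returned by DNS lookups so far
    suspects = []      # connections made without a preceding lookup
    for kind, ip in events:
        if kind == "dns":
            resolved.add(ip)
        elif kind == "conn" and ip not in resolved:
            suspects.append(ip)
    return suspects

# Example: the app resolved its database server properly, but also
# connected straight to 10.9.9.9 without any lookup.
events = [
    ("dns", "10.1.1.1"),
    ("conn", "10.1.1.1"),   # fine - preceded by a DNS answer
    ("conn", "10.9.9.9"),   # suspect - no DNS answer for this address
]
print(find_hardwired(events))  # ['10.9.9.9']
```

As noted, this still wouldn't identify the process responsible for the connection; it only narrows down which addresses deserve a closer look.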

I guess you could go a step further and build an agent that would do much the same thing, running continuously on the box. Has anyone come across anything that does this?

Other approaches? Dtrace on Solaris, perhaps? Suggestions welcome.


Monday 19 October 2009

Why vote?

There will be a general election in the UK by summer 2010. I probably won't bother voting. Here's why.

For those of you who don't know, I guess I should explain what general elections are and how they work.

In the UK, the control of the government is held by the party that wins the most seats in a nationwide poll. The poll must happen at least once every five years, but the government can choose the date.

The election is to select members of parliament, 646 of them, each one representing one constituency, a geographical area.

Here's the first problem, and one reason why voting is probably a waste of time for me. I live in central London, in a constituency that is predominantly Conservative. Under almost all conceivable outcomes, the Conservative candidate will win by a large majority. My vote might slightly reduce this large majority but since the opposition will be divided between the two other main parties (Labour and Liberal Democrat in this case) there is no chance of a change. A Conservative MP will be elected to this constituency regardless.

This voting system means that the ratio of votes for the different parties is not accurately reflected in the number of MPs elected for each party. A proportional voting system would fix this, but there's no chance of that happening, as the only way it could be changed is by the government, which, pretty much by definition, does not see such a change as being in its best interest.

So much for democracy.

The second problem is that MPs and Parliament itself seem to be completely useless. Parliament repeatedly fails to hold the executive to account. Perhaps the most egregious example is the Iraq war: parliamentary debate on the war lasted 7 hours, while debate on banning fox hunting lasted 700. I know foxes are important to some people, but are they really more important than the lives of at least 85,000 Iraqis (and probably far, far more)? What a farce.

MPs in general just vote along party lines. If that is all they do, there is little or no value in having them there.

Since MPs fail to stop the Government from doing stupid things, what are they useful for? One response is that they work tirelessly (we are told) in the interests of their local constituents. Well, perhaps. Perhaps beefing up the powers of Citizens Advice Bureaux would provide a more cost effective substitute.

Third reason. Holidays. MPs seem to spend most of their time on holiday. Last year, they took 24 days for Christmas, and this summer they went into recess on 21st July and returned to work on 12th October. I see no evidence that they do anything of value during the recess, but it appears we carry on paying their salaries regardless.

Fourth reason. Expenses. Enough has been written about the expenses scandal already.

So that's why I probably won't bother voting. OK?

Sunday 13 September 2009

Reverse switching. The horror, the horror...

New job, new toys.

One of them is a suitably corporate Dell laptop with Windows XP.

Having been using a Macbook Pro for the last couple of years, I'm in the interesting position of experiencing the reverse switch from OS X to Windows. The experience is pretty mixed so far.

What's good? Full versions of the Microsoft Office applications, rather than the somewhat crippled ones that Microsoft chooses to make available for the Mac. That includes Outlook, the only client that works properly with the old versions of Exchange that most places are still running. These applications start up pretty quickly too.

What's bad? Pretty much everything else. Here's this morning's example:

I open the laptop - I'd left it in standby mode on Friday. It lets me log in, but that's about it. Disk whirring away...

I try starting the Cisco VPN application I need to pick up email. I can select it from the Programs menu, but nothing visible happens. The hard disk is still busy, but there's no indication of progress. Wait... Go and put kettle on... Wait some more.

Finally I give up. I know, I'll log out. Select "log out". Nothing happens. Try again. Nothing happens. Start quitting various applications manually. Start seeing messages reporting other applications are unable to stop and do I want to "terminate" them.

Eventually, I hit the power button and reboot and after around 15 minutes I'm finally able to read some email.

Clearly "stand by" is a bit of a Microsoft joke. Next time I'll power down.

(For those of you who don't use a Mac: my sympathy. I've become conditioned to using something that works with me rather than against me. When I've finished working with my Macbook, I close the lid. When I want to do something, I open it. Shouldn't that be how a laptop computer works?)

Thursday 6 August 2009

Predictions

Predicting the future is tough. But we still seem happy to quote "experts" who happily deliver us all sorts of predictions. Some of them even get paid for the stuff they churn out.

I came across one nice example yesterday while looking up some references for Apple's "Rosetta" technology that was used to assist in the transition from PowerPC to Intel processors for the Macintosh.

A CNET article quotes an "Illuminata analyst" (someone who should know, I guess) who says "History says that binary translation basically doesn't work."

He goes on to say that "The day may come when someone can do a good enough job with it, but that concept has been thrown out there many times in the computer industry, and it's always fallen flat on its face."

Well, Rosetta did work, though to be fair, the comments above fall some way short of predicting its failure. They just hint strongly that success is very unlikely.

The financial world is a wonderful source of failed predictions too. What about UK house prices? In the news today is a report from the Royal Institution of Chartered Surveyors. At the beginning of the year, it was predicting falls in prices of 10-15% this year; now it says prices may rise slightly. That's quite a shift.

Economic forecasts generally seem to be pretty pointless. None of them predicted the current recession, the most severe since the Great Depression. So why bother?

Weather forecasts? The UK Met Office predicted a hot, dry "barbecue summer" and instead we have just had a July with twice the average rainfall, though the Met Office has published some stats to convince us that it really hasn't been that bad.

I have a prediction. This year, we will finally see that producing pointless predictions is futile and we will give up. I will also win the lottery.

Remember, you read it here first.

Wednesday 5 August 2009

Apple Macintoshes

I've just bought my seventh Macintosh. That's ignoring the three that I've had provided for me by employers over the years. Now that I'm waiting for the seventh to be delivered, it seems like a good time to look back and see how things have changed over the years.

My first Mac was a Macintosh Plus. 1MB memory, single 800K 3.5" floppy disk, monochrome screen. The processor was a Motorola 68000 running at the amazing speed of 8MHz. The big thing that the Plus had over the original Macintosh (apart from loads of memory) was expandability: it had a SCSI interface that meant you could add one (or more) external hard disks. I invested a huge amount of money in a 20MB external drive. The Mac was used mostly for writing - my then partner's PhD on the Foraminifera of the Thousand Islands Group - and the favourite application of the time was Microsoft Word, version 3.0 if I recall correctly.

I still think that version of Word was one of the best that Microsoft produced. It had styles, allowing some consistency to be imposed on the structure of the document, and you could edit files of "any" size, something that was a novelty back then when many applications imposed pretty arbitrary restrictions. It wasn't quite as snappy as WriteNow! but it worked well. Best of all, it lacked the huge array of superfluous features that get in the way of using recent versions of Word effectively.

It took several years for the finances to recover sufficiently for Macintosh number 2. This was an SE/30. 5MB memory, 40MB internal hard disk and a Motorola 68030 processor running at 16MHz. It was heinously expensive, somewhere over £2,000 if I recall correctly. The form factor was essentially the same as the Plus, with the same built-in 9" monochrome screen. I wrote most of my PhD on this, using TeXtures, a commercial version of TeX and LaTeX.

At around the same time, I got my hands on my first employer-provided Macintosh, a IIsi. This was equipped with an Ethernet card, and had a separate monitor. You could get a colour monitor for the IIsi, but colour monitors were expensive and definitely seen as unnecessary in my line of work, so I still had monochrome - though now with shades of grey. The IIsi made a change from the Sun 3/60 diskless workstation that I'd been using up to that point, but the best feature was that it ran Macintosh Common Lisp, one of the nicest development environments that I'd come across at the time.

The SE/30 lasted some time, but back at work, a new research grant allowed the IIsi to be replaced by a Quadra 800, one of the last Macintosh computers based on the Motorola line of processors. This was a monster compared to the previous machines. A Motorola 68040 33MHz processor and 8MB memory. I had a pretty minimal configuration - no CD, since it was too expensive - and a mere 230MB hard disk, but it was still by far the most powerful desktop machine I'd had my hands on at the time. The internet was kicking off too, and it was probably on this machine that I played with NCSA Mosaic, Gopher and Archie, visited a rudimentary list of websites called "Yahoo!", and waited for the "information superhighway" to appear.

The Quadra 800 was a great machine, but my next purchase was a disappointment. This was a Power Macintosh 7500, one of the first Apple machines built around the then new PowerPC processor architecture.

On paper, the 7500 looked great. A fast 100MHz processor on a daughter card that could be replaced, twin SCSI busses for internal and external disk expansion, built-in ethernet, huge memory capacity and some advanced video capability. The problem was that by this point, Mac OS was beginning to creak. There were lots of "system extensions" - random bits of software that loaded at boot time, including drivers for bits of hardware and various tweaks for the OS. But there was no protection between these extensions, nor between them and the OS, or applications. The result was occasional random crashes, or more often, a refusal to boot. At this point, the only remedy seemed to be removing random extensions until things started working again. The complexity was increased by Apple's need to support legacy 680x0 binaries, though whether this contributed to the disappointing performance and instability isn't clear.

The 7500 was connected to a Hayes 28.8Kbps modem, and I started getting to grips with PPP so I could connect to the Internet, first via Sussex University, then through Demon Internet. Well, it worked, though looking back on it, I'm surprised I had the patience.

That was it, for a few years. I was doing lots of work with Solaris and had a dual boot Solaris x86/Windows NT laptop which was significantly more stable than Mac OS 8 or 9 in either mode. Worse, Apple persisted in making it expensive to get hold of development tools for Mac OS. I bought third party tools like MetroWerks Codewarrior but never invested the time necessary to build anything significant.

I moved on from Solaris and Windows NT to Red Hat Linux and Windows 98. Linux was great, allowing us to build some amazingly robust server side software in Java at very little cost. Windows 98 was a less glorious experience, though I had a wonderfully tiny Sony 505GX laptop that seems to have been an early precursor of the current netbook trend.

When Mac OS X first appeared, my initial reaction was curiosity combined with scepticism. The critical test seemed to be whether Apple would remain in existence for long enough to make OS X worth considering. Other new operating systems (Be, for example) had not lasted long and had never gained a critical mass of developers and applications.

Eventually, Microsoft released a version of Office for OS X. I bought a new iMac G4 and a copy of Microsoft Office. I was blown away. UNIX with a decent user interface, at last! The hardware was pretty impressive too: colour, flat-panel display, 700MHz processor and 256MB memory, 40GB hard disk. Better still, Apple remedied some of their failings in the past: OS X shipped with pretty good and comprehensive developer tools including compilers, an IDE and a variety of debugging and tuning tools.

Based on the experience of the iMac, an iBook swiftly followed. Then the iMac was replaced by a G5. The iBook died but the iMac continues to be the workhorse of some friends.

What's next? Well, having enjoyed using a Macbook Pro for the last two years (having persuaded my employer that this was a "good thing"), I'm shortly to take delivery of my own MacBook Pro. I wouldn't bother, but I'm leaving the old company and they won't let me keep the hardware. Along the way, Apple has switched processor families once again, from the PowerPC line to Intel. The transition (thanks in part to some technology from Transitive Inc, now part of IBM) has been seamless.

Just for fun, here's a rough comparison of my first Mac with the latest:

             Mac Plus             Macbook Pro                  Multiplier
Processor    MC 68000, 8MHz       Intel Core 2 Duo, 2.53GHz    Around 300 times faster
Memory       1MB                  4GB                          4,000 times more memory
Storage      800KB                250GB                        Around 300,000 times more storage
Display      512x342 monochrome   1280x800 colour              Around 6 times more pixels

Sunday 2 August 2009

Navionics charts on the iPhone

I've just discovered the Navionics chart applications on the iPhone. These are amazing value, and the iPhone UI makes them tremendously easy to use.

Up until now, I've been using an obsolete Raymarine RC400 hand-held chart plotter. It's a hand-held GPS receiver, waterproof, and aimed at marine use. There's a small colour screen (about 3.5"), fairly dismal battery life, and it weighs about 430g. Mine also has an annoying habit of crashing at least once a day, presenting a screen with a hex code and an instruction to hold down the power key to reboot. This never works; I always have to remove the battery pack from the back and restart it that way. Apart from that annoyance, it's a smart bit of kit.

You buy Navionics Gold charts for it, which are stored on Compact Flash cards. The charts are not cheap - one for the UK costs around £180. (This is much cheaper, however, than buying the equivalent paper charts.) You also need to invest in keeping the charts updated.

Contrast this with the Navionics charts for the iPhone. The equivalent UK coverage chart is £15, downloadable from the iTunes store (just search for "Navionics"). For £15, the coverage is astounding. And there's tidal data, as you would expect, and integration with Facebook (so you can publish your track), which you might not.

Problems? Well, the iPhone isn't waterproof, and I'm not sure that using it with cold, wet, gloved fingers would be a great idea, but as a planning tool these look fantastic.

Saturday 20 June 2009

Internet backups for my Mac

I've been thinking it really is time to do something about backing up my main home machine more effectively.

I have an aging PowerMac G5 which works great. So I'm looking for something that plays well with Mac OS X.

Once upon a time, I used to do regular backups by burning DVDs. I'd then collect these up and put them in our storeroom. This is far enough from our apartment that it would take a pretty major disaster to make them unavailable.

Now that Time Machine has come along, I've become lazy about backups. Time Machine does a great job, restores actually work, and it seems to have no noticeable performance impact. The problem is that the backups are not offsite, so while it's a great solution to accidental deletion or a hard drive failure, it's not a solution to a bigger disaster like the building burning down.

What to do?

It seems obvious to me that the right way to do this is over the Internet. There seem to be a bunch of services out there which are Mac compatible, but which one?

I've seen positive mentions of Mozy, and the pricing looks reasonable at $4.95 per month for "unlimited" data. There have been several positive reviews of it over the last few years, but some less positive ones too - such as these observations from Michael Horowitz at CNET.com - along with comments reporting that it can be slow, that backups sometimes disappear from the Mozy servers, and that restores are a pain.

Backblaze gets positive reports too. But the download says "Intel Macs only", so I'm out of luck.

IDrive, perhaps. 150GB of storage for $4.95/month isn't quite unlimited enough. I have around 50GB of photos in Aperture and 150GB of music. I'm sure the latter could be pruned (some of it is podcasts), but I don't like the idea of being limited to a fraction of the capacity of my hard drive.

A more general observation is that all these products appear to be Mac versions of what started out as a tool for Windows users. I don't have a problem with that in itself, but there's a difference in the kind of protection that I'm looking for an Internet-based backup to provide. Since I'm using Time Machine, I'm not looking for something to protect me from finger trouble or accidental deletions of files. Neither am I too concerned about my hard disk failing. I'm pretty well protected against both of those problems.

As a result, I'm probably looking for something simpler than many of these products.

What I want is something that will provide protection against disasters. An offsite copy of my Time Machine backup disk might not be a bad start. I'd also feel slightly happier if the solution didn't involve my data being shipped across the Atlantic, since I'm based in the UK.

Ideas, anyone?

Friday 13 February 2009

Twitter and collaborative filtering

When Twitter comes up in conversation with my colleagues, the most common response is "I don't get it."  I've said it myself more than once.

This probably has more to do with the age profile of my colleagues than anything else.  They (OK, we) are sufficiently old that we remember the days before email and mobile phones, let alone Twitter, Facebook and MySpace.

Twitter only starts to get interesting when you are following a bunch of people whose tweets are themselves interesting.  And if your friends aren't using Twitter, that reduces its utility.  This is the classic network effect of course.  It's not clear to me how this problem can be overcome.

It does bring me back to a fundamental question.  What is Twitter for?  I'm struck by the high proportion of tweets that contain URLs.  Indeed, this phenomenon is so common that some of the Twitter clients provide tools to automatically convert URLs to a short form that fits more easily into the 140 character limit.

What does this mean?  That Twitter is actually being used to pass around references to things rather than the things themselves.  People are using it as a means of drawing attention to something that they have read (or perhaps would like other people to think they have read) in order to make a point, share some knowledge, start an argument or just make someone smile.

I remember reading a paper back in the early 1990s that described collaborative filtering [1] and being fascinated by the idea.  I tried and failed to persuade some of my students to build a Usenet news reader that exploited some of these techniques, but a few years later someone did it anyway [2].

Twitter, at least in part, is providing a means for collaborative filtering of the Web.  You follow people you know, or whose views you find interesting and they post links to things you may want to read.  We are filtering the web for each other.

If this is what Twitter is actually being used for, then how "fit for purpose" is it?  Not very, I think.  


First, and it is a minor point, I have to jump through some hoops to post links to Twitter.  Doubtless people will build clients that do this seamlessly, indeed, perhaps they already have.

Second, and I think more serious, there's no easy way to find people that are providing feeds of stuff that I would enjoy.  I'd like a system that would recommend people to follow based on my reading habits.  This sounds a little harder to fix than the first point, but it would be much closer to the spirit of collaborative filtering that was described in the two papers I referenced above.  
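The "recommend people to follow" idea is a textbook collaborative filtering problem, and the core of it is small. Here's a minimal sketch: score candidate accounts by how heavily they are followed by users whose follow lists overlap with mine, using Jaccard similarity as the overlap measure. The usernames are invented, and a real system would obviously need to weight by reading habits rather than just follow lists.

```python
# Recommend accounts to follow: rank candidates by similarity-weighted
# votes from users whose follow lists resemble mine.

def jaccard(a, b):
    """Overlap between two sets, from 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(me, follows):
    """follows maps each user to the set of accounts they follow.
    Returns candidate accounts I don't yet follow, best first."""
    my_follows = follows[me]
    scores = {}
    for user, their_follows in follows.items():
        if user == me:
            continue
        sim = jaccard(my_follows, their_follows)
        # Each similar user "votes" for the accounts I don't follow yet.
        for account in their_follows - my_follows - {me}:
            scores[account] = scores.get(account, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

follows = {
    "me":    {"alice", "bob"},
    "carol": {"alice", "bob", "dave"},   # very like me, and follows dave
    "erin":  {"frank"},                  # nothing in common with me
}
print(recommend("me", follows))  # dave ranks above frank
```

Even this toy version captures the spirit of the papers above: the people most like me are, collectively, a filter over who is worth following.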

Oh, and in case you're wondering, I am an infrequent tweeter and you can follow me here, but I can't guarantee that I will say anything you find interesting.




Wednesday 7 January 2009

Gambling

There's an odd phenomenon you can observe in the UK.  I've noticed it recently a couple of times on the flagship morning news programme "Today" on BBC Radio 4.

When talking to investors or businessmen and trying to disparage their efforts to make money, particularly on the stock market, the interviewer will accuse someone of "gambling".  Gambling, you see, is bad, while investing, presumably, is good.

But is there really any difference?

One of today's (and "Today's") news stories is the apparent suicide of Adolf Merckle after losing a huge amount of money, a large chunk of it apparently "gambling" that Volkswagen shares would fall when, in fact, they rose spectacularly.

Was this a gamble, or an investment that went wrong?

The old proverb claims that you have to "speculate to accumulate".  In the sense that any investment involves risks and that higher returns involve greater risk, this is true.  Putting money in the bank is low risk (we all hope - though even this is not completely certain).

Investing in shares is higher risk - there's usually more chance of a company going bust and wiping out the value of your shares than there is of the bank disappearing - but the return on the investment may be greater.  Note the word "may".  There is no guarantee, of course.  Most equity investments at the start of 2008 will have suffered a pretty painful loss in value during the course of the year.

"Shorting" shares - taking a view that their price is too high and that it will therefore fall - can be riskier still, in the sense that you can lose more money than your initial stake if you get things badly wrong.

Does that mean that an investment in shares is a gamble?  I think so.  I think that much (most?) of what we do in life involves risk and so many activities are, in a sense, a form of gambling. Does an investment magically transform itself into a gamble when it goes wrong?  That would be ridiculous.

So why do BBC interviewers still persist in trying to put their interlocutors on the back foot by accusing them of gambling?  I think it tells you more about the BBC, and perhaps the persistent anti-business streak of much of the British establishment, than it does about the people they interview.

Monday 5 January 2009

IT management software: don't build something too new

Pick any category of enterprise software and you will probably find a whole slew of products that do much the same thing.  And it isn't necessarily the case that the  most interesting, useful or innovative products thrive.  That certainly seems to be the case in IT management software (IT service management, network management, asset management and the like - these are the areas I know best and what I'm referring to here).

Why is this?

Many new products originate in start-up companies.  Even if the company starts with an idea that is truly novel, there are strong pressures to make it sound familiar, not least because the bulk of potential buyers are unadventurous and buy things that sound familiar.  There are other pressures too, such as marketing and visibility.  If you want your product to appear in something like a Forrester "Wave" or a Gartner "Magic Quadrant" then you need to have the analysts recognise your product as one of a number in a more-or-less established category.

So even if you go to the bother of creating something shiny and new, you may well end up trying to make it look familiar.  You don't want to be in a category of one: it's hard to be visible if there are no direct competitors against which you, your prospective customers and everyone else can compare you.

Think about the exits for a software company.  With little happening on the IPO front, the most lucrative exit may well be a sale to one of the existing players.

There's been a lot of consolidation in the enterprise software market.  In the IT management space, it's easy to come to the conclusion that the big four (IBM, HP, BMC, EMC) do no research of their own and instead just buy start-up companies to acquire new technology. Of course this isn't completely true, but if you're running a thriving IT management software start-up, selling your company to one of these vendors may well be your fast track to a yacht, bigger house and all the rest.

Each of these vendors wants to have a product in every category, so they end up doing lots of acquisitions, at least one in each category.  If you are not in a recognised category, then you're less likely to be acquired - and your exit is less likely to be lucrative as a result.

So one lesson of IT management software is to build stuff that isn't too new or too remarkable.

I plan to write some more about how this creates opportunities and challenges for new vendors in a future posting. Any comments?

Torture bad, counterproductive, recruits terrorists

Bruce Schneier's blog has a couple of nice quotes from Matthew Alexander, a former interrogator with the US in Iraq.

Interestingly, his views seem to echo what some others with experience of interrogation say.  David Cornwell, the author who writes as John le Carré, worked for British Military Intelligence way back and says much the same in this interview.

No surprises.  Do bad things and right-thinking people will get upset and decide you are worth fighting.  That's why it's so sad that we've allowed ourselves to be associated with despicable behaviour in Iraq and elsewhere.


Sunday 4 January 2009

Gaza and Israel

Israel's invasion of Gaza is the front page news in today's serious British newspapers (the ones that aren't more concerned with the cosmetic surgery of pointless celebrities).

Clearly Israel has to do something about the hundreds of rockets that have been fired into its territory.  I have much more sympathy with Israel's retaliation than with our own (British) involvement in the second Iraq war.  How many Iraqi missiles had been fired at Britain or Britain's allies before our government felt justified in invading Iraq?  None.  The invasion led, directly and indirectly, to the deaths of many tens of thousands of Iraqis.  By that standard, what is happening now in Gaza seems like a minor incident.

It is not a minor incident, of course.  And the deaths of hundreds of Palestinians are unacceptable, as is the completely indiscriminate firing of missiles into Israel by Hamas.

In the lunchtime news ("The World this Weekend") on Radio 4, a Hamas member of parliament was asked why his organisation continued to fire missiles into Israel.

"Because we want peace" was his response.  (I believe those are the words he used - if I have them slightly wrong, I apologise).

Clearly the Hamas policy of firing missiles has failed to bring about the desired end, at least in the short term.

I suspect that Israel's approach may be just as futile in the longer term.  

With a huge effort and significant loss of life (mainly Palestinian lives) Israel may bring about some kind of halt to the firing of missiles in the short term, but it is hard to see this as anything other than temporary until both sides acknowledge that long term peace will not be realised through force.

Worrying about something different for a change

Having been pondering the economic crisis, global recession or depression, the return of mass unemployment, jobless bankers roaming the streets and all the rest, I find it almost makes a pleasant change to have something new to worry about.  How about the environment?

I've come across several items recently which have given me pause.

First, an interview with Chris Priest, a researcher with HP Labs. Chris has contributed to a report called Climate Futures, responses to climate change in 2030 (PDF).  It makes interesting reading, though the interview itself, available as a podcast from the Redmonk site, gives a good summary.

The report describes five different possible scenarios for the world in 2030, ranging from the relatively benign, in which technology allows us to continue living our lives in much the same way as we do now, through to a protectionist world in which governments are forced to take drastic action against widespread environmental change and the consequent social disruption.

Next was a report on the plagues of jellyfish which are appearing around the world, one particularly slimy manifestation of environmental change.  I saw this mentioned first in a typically irreverent report in the Register.  You can find more detail at the National Science Foundation.

Finally, the ways we have abused the oceans - through the dumping of waste, the spillage of pollutants and years of massive overfishing - are the subject of a special report in this week's Economist (December 30th).  The link is here.

The report is 16 pages, if not of doom and gloom, then certainly of material that fails to raise a smile. It makes the good point that we are far too late to "save" the oceans, if by saving, we mean the restoration of the seas to their state before industry changed them.  But if we are to prevent much more damage, we need effective international cooperation.  Sounds problematic to me. 

All rather depressing.