Big Tin

Big tin: IT infrastructure used by organisations to run their businesses. And other stuff too when I feel like it…

2012: the tech year in view (part 1)

As 2012 draws to a close, here’s a round-up of some of the more interesting news stories that came my way this year. This is part 1 of 2 – part 2 will be posted on Monday 31 December 2012.

Storage
Virsto, a company making software that boosts storage performance by sequentialising the random data streams from multiple virtual machines, launched Virsto for vSphere 2.0. According to the company, this adds features for virtual desktop infrastructure (VDI) and can lower the cost of providing storage for each desktop by 50 percent. The technology saves money because you need less storage to deliver sufficient data throughput, says Virsto.
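The general technique is that of a log-structured store: writes that are random in the logical address space are appended sequentially, with an index recording where each block landed. A minimal sketch of the idea (illustrative only, not Virsto's implementation):

```python
# Sketch of log-structured writes: random writes from many VMs are appended
# sequentially to a log, with an index mapping (vm, block) to log position.
# Illustrates the general technique only, not Virsto's actual code.

class LogStructuredStore:
    def __init__(self):
        self.log = []      # append-only sequential log of data blocks
        self.index = {}    # (vm_id, block_no) -> position in the log

    def write(self, vm_id, block_no, data):
        # Every write, however random its logical address, lands at the
        # tail of the log -- the disk only ever sees sequential appends.
        self.index[(vm_id, block_no)] = len(self.log)
        self.log.append(data)

    def read(self, vm_id, block_no):
        pos = self.index.get((vm_id, block_no))
        return None if pos is None else self.log[pos]

store = LogStructuredStore()
store.write("vm1", 7, b"hello")   # random logical addresses...
store.write("vm2", 3, b"world")
store.write("vm1", 7, b"hello2")  # an overwrite appends a new log entry
```

The trade-off, as with any log-structured design, is that reads become scattered and old log entries must eventually be garbage-collected.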

At the IPExpo show, I spoke with Overland, which has added a block-based product called SnapSAN to its portfolio. According to the company, the SnapSAN 3000 and 5000 offer primary storage using SSDs for caching or auto-tiering. This “moves us towards the big enterprise market while remaining simple and cost-effective,” said a spokesman. Overland’s new SnapServer DX series also now includes dynamic RAID, which works somewhat like Drobo’s system in that you can install differently sized disks into the array and still use all the capacity.

Storage startup Tegile is one of many companies building storage arrays that combine spinning disks with solid-state disks to boost performance cost-effectively, the company claims. Tegile says it reduces data aggressively, using de-duplication and compression, and so cuts the cost overhead of the SSDs. Its main competitor is Nimble Storage.
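The combination of de-duplication and compression is what makes a small SSD tier go a long way: identical blocks are stored once, and each unique block is compressed before it hits flash. A minimal sketch of the technique (my illustration, not Tegile's implementation):

```python
import hashlib
import zlib

# Sketch of inline block de-duplication plus compression -- the general
# technique the article describes, not any vendor's actual code.

class DedupStore:
    def __init__(self):
        self.chunks = {}   # sha256 digest -> compressed unique block
        self.refs = []     # logical volume: ordered list of digests

    def write_block(self, block: bytes) -> None:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in self.chunks:          # store each unique block once
            self.chunks[digest] = zlib.compress(block)
        self.refs.append(digest)               # duplicates cost one reference

    def read_block(self, i: int) -> bytes:
        return zlib.decompress(self.chunks[self.refs[i]])

store = DedupStore()
for block in [b"A" * 4096, b"B" * 4096, b"A" * 4096]:
    store.write_block(block)
# Three logical blocks written, but only two unique blocks stored.
```

In VDI workloads, where hundreds of near-identical desktop images share most of their blocks, this is why the approach pays off so dramatically.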

Nimble itself launched a so-called ‘scale to fit’ architecture for its hybrid SSD-spinning disk arrays this year, adding a rack of expansion shelves that allows capacity to be expanded. It’s a unified approach, says the company, which means that adding storage doesn’t entail a lot of administrative work moving data around.

Cloud computing
Red Hat launched OpenShift Enterprise, a cloud-based platform-as-a-service (PaaS). This is, says Red Hat, a solution that lets developers launch new projects quickly, including a development toolkit for firing up new VM instances. Built on SELinux, the system lets you fire up a container with middleware components such as JBoss and PHP, and supports a wide variety of languages. The benefit, says the company, is that the system allows you to pool your development projects.

Red Hat also launched Enterprise Virtualization 3.1, a platform for hosting virtual servers with up to 160 logical CPUs and up to 2TB of memory per virtual machine. It adds command line tools for administrators, and features such as RESTful APIs, a new Python-based software development kit, and a bash shell. The open source system includes a GUI to allow you to manage hundreds of hosts with thousands of VMs, according to Red Hat.

HP spoke to me at IPExpo about a new CGI rendering system that it’s offering as a cloud-based service. According to HP’s Bristol labs director, it’s 100 percent automated and autonomic. A graphics designer uses a framework to send a CGI job to a service provider, which renders the film frames. The service estimates the number of servers required, sets them up and configures them automatically in just two minutes, then tears them down once the video frames have been delivered. The evidence that it works can apparently be seen in the animated film Madagascar where, to make the lion’s mane move realistically, calculations were needed for 50,000 individual hairs.
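The sizing step is essentially a back-of-the-envelope sum over the job's compute budget and deadline. A toy illustration of the kind of arithmetic involved (the formula and numbers are my assumptions, not HP's):

```python
import math

# Toy sizing sum for a render farm: how many servers are needed to finish
# a job by its deadline? Illustrative assumptions only, not HP's algorithm.

def servers_needed(frames: int, core_hours_per_frame: float,
                   deadline_hours: float, cores_per_server: int = 16) -> int:
    total_core_hours = frames * core_hours_per_frame
    return math.ceil(total_core_hours / (deadline_hours * cores_per_server))

# e.g. 1,000 frames at 8 core-hours each, due back in 24 hours,
# on hypothetical 16-core servers:
servers_needed(1000, 8.0, 24.0)
```

The interesting part of the real service is not the arithmetic but the automation around it: provisioning, configuring and tearing down that estimated fleet without human intervention.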

For the future, HP Labs is looking at using big data and analytics for security purposes and is looking at providing an app store for analytics as a service.

Security
I also spoke with Rapid7, an open-source security company that offers a range of tools for companies large and small to control and manage the security of their digital assets. Its portfolio includes Nexpose, a vulnerability scanner; Metasploit, a penetration testing tool; and Mobilisafe, a tool for mobile devices that “discovers, identifies and eliminates risks to company data from mobile devices”, according to the company. Overall, the company aims to provide “solutions for comprehensive security assessments that enable smart decisions and the ability to act effectively”, a tall order in a crowded security market.

I caught up with Druva, a company that develops software to protect mobile devices such as smartphones, laptops and tablets. Given the explosive growth in the numbers of end-user owned devices in companies today, this company has found itself in the right place at the right time. New features added to its flagship product inSync include better usability and reporting, with the aim of giving IT admins a clearer idea of what users are doing with their devices on the company network.

Networking
Enterasys – once Cabletron for the oldies around here – launched a new wireless system, IdentiFi. The company calls it wireless with embedded intelligence, offering wired-like performance with added security. The system can identify performance and identity issues, as well as user locations, the company says, and it integrates with Enterasys’ OneFabric network architecture, which is managed using a single database.

Management
The growth of virtualisation in datacentres has created a need to manage all those virtual machines, and a number of companies focusing on this problem have sprung up. Among them is vKernel, whose vOPS Server product aims to be an easy-to-use tool for admins; experts should feel they have another pair of hands to help them, was how one company spokesman put it. The company, now owned by Dell, claims the largest feature set in virtualisation management when its vKernel and vFoglight products are taken together, providing analysis, advice and automation of common tasks.


Filed under: Business, Cloud computing, data protection, Enterprise, mobile, Networking, Product, Product launch, Security, Servers, Storage, Systems management, Technology

Oracle buys Sun — but who really wins?

The big news this week is undoubtedly the $7.4 billion purchase of the troubled server company Sun Microsystems by database specialist Oracle. But, given the very different nature of the two companies, will it work?

Well-known in the industry for being the favourite of developers and geeks, and among its customers for its high-powered, reliable but expensive systems, Sun has nonetheless suffered financially since the implosion of the dotcom bubble. Its accounts have bled red for years, and selling the company seems for eons — that’s eons in IT years — to have been the only way out.

Just two weeks ago, IBM made overtures to buy the company. This author, among others, could see that there would be some synergies, although I struggled to see how Big Blue would swallow Sun’s server range, given that it already has a well-established and rational product portfolio. IBM and Sun would have fitted together mainly on the software side, where the acquisition of Solaris, a major platform in the database world, along with Java and many open source technologies including OpenOffice, would have sat comfortably alongside IBM’s espousal of open source, and its conversion from a hardware company to a software and services company.

It wasn’t to be. Sun demanded too much of IBM — more here — and the deal fell through. We wondered at the time how Sun could have let it happen, and accused the Silicon Valley stalwart of greed and complacency.

What we didn’t know was that it had another suitor in the wings, one willing to pay Sun’s pretty substantial asking price.

Early post-purchase signs are good. Most analysts and observers see more positives than negatives emerging from the deal. Oracle is a software company first and foremost, while Sun’s revenues stem mostly from hardware.

What’s more, Sun’s Solaris is a major platform for Oracle’s eponymous database, which means that Oracle can now offer the whole stack, from raw iron upwards, and so is in a better position to offer more tightly integrated solutions. As the company’s acquisition statement said: “Oracle will be the only company that can engineer an integrated system — applications to disk — where all the pieces fit and work together so customers do not have to do it themselves”.

Some systems integrators may suffer as a result, but that’ll be some way down the line, after two or three product refresh cycles.

The deal has even got some of the opposition thinking. As Colin Barker reports from an HP product launch in Berlin (which I was unable to make, sadly): “HP executives thought that the news was interesting and it was not difficult to see their internal calculators trying to work out any options the move would give them.”

So far so fitted.

But big questions remain to be answered. Sun has always been a fairly open company, and has always seen itself and wanted to be seen as part of a wider community. When open source came along, Sun gradually adopted it and, with no little external persuasion it seemed at the time, even made some of its own, expensively developed technology open source.

In complete contrast, Oracle has rarely if ever done that — apart perhaps from its development of its own version of Red Hat Linux, which the market has largely ignored. Oracle’s proprietary approach and eagerness to squeeze every last dollar out of its large enterprise customers is the stuff of legend.

This is unlikely to change, especially now that it can lock down those customers to a tightly integrated hardware platform. The reactions of those customers, of the competition, many of whom are in alliances with either or both the parties to the acquisition, and of the channel remain to be seen.

There will be layoffs too, given the economic situation, and the more obvious lack of need for duplicated sales, marketing or HR departments, for example. One analyst is reported to have predicted up to 10,000 job losses. I would expect the culture shock to squeeze quite a few through the out door.

But if you’re a customer, you might prefer not to be locked in. If you’re a hardware partner of Oracle’s, you’re likely to be re-thinking that deal, big time. HP is in that boat, given that it co-developed servers for Oracle, in the database company’s first venture into hardware, back in 2008. And if you either work for Sun or are one of the developer community in Sun’s orbit, you might well find yourself wondering where to go next, whether voluntarily or not.

My take is that most customers will stay put. It’s not the time to start launching into expensive new IT roll-outs. That’s not to say that those with an aversion to single-supplier deals won’t bail as soon as possible.

However, the pressure on the competition in the current climate is likely to result in more mergers and acquisitions, and a jungle populated by fewer but bigger beasts.

But who and which? Here are some questions: will IBM swallow EMC? Will Cisco buy Brocade? And could Microsoft finally buy Yahoo!? And how many more yachts will this deal enable Oracle CEO Larry Ellison to buy?

Filed under: mergers & acquisitions, operating systems, Servers

New HP servers take battle to Cisco

HP has today launched a swathe of servers in multiple form factors — rack, blade and tower — driven by Intel’s latest processor architecture, codenamed Nehalem.

But there’s much more to it than that.

Time was when server companies, especially those such as HP, which analysts say has the biggest server market share, would boast and blag about how theirs were the biggest and fastest beasts in the jungle.

No longer. Instead, HP put heavy emphasis on its management capabilities. That’s a shot fired across the bows of network vendor Cisco, which just two weeks ago unveiled a new unified computing initiative, at whose core is a scheme to manage and automate the movement of virtual machines and applications across servers inside data centres. Oh yes, there’s a server in there too — a first for fast-diversifying Cisco.

But this is a sidetrack: back to HP’s launch of the ProLiant G6. Performance was mentioned once in the press release’s opening paragraph — they’re twice as quick, apparently — but when he spoke to me, European server VP Christian Keller focused almost entirely on manageability, and performance per watt.

“We have 32 sensors that give health information about temperatures and hotspots. Unlike our competitors, we don’t just control all six fans together — we can control them separately using advanced algorithms. These are based on computational fluid dynamics and are implemented in a chip, so it works even if the OS is changing — for example during virtualisation moves,” he said.

Keller went on to talk about how the servers’ power draw can be capped, again using hardware-based algorithms, which means that a server that’s been over-specified for the purposes of future-proofing won’t draw more power than it needs.
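HP didn't detail the algorithm, but the shape of any power-capping scheme is a feedback loop: measure draw, compare against the cap, and nudge the processor's operating point accordingly. A toy proportional controller to illustrate the idea (purely my sketch; the real logic lives in firmware):

```python
# Toy power-capping control loop: if measured draw exceeds the cap, step
# the CPU frequency down; if there is comfortable headroom, step it back
# up towards the maximum clock. Illustrative only -- not HP's algorithm.

MAX_FREQ_MHZ = 2933.0  # hypothetical top clock for the example

def adjust_frequency(freq_mhz: float, measured_watts: float,
                     cap_watts: float, step: float = 0.05) -> float:
    if measured_watts > cap_watts:
        # Over the cap: throttle by 5% to bring draw back down.
        return freq_mhz * (1 - step)
    if measured_watts < 0.9 * cap_watts:
        # Well under the cap: recover performance, up to the max clock.
        return min(freq_mhz * (1 + step), MAX_FREQ_MHZ)
    # Within the 90-100% band: hold steady.
    return freq_mhz
```

Doing this in hardware rather than in the OS is what lets the cap hold even while virtual machines are migrating, as Keller described.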

The result, Keller went on, is that “you can use the data centre better and pack more servers into the same space.” The bottom line is that the organisation reaps big total cost of ownership savings, he reckoned, although with finance very tight, he said that quick payback was at the top of mind of his customers.

“Customers are looking for faster payback today due to recession,” he said. “With HP, you need fewer servers to do the same amount of work and payback is achieved in around 12 months.” And there’s a bunch of slideware to back up his claims. You can get more on the products here.

Management software
HP’s keen to make more of its data centre management software — during a recent conversation, one HP exec said he reckoned the company had indulged in stealth marketing of its software portfolio.

And it’s true that HP’s new raft of software, much of it launched over six months ago and based on Systems Insight Manager, has barely been mentioned outside conversations with HP’s customers. It covers a wide range of functionality, enabling data centre managers to manage partitions within and across blades, which can be in the same chassis or in separate chassis — depending on what you want to do.

I saw a demo of the system and it was impressive. One of the core modules is the Capacity Advisor, which allows what-if planning so you can size your hardware requirements. It includes trending out into the future – a feature from HP’s HP-UX platform that is now available on x86. It not only allows the manager to size systems for both current and future use, it automatically checks how well the sizing operation matches reality.
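At its simplest, trending means fitting a line to historical utilisation and extrapolating it forward. A minimal sketch of that idea, using an ordinary least-squares fit (Capacity Advisor is far more sophisticated; this only illustrates the what-if concept):

```python
# Minimal trend-based capacity forecast: fit a straight line to monthly
# utilisation figures and extrapolate. A sketch of the idea only, not
# HP Capacity Advisor's actual model.

def forecast(history: list, months_ahead: int) -> float:
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    # Ordinary least-squares slope and intercept.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    # Extrapolate from the last observed month.
    return intercept + slope * (n - 1 + months_ahead)

# Utilisation grew from 40% to 55% over six months; where is it in a year?
forecast([40, 43, 46, 49, 52, 55], 12)
```

The "checks how well the sizing matches reality" part is the interesting addition: comparing last month's forecast against what actually happened closes the loop on the model.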

Virtualisation Manager adds a view of all resources and virtual machines, and can display application resource utilisation inside VMs, while Global Workload Manager allows you to change priorities depending on which application is the most critical. So backup gets resources when the payroll cheque run is finished, for example. There’s lots more to it, so you can find out more here.

This isn’t intended to be a serious review of HP’s systems management software — I didn’t spend nearly enough time with it for that. However, amid the noise surrounding VMware and Microsoft, a host of third parties vying for position as top dog in the data centre management space, and the brouhaha surrounding Cisco’s recent launch, HP has quietly got on with developing what looks like a seriously useful suite of software.

Apart from a press release six months ago, the company just hasn’t told many people about it.

Filed under: Product launch, Servers, Systems management