Companies face a big data dilemma, says Teradata CTO

Each week we ask high-profile technology decision-makers three questions.

Image caption: Big data is posing a huge analytics problem, says Stephen Brobst

This week it is Stephen Brobst, chief technology officer (CTO) of Teradata.

Teradata is a technology company that specialises in data warehousing and analytic applications. The Ohio-based company has over 900 customers worldwide, and had revenue of $1.9bn in 2010.

What's your biggest technology problem right now?

I guess first of all when you think of problems, to an entrepreneur there's no such thing as problems - there's only opportunities.

If I were to put it in that light, we can look at the advancement in central processing unit (CPU) processing power over the last 30 years.

Moore's law basically allows you to double processing power every 18 months for the same cost. This has given us a roughly five-million-fold improvement in CPU processing power.
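As a rough back-of-the-envelope check (the exact multiple depends on the doubling interval you assume - a doubling every 18 months over 30 years gives about one million, while a slightly shorter interval yields the five-million figure), the compounding works out like this:

```python
# Rough illustration of compounding under Moore's law; the 18-month
# doubling interval is the figure quoted above, not a measured value.
years = 30
months_per_doubling = 18
doublings = years * 12 / months_per_doubling   # 20 doublings in 30 years
growth = 2 ** doublings                        # 2^20
print(f"{doublings:.0f} doublings -> ~{growth:,.0f}x")  # 20 doublings -> ~1,048,576x
```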

At the same time input/output (I/O) from disk drives has only increased by a factor of five, so there's a huge imbalance between CPU processing power and I/O delivery rates from disk drives.

The disk drive manufacturers are increasing the density of storage on the disk drives, but they're not speeding up the disk drives at nearly the same rate. So the performance from an I/O perspective is decreasing on a per gigabyte basis at the same time as CPU power is increasing.
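The per-gigabyte decline can be made concrete with hypothetical numbers (these are illustrative, not actual drive specifications): if capacity grows 100-fold while the I/O rate grows only five-fold, I/O per gigabyte falls 20-fold.

```python
# Hypothetical drive generations illustrating the per-gigabyte I/O
# decline described above (not real specifications).
old_capacity_gb, old_iops = 10, 100     # older, smaller drive
new_capacity_gb, new_iops = 1000, 500   # 100x the capacity, only 5x the I/O rate

old_ratio = old_iops / old_capacity_gb  # 10 I/Os per second per gigabyte
new_ratio = new_iops / new_capacity_gb  # 0.5 I/Os per second per gigabyte
print(old_ratio / new_ratio)            # 20x decline in I/O per gigabyte
```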

This leads to a dramatic imbalance in terms of computational configuration of processing architectures.

So that's a big problem that we at Teradata are addressing using what we call a virtual storage architecture, which allows us to combine in-memory processing with solid state disk drives and electro-mechanical drives, because it's the electro-mechanical drives that are not keeping up with the CPUs.

On the other hand, memory and solid state drives (SSDs) are more expensive per terabyte, so it doesn't make economic sense to store all of your data in memory or on SSDs.

So what we have done is allow the data to be spread across multiple technologies, so you can optimise both price and performance at the same time.
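A minimal sketch of this placement idea - hot, frequently accessed data on fast, expensive tiers and cold data on cheap electro-mechanical disk. The tier names and access-frequency thresholds here are hypothetical, and this is an illustration of the general technique, not Teradata's actual virtual storage algorithm:

```python
# Illustrative multi-temperature data placement: assign each data block
# to a storage tier by how often it is accessed. Thresholds are made up.
def assign_tier(accesses_per_day: int) -> str:
    if accesses_per_day >= 1000:
        return "in-memory"   # hottest data: fastest, most expensive tier
    if accesses_per_day >= 50:
        return "ssd"         # warm data: solid state
    return "hdd"             # cold data: cheap electro-mechanical disk

# Hypothetical data blocks and their daily access counts.
blocks = {"orders_today": 5000, "last_quarter": 200, "archive_2005": 1}
placement = {name: assign_tier(freq) for name, freq in blocks.items()}
print(placement)
# {'orders_today': 'in-memory', 'last_quarter': 'ssd', 'archive_2005': 'hdd'}
```

In a real system the "temperature" of a block changes over time, so placement would be re-evaluated continuously rather than decided once.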

What's the next big tech thing in your industry?

The next big tech thing is pretty clearly centred around the idea of big data. Not just the bigness of it - the volume of it - but also the diversity of it.

So with social media coming into play, with sensor data coming into play, the nature of data is changing dramatically.

It used to be essentially rows and columns of records being stored that were the result of doing business. But now data is the basis for creating new businesses. And this data comes in many different forms.

We believe the diversity of data, as well as the volume of data, is going to completely change the nature of analytics.

In the future, every object will know its location, will know its temperature, will know the humidity. All the vital signs for an object.

And that object might be a human by the way, for healthcare sensing. It might be an object in the supply chain so you know where the object is in your manufacturing plant.

For anything that's shipped from one place to another, sensor devices will allow the collection of huge amounts of data that, by its very nature, needs and wants to be analysed to drive better decisions.

So this big data, which is volumes of data but also diversity of data, is clearly the next big thing in our industry.

Image caption: While CPU processing power has increased massively over the last 30 years, disk drives have lagged far behind, says Stephen Brobst

What's the biggest technology mistake you've ever made - either at work or in your own life?

A number of years ago, Intel, in a joint venture with Hewlett-Packard, developed a new chip architecture called Itanium.

I will take partial - should I say - discredit for the investments Teradata made in porting our software onto this new chip architecture.

It was supposed to be the next generation and take over the world in processing architectures and so on.

In the end it was dead on arrival.

They had schedule delays, there were a lot of issues between HP and Intel and in the end it was a dead chip. We wasted a lot of money porting our software onto Itanium.

Luckily we recovered pretty quickly and diverted those resources once it became clear this technology wasn't going to go anywhere.

But we clearly got diverted by this Itanium chip, which was not such a good decision and I will certainly take part of the blame for that decision.
