The Makers of Hardware and Software for Data Centers Join Hands to Usher in Web 3.0
The database industry (or data center computing, as insiders prefer to call it) seems to have grown weary of its tradition of assembling the computers for a new data center bit by painful bit, with each piece sourced from a different company and then coaxed into working with the rest. Part of the hardware comes from one vendor, the rest from another, and the software from a third. That is what all the high-profile acquisitions of late are about: Oracle, the database software maker, has bought Sun; Microsoft's server and database software division has signed a pact with Hewlett-Packard; and Cisco, the server and data center equipment maker, has teamed up with EMC. The megacorporations whose database software businesses would be far less painful to run on their own hardware configurations have finally taken the plunge and gone the Apple route of tight hardware-software integration. Data management is an overwhelming task these days, with information pouring in from the Internet, from smartphones, and from sensors everywhere. And with intelligent Internet-connected applications running every hospital, research facility, and transportation operation, data handling has become so expensive that the last thing anyone wants is to fuss with the nuts and bolts of the basic setup that runs the whole show.
While this is basically the right idea, experts feel that a hardware company and a software company merely shaking hands on "better cooperation" is hardly going to cut it; the utility of such pacts would be minimal at best. What the database software makers need to realize is that they have the bigger contribution to make here. With all the talk of Web 3.0 these days, the focus of running a network is shifting from merely managing a database and serving up information on request to running software that can understand data and interpret it intelligently. Computing science isn't quite there yet, but having huge corporations fully recognize that this is where the future lies should be quite a step forward.
All the acquisitions and partnerships do make sense here, because parallel computing is where Web 3.0 is headed. If every database search request could be broken down into small tasks, and each task processed separately and concurrently with the others, far more power and speed could be funneled into each operation. So far, hardware makers have mostly built general-purpose chips, ones meant to be as good at handling server workloads as at video processing. Those are two entirely different jobs and need to be treated as such; massively multicore computing, experts say, is where the answer lies. Makers of hardware and software for data centers should also recognize that managing databases is labor-intensive and costs far more than any investment in their products. Find a way to automate that work, and you have a great new competitive advantage.
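To make the idea concrete, here is a minimal sketch of that scatter-gather pattern in Python: a search request is fanned out to several data shards in parallel and the partial results are merged. The shard names and the search_shard function are hypothetical stand-ins for illustration, not any vendor's actual API.

    # Scatter-gather sketch: fan one query out across shards concurrently,
    # then merge the partial hit lists. All names here are illustrative.
    from concurrent.futures import ProcessPoolExecutor

    SHARDS = ["shard-0", "shard-1", "shard-2", "shard-3"]

    def search_shard(shard, query):
        # Placeholder for a real per-shard index lookup; a production
        # system would scan only this shard's slice of the data.
        return [f"{shard}: hit for {query!r}"]

    def parallel_search(query):
        # Each shard is searched on its own core at the same time, so
        # total latency is roughly that of the slowest shard rather
        # than the sum of all of them.
        with ProcessPoolExecutor(max_workers=len(SHARDS)) as pool:
            partials = pool.map(search_shard, SHARDS, [query] * len(SHARDS))
        # Gather: flatten the per-shard results into one answer.
        return [hit for partial in partials for hit in partial]

    if __name__ == "__main__":
        print(parallel_search("web 3.0"))

The point of the sketch is the shape of the computation: the query splits into independent pieces, each piece runs on its own core, and only the final merge is serial, which is exactly the workload massively multicore chips are built for.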
Companies like IBM are already on their way there: IBM's eight-core POWER7 processors, which can run 32 simultaneous threads per chip, are aimed squarely at server and database applications. The applications are endless, from automating utility power grids to finance, research centers, and MRI computing. And the answer to all of it lies in making computers more intelligent, intelligent enough to understand and interpret data on the fly.