Oracle Rises to the Big Data ETL Challenge

As demand for Big Data applications continues to increase, developers are being forced to confront some fundamental issues associated with traditional extract, transform and load (ETL). Simply put, the ETL processes and technology they have in place were never designed to handle loading massive amounts of data into an application.

According to Brad Adelberg, vice president of development for Oracle, the time has come to put legacy ETL tools aside in favor of the company's modern Oracle Data Integrator (ODI) technology, which is orders of magnitude faster. To underscore that point, Oracle today released Oracle Data Integrator 12c, which takes greater advantage of multicore processors to load data in parallel. In addition, Oracle released Oracle GoldenGate 12c, a data integration toolset it acquired in 2009 that is now tightly integrated with Oracle Data Integrator 12c.

Adelberg says that Oracle Data Integrator is not only faster; it also presents developers with a simple declarative interface that is easy to master. Oracle has optimized its data integration tools for Oracle databases and, as of today, provides tighter integration with its Oracle Warehouse Builder tools. But Adelberg says developers can use ODI to target any database environment they choose, including Big Data platforms such as Hadoop, Hive, HDFS and the Oracle Big Data Appliance.

Oracle has been making the case for a more modern approach to ETL for several years now. But with the advent of Big Data, Adelberg says, a seminal moment is approaching that will finally force the legacy ETL issue. Most organizations are not going to be able to hire armies of developers simply to load data; Adelberg says they will need an approach that automates the ETL function at scale.

Big Data represents a massive opportunity for developers, and most organizations are going to rely on APIs to expose that data. But before anybody can really take advantage of Big Data, Adelberg says, developers are going to need a much more efficient way to load data into their applications.

Recent surveys suggest that about half of IT organizations will be spending $10 million or more on Big Data projects by 2016. Clearly, developers who can build Big Data applications will be in high demand. But none of that activity will prove rewarding if developers are spending all their time waiting for data to load.
