Amazon Kinesis Handles Large-Scale, Real-Time Data Processing

Yesterday at the 2013 AWS re:Invent Conference, Amazon Web Services (AWS) announced Amazon Kinesis, a fully managed service designed to handle real-time streaming and high-volume data processing at any scale. The announcement comes on the heels of the launch of CloudTrail, a new web service that records API calls made to AWS accounts and delivers the resulting log files to an Amazon S3 bucket.

Amazon Kinesis

Image Credit: Amazon Web Services

Amazon Kinesis is designed to support applications that need to process large-scale, real-time data. Built on AWS, Amazon Kinesis can handle any amount of data from any number of sources, scaling up or down as needed. Because Amazon Kinesis does not rely on batch-based processing techniques, the service can process real-time, rapidly changing streams of data as they arrive. Terry Hanold, AWS vice president of new business initiatives, commented on batch-based data processing in the press release:

"Database and MapReduce technologies are good at handling large volumes of data. But they are fundamentally batch-based, and struggle with enabling real-time decisions on a never-ending—and never fully complete—stream of data. Amazon Kinesis aims to fill this gap, removing many of the cost, effort and expertise barriers customers encounter with streaming data solutions, while providing the performance, durability and scale required for the largest, most advanced implementations."

Amazon Kinesis can be used via the AWS Management Console, the AWS command line interface or the Kinesis API. The Amazon Kinesis API is one of the Analytics APIs available from AWS; other AWS APIs in the Analytics category include Amazon Elastic MapReduce and AWS Data Pipeline.
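Under the hood, a Kinesis stream is divided into shards, and each record carries a partition key that Kinesis MD5-hashes to decide which shard receives it. As a rough illustration only (the function and record names below are invented for this sketch, not part of the Kinesis API), the key-to-shard mapping works something like this in Python:

```python
import hashlib

def shard_for_key(partition_key: str, num_shards: int) -> int:
    """Illustrative sketch of Kinesis-style record placement:
    MD5-hash the partition key to a 128-bit integer, then map it
    into one of num_shards equal hash-range buckets."""
    h = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    bucket_size = 2 ** 128 // num_shards
    # Clamp in case the hash lands in the final, slightly larger bucket.
    return min(h // bucket_size, num_shards - 1)

# Hypothetical stream of sensor readings keyed by device ID.
records = [("sensor-1", 21.4), ("sensor-2", 19.8), ("sensor-1", 22.1)]
for key, value in records:
    print(f"{key} -> shard {shard_for_key(key, num_shards=2)}")
```

Because the mapping is deterministic, all records with the same partition key land on the same shard, which preserves per-key ordering while letting the stream scale out across shards.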

Currently, Amazon Kinesis is only available as a Limited Preview. Developers can sign up for the waiting list to use Amazon Kinesis at the AWS website.
