Google Debuts Coral to Speed Up Local AI Development

Google this week made Coral available to developers as a public beta. Google built Coral to expand the reach of on-device Artificial Intelligence. Running AI calculations locally, rather than on a server, removes network round-trips, cutting latency and improving turnaround time for results.

Google calls Coral a "complete local AI toolkit" that can help developers take their idea from simple sketches to scaled production. The kit includes both software and hardware modules, as well as content so developers can "create, train, and run neural networks locally." In this case, locally means on a device such as a phone or PC. Giving neural networks space to run on the device not only improves speed, it also improves privacy by keeping all the data in one spot. Google says Coral is power efficient, so even heavy number crunching shouldn't drain device batteries. Moreover, Google says Coral's individual components were created with rapid prototyping and simplified scaling in mind -- all the way to production lines if warranted.

The initial hardware component is an Edge TPU, a tiny ASIC that Google designed to provide high-performance Machine Learning (ML) inferencing in a highly efficient manner. Google says the Edge TPU can, for example, execute mobile vision models such as MobileNet V2 at more than 100 frames per second. The $150 Coral Dev Board is built around this chip.

Google says it designed the Coral Dev Board as an integrated system. It is composed of a carrier board with a system-on-module (SoM) attached. It packages together the Edge TPU with an NXP i.MX 8M SoC, eMMC memory, RAM, Bluetooth, and WiFi. A separate camera, which connects to the Dev Board via a MIPI interface, is available as an add-on for computer vision prototypes.

The $75 Coral USB Accelerator lets developers add the Edge TPU to existing Linux systems via USB 2.0 or 3.0. Google says PCIe versions are on the way, too, which will snap into M.2 or mini-PCIe slots. 

On the software side of things, Google adopted TensorFlow and TensorFlow Lite. Google indicated that TF Lite models must be quantized and then compiled with its toolchain in order to work properly with the Edge TPU. Google put together more than a dozen models for developers to use as blueprints with the Coral Dev Board. Coral can be used with Google Cloud IoT and combined with other cloud services with an on-device software stack for edge computing and ML capabilities.
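To make that workflow concrete, the sketch below shows the general shape of the two steps described above: a trained TensorFlow model is converted to a quantized TensorFlow Lite model, then compiled for the Edge TPU with Google's Edge TPU Compiler. File and directory names here are placeholders, and full-integer quantization usually requires extra calibration settings beyond what the command line expresses, as the comments note.

```shell
# Hedged sketch of the Coral model-preparation workflow. File names are
# placeholders; tflite_convert ships with TensorFlow and edgetpu_compiler
# must be installed separately.

# 1. Convert a trained TensorFlow SavedModel to a TensorFlow Lite model.
#    (Full integer quantization for the Edge TPU also needs a representative
#    dataset for calibration, which is configured via the Python converter
#    API rather than on the command line.)
tflite_convert \
  --saved_model_dir=./my_model \
  --output_file=./model_quant.tflite

# 2. Compile the quantized model for the Edge TPU with Google's toolchain.
#    The compiler writes an _edgetpu-suffixed model alongside the input.
edgetpu_compiler model_quant.tflite
```

The resulting Edge TPU model is then loaded on the device like any other TF Lite model, with the Edge TPU handling the compiled operations.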

"AI can be beneficial for everyone, especially when we all explore, learn, and build together," said Billy Rutledge, director of the Coral team, and Vikram Tank, product manager, in a blog post. "Google's been developing tools like TensorFlow and AutoML to ensure that everyone has access to build with AI."

Google says its beta Coral products are already available, complete with documentation, datasheets, and sample code. As with all Google betas, the company hopes developers will provide feedback on this early version of the software.
