IBM Expands Human-Machine Interaction with New Watson APIs

IBM continues to enhance Watson's senses by expanding its cognitive APIs. Recently, IBM announced new and enhanced Watson APIs that extend the boundaries of human-machine interaction. API access to these features allows hardware and software systems across the board to take advantage of the advances. IBM's latest Watson enhancements include three new APIs in beta (Tone Analyzer, Emotion Analysis, and Visual Recognition), and an updated and re-released Text to Speech (TTS) engine (now called Expressive TTS).

The Tone Analyzer API gives users insights into the tone contained within a portion of text. The Tone Analyzer recognizes emotions (e.g., joy, disgust, fear, sadness) and social propensities (e.g., extraversion, emotional range). Expanding beyond previous releases, the updated Tone Analyzer has progressed from single-word analysis to sentence-level analysis, providing more contextual understanding.
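To make the idea concrete, here is a minimal Python sketch of how an application might pick the dominant tone out of a Tone Analyzer-style response. The response layout and field names (`tone_categories`, `tones`, `tone_name`, `score`) are assumptions for illustration, not a guaranteed reproduction of the documented Watson schema.

```python
# Hypothetical sketch: find the strongest tone in a Tone Analyzer-style
# response. The structure of `sample_response` is an illustrative
# assumption, not the official Watson response schema.

def dominant_tone(response):
    """Return the (tone_name, score) pair with the highest score."""
    scores = {
        tone["tone_name"]: tone["score"]
        for category in response["tone_categories"]
        for tone in category["tones"]
    }
    return max(scores.items(), key=lambda item: item[1])

sample_response = {
    "tone_categories": [
        {
            "category_name": "Emotion Tone",
            "tones": [
                {"tone_name": "Joy", "score": 0.72},
                {"tone_name": "Sadness", "score": 0.08},
            ],
        },
        {
            "category_name": "Social Tone",
            "tones": [{"tone_name": "Extraversion", "score": 0.41}],
        },
    ]
}

print(dominant_tone(sample_response))  # ('Joy', 0.72)
```

An application could use a helper like this to route a support ticket differently when, say, sadness or anger dominates the detected tones.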

The Emotion Analysis API has been expanded to detect more than simple sentiment (e.g., positive vs. negative). The API now detects a broader range of emotions (e.g., joy, fear, sadness, disgust, anger). IBM expects the API to be used to better review and analyze a wide range of expressive venues, including customer reviews, surveys, and social media posts.
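As an illustrative sketch of the review-analysis use case above, the snippet below labels a piece of text with its strongest detected emotion, falling back to "neutral" when no score is convincing. The score dictionary and the 0.5 threshold are assumptions for demonstration, not values defined by the Emotion Analysis API.

```python
# Illustrative sketch: label text by its strongest emotion score.
# The emotion names match those the article lists; the score values
# and threshold are invented for this example.

def strongest_emotion(scores, threshold=0.5):
    """Return the top-scoring emotion, or 'neutral' if none clears the threshold."""
    emotion, score = max(scores.items(), key=lambda item: item[1])
    return emotion if score >= threshold else "neutral"

review_scores = {"joy": 0.81, "anger": 0.05, "fear": 0.02,
                 "sadness": 0.04, "disgust": 0.03}
print(strongest_emotion(review_scores))  # joy
```

A survey pipeline could apply this per response to aggregate, for example, how often anger outweighs joy across a product's reviews.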

The Visual Recognition API allows users to train Watson on a custom set of images. Alternative visual search engines on the market are limited by pre-determined tags and classifications. Watson's Visual Recognition API lets users present training material to the platform, which then learns to classify new material accordingly. This ability to learn through training mimics the way Watson's natural language classification feature works.
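The custom-training workflow can be sketched as follows: labeled example images are grouped per class before being submitted to the service. The helper and payload layout here are illustrative assumptions, not the official SDK interface.

```python
# Hedged sketch: group (label, image_path) pairs into per-class example
# lists, the kind of structure a custom classifier is trained from.
# The function name and layout are illustrative, not Watson's API.

def build_training_set(examples):
    """Group (label, image_path) pairs into per-class example lists."""
    classes = {}
    for label, path in examples:
        classes.setdefault(label, []).append(path)
    return classes

training = build_training_set([
    ("dog", "images/dog_01.jpg"),
    ("dog", "images/dog_02.jpg"),
    ("cat", "images/cat_01.jpg"),
])
print(training)
# {'dog': ['images/dog_01.jpg', 'images/dog_02.jpg'], 'cat': ['images/cat_01.jpg']}
```

Because the classes come from the user's own labels rather than pre-determined tags, the same code works whether the classifier distinguishes pets, products, or defect types.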

Expressive TTS, now generally available, expands Watson's existing TTS capabilities with emotional IQ. Emotional IQ allows systems to create and deliver adaptive emotion within vocal interactions. What does this mean? Apps and systems integrated with Expressive TTS not only understand natural language, but also the tone and context associated with that language. Further, Expressive TTS enables systems to respond with the appropriate nuance and inflection.
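To show what "responding with the appropriate nuance" might look like at the request level, here is a minimal sketch that wraps text in SSML-style expressive markup before synthesis. The `<express-as>` tag and the style name used here are illustrative assumptions about the markup, not a verbatim copy of the Expressive TTS documentation.

```python
# Minimal sketch: wrap text in SSML-like expressive markup so the
# synthesized voice carries an intended emotional style. Tag and style
# names are assumptions for illustration.

def express(text, style):
    """Wrap text in express-as markup for a given speaking style."""
    return f'<express-as type="{style}">{text}</express-as>'

ssml = "<speak>" + express("Your order has shipped!", "GoodNews") + "</speak>"
print(ssml)
```

A customer-service app could pair this with tone detection: an apologetic style for a complaint response, an upbeat one for good news.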

In addition to the new APIs, IBM continues to release developer tools for accessing Watson capabilities. IBM has published new SDKs for Node, Java, Python, iOS Swift, and Unity. Further, new Application Starter Kits streamline developers' use of Watson.

