How to Start Using the Google Cloud Natural Language API

The last couple of years have seen a large number of organizations and developers rush to get familiar with Machine Learning fundamentals and come to grips with what it takes to integrate it into their applications. While you can certainly build out your own Machine Learning platform, that is not for everyone, and companies like Google are now releasing fully managed API platforms that expose the Machine Learning infrastructure they have built over the years. The main value to potential users is that these companies have trained their Machine Learning models for years, and the best of these services can now be had with a single API call.

The latest offering from Google is the Cloud Natural Language API which gives developers insights into unstructured text. It currently focuses on 3 key pieces of information for any text:

  • Entity Recognition: This feature extracts the key entities in the text. It can currently identify various types of entities, including landmarks, public figures and more. Take a look at the list of Entity Types.
  • Sentiment Analysis: This feature helps to identify the overall sentiment in the text. We have seen several Sentiment Analysis APIs emerge over the last few years, with a popular usage being to analyse sentiment on social media networks; brands in particular use this to gauge customer sentiment.
  • Syntax Analysis: This feature breaks down your text into tokens, identifies parts of speech and the relationships between them.

Language API - Endpoints and Methods

A REST API is available to invoke the above functionality and we are going to deep dive into the Sentiment Analysis part of the API to first understand how it works and then build out a Slack Team helper that decodes the sentiment of the text provided to it.

The API endpoint is available at https://language.googleapis.com and the current version is v1beta1, with the API discovery document available at https://language.googleapis.com/$discovery/rest?version=v1beta1

The endpoint provides 3 methods: analyzeEntities, analyzeSentiment and annotateText, which map to the 3 features we mentioned earlier (annotateText covers Syntax Analysis and can also combine the other features in a single call).
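For reference, each method is invoked as a POST against the endpoint's documents collection. The small helper below shows how the request URLs are formed; the helper name is my own, not part of the API:

```python
# The three methods are invoked as POST requests against the
# documents collection of the v1beta1 endpoint.
BASE_URL = 'https://language.googleapis.com/v1beta1'

def method_url(method):
    """Build the request URL for a Language API method,
    e.g. 'analyzeSentiment' -> '.../documents:analyzeSentiment'."""
    return '%s/documents:%s' % (BASE_URL, method)

for m in ('analyzeEntities', 'analyzeSentiment', 'annotateText'):
    print(method_url(m))
```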

Sentiment Analysis

The Sentiment Analysis method analyzeSentiment accepts a JSON document as input, a representation of which is given below:

 {
   "type": enum(Type),
   "language": string,

   // Union field source can be only one of the following:
   "content": string,
   "gcsContentUri": string
   // End of list of possible types for union field source.
 }

The key thing to notice is that you can provide either the content as a string or you could point to a Google Cloud Storage URI, which could be a file that you have uploaded with its content as the text that you want to perform Sentiment Analysis on.
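To make the two variants concrete, here is what the request body could look like in Python for each source. The bucket and file names below are hypothetical:

```python
# Two equivalent ways to specify the source text for analyzeSentiment.
# The request body carries exactly one of 'content' or 'gcsContentUri'.

# Variant 1: inline text content
inline_doc = {
    'document': {
        'type': 'PLAIN_TEXT',
        'language': 'en',  # optional; the API can auto-detect the language
        'content': 'The service was quick and the staff were friendly.',
    }
}

# Variant 2: text stored in a Google Cloud Storage object
# (the bucket and object names here are hypothetical)
gcs_doc = {
    'document': {
        'type': 'PLAIN_TEXT',
        'gcsContentUri': 'gs://my-bucket/reviews/review-001.txt',
    }
}
```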

The response returned is a JSON object and the specific element that we are interested in checking is given below:

 {
   "polarity": number,
   "magnitude": number
 }

As per the definition, the polarity element is a value between -1.0 and 1.0, where a higher number indicates a more positive sentiment. The magnitude field is interesting: it is a number ranging from 0 to +infinity and represents the overall strength of sentiment in the text, irrespective of polarity, so each expression of sentiment (whether positive or negative) contributes to the magnitude.

What this means is that you should use both magnitude and polarity to determine whether the text carries a positive, neutral or negative sentiment. Refer to the section on Interpreting Sentiment Analysis values in the official documentation to better understand how you would use this.
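As an illustration, a simple classifier combining the two values might look like the following. The thresholds are hypothetical and should be tuned for your use case; they are not prescribed by the documentation:

```python
def classify_sentiment(polarity, magnitude,
                       polarity_cutoff=0.25, magnitude_cutoff=0.5):
    """Illustrative heuristic combining polarity and magnitude.

    The cutoff values are hypothetical -- tune them against your own data.
    A low magnitude means little emotional content either way, so we call
    the text neutral regardless of polarity; otherwise the sign of the
    polarity decides positive vs negative.
    """
    if magnitude < magnitude_cutoff or abs(polarity) < polarity_cutoff:
        return 'neutral'
    return 'positive' if polarity > 0 else 'negative'

print(classify_sentiment(0.8, 1.1))   # strongly positive text
print(classify_sentiment(0.1, 0.2))   # too weak either way -> neutral
```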

Sample program with Google Cloud Project

Let us take the Sentiment Analysis feature through its paces. Our goal here will be to get familiar with the basic configuration steps and the minimal code that we need to write to incorporate the Sentiment Analysis API in our application. We will be using the Python language in our sample application. The goal of the program is to take some text, invoke the Sentiment Analysis API on it and return the sentiment polarity and magnitude.

The first step is to setup a Google Cloud Platform project. Follow the steps given below:

Create a Google Cloud Platform project

Visit GCP Cloud Developer Console and click on Create Project as shown below:

This will bring up a dialog where you can give a name to your project and then click on Create. This will create a Google Cloud Platform project. You will need to wait for a few seconds for the project to be initialized.

Enable the API and download the JSON key

We need to enable the Cloud Natural Language API for the project next. From the navigation menu on the left, click on API Manager as shown below:

Click on Library and then start entering the words “Cloud Natural…” in the edit box as shown below.

This will show the link to the Google Cloud Natural Language API. Click on that to bring up the API details. You will see a button to Enable the API. Click on it and wait a few moments for the API to be enabled.

With the API enabled, the next thing to do is to get the credentials for accessing this service. Since we will be invoking this service from a standalone program running outside of the Google Cloud Platform environment, we will be using a Service Account key, and we need to get the credentials for that, which are provided as a downloadable .json file. To get this file, do the following:

Click on Credentials and then Create Credentials as shown below:

Select Service account key as shown below.  

This will bring up a form to provide the service account details. Enter a name for the Service Account and select Project → Viewer as the role.

Click on Create. This will download a .json file to your local machine. Keep the file safe; we will need it when we run the application.

This completes the configuration of the Google Cloud Platform. The next step is to write a basic program to exercise the Sentiment Analysis feature of Google Cloud Natural Language API.

Sample program

The sample program is a modification of a code listing that is available at the official documentation page.

The code listing is given below:

from googleapiclient import discovery
import httplib2
from oauth2client.client import GoogleCredentials

DISCOVERY_URL = ('https://language.googleapis.com/'
                 '$discovery/rest?version=v1beta1')


def main():
    http = httplib2.Http()

    # Application Default Credentials pick up the Service Account JSON key
    # via the GOOGLE_APPLICATION_CREDENTIALS environment variable.
    credentials = GoogleCredentials.get_application_default().create_scoped(
        ['https://www.googleapis.com/auth/cloud-platform'])
    credentials.authorize(http)

    # Build the Language API service from the discovery document.
    service = discovery.build('language', 'v1beta1',
                              http=http, discoveryServiceUrl=DISCOVERY_URL)

    service_request = service.documents().analyzeSentiment(
        body={
            'document': {
                'type': 'PLAIN_TEXT',
                'content': "Google Cloud Natural Language API rocks. It works well."
            }
        })

    response = service_request.execute()
    polarity = response['documentSentiment']['polarity']
    magnitude = response['documentSentiment']['magnitude']
    print('Sentiment: polarity of %s with magnitude of %s' % (polarity, magnitude))
    return 0


if __name__ == '__main__':
    main()

To run this code, ensure that you set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the JSON key file that you downloaded in the previous step as follows:

export GOOGLE_APPLICATION_CREDENTIALS=<json key file path>

A sample run of the program is shown below:

$ python
Sentiment: polarity of 1 with magnitude of 1.1

The key parts of the code listing are explained below:

  • We use Application Default Credentials to authenticate ourselves to the service. This mechanism locates the Service Account's JSON key file via the environment variable GOOGLE_APPLICATION_CREDENTIALS, which we set in the previous step.
  • Once the authentication is done, we generate the service via the discovery endpoint and invoke the analyzeSentiment method on it.
  • The result is then parsed and we print out the magnitude and polarity of the text. Notice that we have used a hard-coded text here for demonstration purposes.
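As a small next step, the hard-coded string can be swapped for text supplied on the command line. A minimal helper for this is sketched below; the function name is my own:

```python
import sys

def get_input_text(argv,
                   default='Google Cloud Natural Language API rocks. It works well.'):
    """Join any command-line arguments into the text to analyze,
    falling back to a default string when none are given."""
    return ' '.join(argv[1:]) if len(argv) > 1 else default

# In main(), the 'content' field of the document would then become:
#     'content': get_input_text(sys.argv)
print(get_input_text(sys.argv))
```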

Slack Team Helper to Perform Sentiment Analysis

We will now combine what we have learned so far into a Slack Team helper that performs sentiment analysis for us. We would like to provide functionality in a Slack channel where a user can ask for sentiment analysis to be done on text that they provide.
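As a sketch of how such a helper could be wired up, here is a minimal Slack slash-command endpoint using only the Python standard library. The route, handler names and the get_sentiment stub are assumptions for illustration: the stub would be wired to the analyzeSentiment call from the previous section, and a real deployment would also verify Slack's request token.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs
import json

def get_sentiment(text):
    """Placeholder: wire this to service.documents().analyzeSentiment(...)
    from the sample program; it should return (polarity, magnitude)."""
    return 0.0, 0.0

def format_sentiment_reply(text, polarity, magnitude):
    """Build the message text posted back into the Slack channel."""
    return ('Sentiment for "%s": polarity %.1f, magnitude %.1f'
            % (text, polarity, magnitude))

class SlackSentimentHandler(BaseHTTPRequestHandler):
    """Handles Slack slash-command POSTs, which arrive form-encoded."""

    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        form = parse_qs(self.rfile.read(length).decode('utf-8'))
        text = form.get('text', [''])[0]
        polarity, magnitude = get_sentiment(text)
        payload = json.dumps({
            'response_type': 'in_channel',
            'text': format_sentiment_reply(text, polarity, magnitude),
        }).encode('utf-8')
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.end_headers()
        self.wfile.write(payload)

# To run the helper locally:
#     HTTPServer(('', 8080), SlackSentimentHandler).serve_forever()
```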

Be sure to read the next Natural Language Processing article: Performance Comparison of 10 Linguistic APIs for Entity Recognition