Classifying content

The maximum lifespan for a custom model is six months. You must create and train a new model to continue classifying content after that amount of time.

Web UI

  1. Open the AutoML Natural Language UI, select Launch app in the AutoML Text Classification box, and then click the lightbulb icon in the left navigation bar to display the available models.

    To view the models for a different project, select the project from the drop-down list in the upper right of the title bar.

  2. Click the row for the model you want to use to classify your content.

  3. Click the Predict tab just below the title bar.

  4. Enter the content you want to classify into the text box and click Predict.

The AutoML Natural Language UI does not show low-confidence predictions.

Command-line

After you have successfully trained a model using the AutoML API, call the predict method to classify content with that model.

  • Replace model-name with the full name of your model, taken from the response you received when you created the model. The full name has the format: projects/{project-id}/locations/us-central1/models/{model-id}
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json" \
  https://automl.googleapis.com/v1beta1/model-name:predict \
  -d '{
        "payload": {
          "textSnippet": {
            "content": "Google, headquartered in Mountain View, unveiled the new Android phone at the Consumer Electronic Show. Sundar Pichai said in his keynote that users love their new Android phones.",
            "mimeType": "text/plain"
          }
        }
      }'

You should see output similar to the following. Use the value of the score field to determine which labels apply to the document; higher values indicate greater confidence that a label applies.

{
  "payload": [
    {
      "displayName": "Technology",
      "classification": {
        "score": 0.8989502
      }
    },
    {
      "displayName": "Automobiles",
      "classification": {
        "score": 0.10098731
      }
    }
  ]
}
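
The API does not filter labels by score for you, so applying a cutoff is up to your application. The following sketch parses a saved copy of the response above and keeps only the labels that meet a threshold; the predict_response.json filename and the 0.5 cutoff are illustrative choices, not API defaults.

import json

# Load a saved copy of the JSON response shown above (hypothetical filename).
with open("predict_response.json") as response_file:
    response = json.load(response_file)

# Arbitrary example cutoff; tune it for your own labels and data.
THRESHOLD = 0.5

# Keep only the labels whose score meets the threshold.
matching_labels = [
    item["displayName"]
    for item in response["payload"]
    if item["classification"]["score"] >= THRESHOLD
]

print(matching_labels)  # ['Technology'] for the sample response above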

Python

Before you can run this code example, you must install the Python Client Libraries.

  • The model_full_id parameter is the full name of your model. For example: projects/434039606874/locations/us-central1/models/3745331181667467569.
# TODO(developer): Uncomment and set the following variables
# project_id = 'PROJECT_ID_HERE'
# compute_region = 'COMPUTE_REGION_HERE'
# model_id = 'MODEL_ID_HERE'
# file_path = '/local/path/to/file'

from google.cloud import automl_v1beta1 as automl

# Create client for the AutoML API, used below to build the model path.
automl_client = automl.AutoMlClient()

# Create client for prediction service.
prediction_client = automl.PredictionServiceClient()

# Get the full path of the model.
model_full_id = automl_client.model_path(
    project_id, compute_region, model_id
)

# Read the file content for prediction.
with open(file_path, "rb") as content_file:
    snippet = content_file.read()

# Set the payload by giving the content and type of the file.
payload = {"text_snippet": {"content": snippet, "mime_type": "text/plain"}}

# params contains additional domain-specific parameters.
# No additional parameters are currently supported.
params = {}
response = prediction_client.predict(model_full_id, payload, params)
print("Prediction results:")
for result in response.payload:
    print("Predicted class name: {}".format(result.display_name))
    print("Predicted class score: {}".format(result.classification.score))

Java

/**
 * Demonstrates using the AutoML client to classify the text content
 *
 * @param projectId the Id of the project.
 * @param computeRegion the Region name.
 * @param modelId the Id of the model which will be used for text classification.
 * @param filePath the Local text file path of the content to be classified.
 * @throws IOException on Input/Output errors.
 */
public static void predict(
    String projectId, String computeRegion, String modelId, String filePath) throws IOException {

  // Create client for prediction service.
  PredictionServiceClient predictionClient = PredictionServiceClient.create();

  // Get full path of model
  ModelName name = ModelName.of(projectId, computeRegion, modelId);

  // Read the file content for prediction.
  String content = new String(Files.readAllBytes(Paths.get(filePath)));

  // Set the payload by giving the content and type of the file.
  TextSnippet textSnippet =
      TextSnippet.newBuilder().setContent(content).setMimeType("text/plain").build();
  ExamplePayload payload = ExamplePayload.newBuilder().setTextSnippet(textSnippet).build();

  // params contains additional domain-specific parameters.
  // No additional parameters are currently supported.
  Map<String, String> params = new HashMap<>();
  PredictResponse response = predictionClient.predict(name, payload, params);

  System.out.println("Prediction results:");
  for (AnnotationPayload annotationPayload : response.getPayloadList()) {
    System.out.println("Predicted Class name :" + annotationPayload.getDisplayName());
    System.out.println(
        "Predicted Class Score :" + annotationPayload.getClassification().getScore());
  }
}

Node.js

  const automl = require(`@google-cloud/automl`);
  const fs = require(`fs`);

  // Create client for prediction service.
  const client = new automl.v1beta1.PredictionServiceClient();

  /**
   * TODO(developer): Uncomment the following line before running the sample.
   */
  // const projectId = `The GCLOUD_PROJECT string, e.g. "my-gcloud-project"`;
  // const computeRegion = `region-name, e.g. "us-central1"`;
  // const modelId = `id of the model, e.g. "3745331181667467569"`;
  // const filePath = `local text file path of content to be classified, e.g. "./resources/test.txt"`;

  // Get the full path of the model.
  const modelFullId = client.modelPath(projectId, computeRegion, modelId);

  // Read the file content for prediction.
  const snippet = fs.readFileSync(filePath, `utf8`);

  // Set the payload by giving the content and type of the file.
  const payload = {
    textSnippet: {
      content: snippet,
      mimeType: `text/plain`,
    },
  };

  // params contains additional domain-specific parameters.
  // No additional parameters are currently supported.
  // Note: the await call below must be made inside an async function.
  const [response] = await client.predict({
    name: modelFullId,
    payload: payload,
    params: {},
  });
  console.log(`Prediction results:`);
  response.payload.forEach(result => {
    console.log(`Predicted class name: ${result.displayName}`);
    console.log(`Predicted class score: ${result.classification.score}`);
  });
