Cloud AutoML: Node.js Client
🔔 The AutoML API Node.js Client is now available in Vertex AI. Please visit nodejs-aiplatform for the new Node.js Vertex AI client. Vertex AI is our next generation AI Platform, with many new features that are unavailable in the current platform. Migrate your resources to Vertex AI to get the latest machine learning features, simplify end-to-end journeys, and productionize models with MLOps.
Cloud AutoML API client for Node.js
A comprehensive list of changes in each version may be found in the CHANGELOG.
- Cloud AutoML Node.js Client API Reference
- Cloud AutoML Documentation
- github.com/googleapis/nodejs-automl
Read more about the client libraries for Cloud APIs, including the older Google APIs Client Libraries, in Client Libraries Explained.
Table of contents:
- Quickstart
- Samples
- Supported Node.js Versions
- Versioning
- Contributing
- License
Quickstart
Before you begin
- Select or create a Cloud Platform project.
- Enable billing for your project.
- Enable the Cloud AutoML API.
- Set up authentication with a service account so you can access the API from your local workstation (a minimal credentials example follows this list).
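The client libraries pick up credentials automatically through Application Default Credentials (the GOOGLE_APPLICATION_CREDENTIALS environment variable). As a minimal sketch, assuming you have downloaded a service account key file, you can also point the client at the key explicitly; the path below is a placeholder.

const automl = require('@google-cloud/automl');

// Explicitly point the client at a downloaded service account key file.
// '/path/to/service-account-key.json' is a placeholder; omitting this option
// falls back to Application Default Credentials.
const client = new automl.PredictionServiceClient({
  keyFilename: '/path/to/service-account-key.json',
});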
Installing the client library
npm install @google-cloud/automl
Using the client library
const automl = require('@google-cloud/automl');
const fs = require('fs');

// Create client for prediction service.
const client = new automl.PredictionServiceClient();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = `The GCLOUD_PROJECT string, e.g. "my-gcloud-project"`;
// const computeRegion = `region-name, e.g. "us-central1"`;
// const modelId = `id of the model, e.g. "ICN723541179344731436"`;
// const filePath = `local file path of the content to be classified, e.g. "./resources/flower.png"`;
// const scoreThreshold = `value between 0.0 and 1.0, e.g. "0.5"`;

async function predict() {
  // Get the full path of the model.
  const modelFullId = client.modelPath(projectId, computeRegion, modelId);

  // Read the file content for prediction.
  const content = fs.readFileSync(filePath, 'base64');

  const params = {};
  if (scoreThreshold) {
    params.score_threshold = scoreThreshold;
  }

  // Set the payload by giving the content and type of the file.
  const payload = {};
  payload.image = {imageBytes: content};

  // params carries additional domain-specific parameters;
  // here it is used to pass the optional score threshold.
  const [response] = await client.predict({
    name: modelFullId,
    payload: payload,
    params: params,
  });

  console.log('Prediction results:');
  response.payload.forEach(result => {
    console.log(`Predicted class name: ${result.displayName}`);
    console.log(`Predicted class score: ${result.classification.score}`);
  });
}

predict();
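The quickstart above covers prediction only. For managing resources such as datasets, models, and operations, the package also exports an AutoMlClient. Below is a minimal sketch of listing the models in a project; the project ID and region shown are placeholder values to substitute with your own.

const automl = require('@google-cloud/automl');

// AutoMlClient handles resource management (datasets, models, operations),
// while PredictionServiceClient handles prediction requests.
const client = new automl.v1.AutoMlClient();

async function listModels() {
  // Placeholder values -- replace with your own project and region.
  const projectId = 'my-gcloud-project';
  const computeRegion = 'us-central1';

  // The parent resource is the project location,
  // e.g. projects/my-gcloud-project/locations/us-central1.
  const parent = client.locationPath(projectId, computeRegion);

  // listModels resolves to the list of Model resources in that location.
  const [models] = await client.listModels({parent: parent});
  models.forEach(model => {
    console.log(`Model name: ${model.name}`);
    console.log(`Model display name: ${model.displayName}`);
  });
}

listModels();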
Samples
Samples are in the samples/ directory. Each sample's README.md has instructions for running its sample.
Sample | Source Code |
---|---|
Batch_predict | source code |
Delete_dataset | source code |
Delete_model | source code |
Deploy_model | source code |
Export_dataset | source code |
Get_dataset | source code |
Get_model | source code |
Get_model_evaluation | source code |
Get_operation_status | source code |
Import_dataset | source code |
Language_entity_extraction_create_dataset | source code |
Language_entity_extraction_create_model | source code |
Language_entity_extraction_predict | source code |
Language_sentiment_analysis_create_dataset | source code |
Language_sentiment_analysis_create_model | source code |
Language_sentiment_analysis_predict | source code |
Language_text_classification_create_dataset | source code |
Language_text_classification_create_model | source code |
Language_text_classification_predict | source code |
List_datasets | source code |
List_model_evaluations | source code |
List_models | source code |
List_operation_status | source code |
Quickstart | source code |
Translate_create_dataset | source code |
Translate_create_model | source code |
Translate_predict | source code |
Undeploy_model | source code |
Vision_classification_create_dataset | source code |
Vision_classification_create_model | source code |
Vision_classification_deploy_model_node_count | source code |
Vision_classification_predict | source code |
Vision_object_detection_create_dataset | source code |
Vision_object_detection_create_model | source code |
Vision_object_detection_deploy_model_node_count | source code |
Vision_object_detection_predict | source code |
The Cloud AutoML Node.js Client API Reference documentation also contains samples.
Supported Node.js Versions
Our client libraries follow the Node.js release schedule. Libraries are compatible with all current active and maintenance versions of Node.js. If you are using an end-of-life version of Node.js, we recommend that you update as soon as possible to an actively supported LTS version.
Google's client libraries support legacy versions of Node.js runtimes on a best-efforts basis with the following warnings:
- Legacy versions are not tested in continuous integration.
- Some security patches and features cannot be backported.
- Dependencies cannot be kept up-to-date.
Client libraries targeting some end-of-life versions of Node.js are available, and can be installed through npm dist-tags. The dist-tags follow the naming convention legacy-(version). For example, npm install @google-cloud/automl@legacy-8 installs client libraries for versions compatible with Node.js 8.
Versioning
This library follows Semantic Versioning.
This library is considered to be stable. The code surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against stable libraries are addressed with the highest priority.
More Information: Google Cloud Platform Launch Stages
Contributing
Contributions welcome! See the Contributing Guide.
Please note that this README.md, the samples/README.md, and a variety of configuration files in this repository (including .nycrc and tsconfig.json) are generated from a central template. To edit one of these files, make an edit to its template in the central templates directory.
License
Apache Version 2.0
See LICENSE