Train a text entity extraction model

This page shows you how to train an AutoML entity extraction model from a text dataset by using the Google Cloud console or the Vertex AI API.

Before you begin

Before you can train a text entity extraction model, you must complete the following steps:

Train an AutoML model

Google Cloud console

  1. In the Google Cloud console, in the Vertex AI section, go to the Datasets page.

    Go to the Datasets page

  2. Click the name of the dataset that you want to use to train your model to open its details page.

  3. Select the annotation set that you want to use for this model.

  4. Click Train new model.

  5. For the training method, select AutoML.

  6. Click Continue.

  7. Enter a name for the model.

  8. If you want to manually set how your training data is split, expand Advanced options and select a data split option. Learn more.

  9. Click Start Training.

    Model training can take many hours, depending on the size and complexity of your data and your training budget, if you specified one. You can close this tab and return to it later. You will receive an email when your model has finished training.

API

Select a tab for your language or environment:

REST

Create a TrainingPipeline object to train a model.

Before using any of the request data, make the following replacements:

  • LOCATION: The region where the model will be created, such as us-central1
  • PROJECT: Your project ID
  • MODEL_DISPLAY_NAME: A name for the model, as it appears in the user interface
  • DATASET_ID: The ID of the dataset
  • PROJECT_NUMBER: Your project's automatically generated project number

HTTP method and URL:

POST https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT/locations/LOCATION/trainingPipelines

Request JSON body:

{
  "displayName": "MODEL_DISPLAY_NAME",
  "trainingTaskDefinition": "gs://google-cloud-aiplatform/schema/trainingjob/definition/automl_text_extraction_1.0.0.yaml",
  "modelToUpload": {
    "displayName": "MODEL_DISPLAY_NAME"
  },
  "inputDataConfig": {
    "datasetId": "DATASET_ID"
  }
}

To send your request, use curl, a client library, or any HTTP tool that can make an authenticated POST request.
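
As one hedged option, not shown on this page, the following minimal Python sketch sends the request. It assumes Application Default Credentials are set up and the google-auth and requests packages are installed; the placeholder values correspond to the replacements listed above.

# A minimal sketch, assuming Application Default Credentials are configured.
# Replace the placeholder values before running.
import google.auth
import google.auth.transport.requests
import requests

LOCATION = "us-central1"        # region where the model will be created
PROJECT = "your-project-id"     # your project ID
DATASET_ID = "1234567890"       # hypothetical dataset ID
MODEL_DISPLAY_NAME = "my-entity-extraction-model"

# Obtain an access token from Application Default Credentials.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
credentials.refresh(google.auth.transport.requests.Request())

url = (
    f"https://{LOCATION}-aiplatform.googleapis.com/v1/"
    f"projects/{PROJECT}/locations/{LOCATION}/trainingPipelines"
)
body = {
    "displayName": MODEL_DISPLAY_NAME,
    "trainingTaskDefinition": "gs://google-cloud-aiplatform/schema/trainingjob/definition/automl_text_extraction_1.0.0.yaml",
    "modelToUpload": {"displayName": MODEL_DISPLAY_NAME},
    "inputDataConfig": {"datasetId": DATASET_ID},
}

response = requests.post(
    url,
    headers={"Authorization": f"Bearer {credentials.token}"},
    json=body,
)
response.raise_for_status()
print(response.json())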

You should receive a JSON response similar to the following:

{
  "name": "projects/PROJECT_NUMBER/locations/us-central1/trainingPipelines/PIPELINE_ID",
  "displayName": "MODEL_DISPLAY_NAME",
  "inputDataConfig": {
    "datasetId": "DATASET_ID"
  },
  "trainingTaskDefinition": "gs://google-cloud-aiplatform/schema/trainingjob/definition/automl_text_extraction_1.0.0.yaml",
  "modelToUpload": {
    "displayName": "MODEL_DISPLAY_NAME"
  },
  "state": "PIPELINE_STATE_PENDING",
  "createTime": "2020-04-18T01:22:57.479336Z",
  "updateTime": "2020-04-18T01:22:57.479336Z"
}
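
The state field begins at PIPELINE_STATE_PENDING and ends at a terminal state such as PIPELINE_STATE_SUCCEEDED. As a minimal sketch, assuming the google-cloud-aiplatform package is installed, you could poll the pipeline by the name returned in the response:

# A minimal polling sketch; PIPELINE_NAME is the "name" value from the
# response above.
import time

from google.cloud import aiplatform_v1

PIPELINE_NAME = (
    "projects/PROJECT_NUMBER/locations/us-central1/trainingPipelines/PIPELINE_ID"
)

client = aiplatform_v1.PipelineServiceClient(
    client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
)

# Poll until the pipeline reaches a terminal state.
while True:
    pipeline = client.get_training_pipeline(name=PIPELINE_NAME)
    print(pipeline.state.name)
    if pipeline.state.name in (
        "PIPELINE_STATE_SUCCEEDED",
        "PIPELINE_STATE_FAILED",
        "PIPELINE_STATE_CANCELLED",
    ):
        break
    time.sleep(60)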

Java

Before trying this sample, follow the Java setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Java API reference documentation.

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.aiplatform.util.ValueConverter;
import com.google.cloud.aiplatform.v1.DeployedModelRef;
import com.google.cloud.aiplatform.v1.EnvVar;
import com.google.cloud.aiplatform.v1.FilterSplit;
import com.google.cloud.aiplatform.v1.FractionSplit;
import com.google.cloud.aiplatform.v1.InputDataConfig;
import com.google.cloud.aiplatform.v1.LocationName;
import com.google.cloud.aiplatform.v1.Model;
import com.google.cloud.aiplatform.v1.Model.ExportFormat;
import com.google.cloud.aiplatform.v1.ModelContainerSpec;
import com.google.cloud.aiplatform.v1.PipelineServiceClient;
import com.google.cloud.aiplatform.v1.PipelineServiceSettings;
import com.google.cloud.aiplatform.v1.Port;
import com.google.cloud.aiplatform.v1.PredefinedSplit;
import com.google.cloud.aiplatform.v1.PredictSchemata;
import com.google.cloud.aiplatform.v1.TimestampSplit;
import com.google.cloud.aiplatform.v1.TrainingPipeline;
import com.google.rpc.Status;
import java.io.IOException;

public class CreateTrainingPipelineTextEntityExtractionSample {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    String trainingPipelineDisplayName = "YOUR_TRAINING_PIPELINE_DISPLAY_NAME";
    String project = "YOUR_PROJECT_ID";
    String datasetId = "YOUR_DATASET_ID";
    String modelDisplayName = "YOUR_MODEL_DISPLAY_NAME";

    createTrainingPipelineTextEntityExtractionSample(
        project, trainingPipelineDisplayName, datasetId, modelDisplayName);
  }

  static void createTrainingPipelineTextEntityExtractionSample(
      String project, String trainingPipelineDisplayName, String datasetId, String modelDisplayName)
      throws IOException {
    PipelineServiceSettings pipelineServiceSettings =
        PipelineServiceSettings.newBuilder()
            .setEndpoint("us-central1-aiplatform.googleapis.com:443")
            .build();

    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (PipelineServiceClient pipelineServiceClient =
        PipelineServiceClient.create(pipelineServiceSettings)) {
      String location = "us-central1";
      String trainingTaskDefinition =
          "gs://google-cloud-aiplatform/schema/trainingjob/definition/"
              + "automl_text_extraction_1.0.0.yaml";

      LocationName locationName = LocationName.of(project, location);

      InputDataConfig trainingInputDataConfig =
          InputDataConfig.newBuilder().setDatasetId(datasetId).build();
      Model model = Model.newBuilder().setDisplayName(modelDisplayName).build();
      TrainingPipeline trainingPipeline =
          TrainingPipeline.newBuilder()
              .setDisplayName(trainingPipelineDisplayName)
              .setTrainingTaskDefinition(trainingTaskDefinition)
              .setTrainingTaskInputs(ValueConverter.EMPTY_VALUE)
              .setInputDataConfig(trainingInputDataConfig)
              .setModelToUpload(model)
              .build();

      TrainingPipeline trainingPipelineResponse =
          pipelineServiceClient.createTrainingPipeline(locationName, trainingPipeline);

      System.out.println("Create Training Pipeline Text Entity Extraction Response");
      System.out.format("\tName: %s\n", trainingPipelineResponse.getName());
      System.out.format("\tDisplay Name: %s\n", trainingPipelineResponse.getDisplayName());

      System.out.format(
          "\tTraining Task Definition: %s\n", trainingPipelineResponse.getTrainingTaskDefinition());
      System.out.format(
          "\tTraining Task Inputs: %s\n", trainingPipelineResponse.getTrainingTaskInputs());
      System.out.format(
          "\tTraining Task Metadata: %s\n", trainingPipelineResponse.getTrainingTaskMetadata());
      System.out.format("State: %s\n", trainingPipelineResponse.getState());

      System.out.format("\tCreate Time: %s\n", trainingPipelineResponse.getCreateTime());
      System.out.format("\tStartTime %s\n", trainingPipelineResponse.getStartTime());
      System.out.format("\tEnd Time: %s\n", trainingPipelineResponse.getEndTime());
      System.out.format("\tUpdate Time: %s\n", trainingPipelineResponse.getUpdateTime());
      System.out.format("\tLabels: %s\n", trainingPipelineResponse.getLabelsMap());

      InputDataConfig inputDataConfig = trainingPipelineResponse.getInputDataConfig();
      System.out.println("\tInput Data Config");
      System.out.format("\t\tDataset Id: %s", inputDataConfig.getDatasetId());
      System.out.format("\t\tAnnotations Filter: %s\n", inputDataConfig.getAnnotationsFilter());

      FractionSplit fractionSplit = inputDataConfig.getFractionSplit();
      System.out.println("\t\tFraction Split");
      System.out.format("\t\t\tTraining Fraction: %s\n", fractionSplit.getTrainingFraction());
      System.out.format("\t\t\tValidation Fraction: %s\n", fractionSplit.getValidationFraction());
      System.out.format("\t\t\tTest Fraction: %s\n", fractionSplit.getTestFraction());

      FilterSplit filterSplit = inputDataConfig.getFilterSplit();
      System.out.println("\t\tFilter Split");
      System.out.format("\t\t\tTraining Filter: %s\n", filterSplit.getTrainingFilter());
      System.out.format("\t\t\tValidation Filter: %s\n", filterSplit.getValidationFilter());
      System.out.format("\t\t\tTest Filter: %s\n", filterSplit.getTestFilter());

      PredefinedSplit predefinedSplit = inputDataConfig.getPredefinedSplit();
      System.out.println("\t\tPredefined Split");
      System.out.format("\t\t\tKey: %s\n", predefinedSplit.getKey());

      TimestampSplit timestampSplit = inputDataConfig.getTimestampSplit();
      System.out.println("\t\tTimestamp Split");
      System.out.format("\t\t\tTraining Fraction: %s\n", timestampSplit.getTrainingFraction());
      System.out.format("\t\t\tValidation Fraction: %s\n", timestampSplit.getValidationFraction());
      System.out.format("\t\t\tTest Fraction: %s\n", timestampSplit.getTestFraction());
      System.out.format("\t\t\tKey: %s\n", timestampSplit.getKey());

      Model modelResponse = trainingPipelineResponse.getModelToUpload();
      System.out.println("\tModel To Upload");
      System.out.format("\t\tName: %s\n", modelResponse.getName());
      System.out.format("\t\tDisplay Name: %s\n", modelResponse.getDisplayName());
      System.out.format("\t\tDescription: %s\n", modelResponse.getDescription());

      System.out.format("\t\tMetadata Schema Uri: %s\n", modelResponse.getMetadataSchemaUri());
      System.out.format("\t\tMetadata: %s\n", modelResponse.getMetadata());
      System.out.format("\t\tTraining Pipeline: %s\n", modelResponse.getTrainingPipeline());
      System.out.format("\t\tArtifact Uri: %s\n", modelResponse.getArtifactUri());

      System.out.format(
          "\t\tSupported Deployment Resources Types: %s\n",
          modelResponse.getSupportedDeploymentResourcesTypesList());
      System.out.format(
          "\t\tSupported Input Storage Formats: %s\n",
          modelResponse.getSupportedInputStorageFormatsList());
      System.out.format(
          "\t\tSupported Output Storage Formats: %s\n",
          modelResponse.getSupportedOutputStorageFormatsList());

      System.out.format("\t\tCreate Time: %s\n", modelResponse.getCreateTime());
      System.out.format("\t\tUpdate Time: %s\n", modelResponse.getUpdateTime());
      System.out.format("\t\tLabels: %sn\n", modelResponse.getLabelsMap());

      PredictSchemata predictSchemata = modelResponse.getPredictSchemata();
      System.out.println("\t\tPredict Schemata");
      System.out.format("\t\t\tInstance Schema Uri: %s\n", predictSchemata.getInstanceSchemaUri());
      System.out.format(
          "\t\t\tParameters Schema Uri: %s\n", predictSchemata.getParametersSchemaUri());
      System.out.format(
          "\t\t\tPrediction Schema Uri: %s\n", predictSchemata.getPredictionSchemaUri());

      for (ExportFormat exportFormat : modelResponse.getSupportedExportFormatsList()) {
        System.out.println("\t\tSupported Export Format");
        System.out.format("\t\t\tId: %s\n", exportFormat.getId());
      }

      ModelContainerSpec modelContainerSpec = modelResponse.getContainerSpec();
      System.out.println("\t\tContainer Spec");
      System.out.format("\t\t\tImage Uri: %s\n", modelContainerSpec.getImageUri());
      System.out.format("\t\t\tCommand: %s\n", modelContainerSpec.getCommandList());
      System.out.format("\t\t\tArgs: %s\n", modelContainerSpec.getArgsList());
      System.out.format("\t\t\tPredict Route: %s\n", modelContainerSpec.getPredictRoute());
      System.out.format("\t\t\tHealth Route: %s\n", modelContainerSpec.getHealthRoute());

      for (EnvVar envVar : modelContainerSpec.getEnvList()) {
        System.out.println("\t\t\tEnv");
        System.out.format("\t\t\t\tName: %s\n", envVar.getName());
        System.out.format("\t\t\t\tValue: %s\n", envVar.getValue());
      }

      for (Port port : modelContainerSpec.getPortsList()) {
        System.out.println("\t\t\tPort");
        System.out.format("\t\t\t\tContainer Port: %s\n", port.getContainerPort());
      }

      for (DeployedModelRef deployedModelRef : modelResponse.getDeployedModelsList()) {
        System.out.println("\t\tDeployed Model");
        System.out.format("\t\t\tEndpoint: %s\n", deployedModelRef.getEndpoint());
        System.out.format("\t\t\tDeployed Model Id: %s\n", deployedModelRef.getDeployedModelId());
      }

      Status status = trainingPipelineResponse.getError();
      System.out.println("\tError");
      System.out.format("\t\tCode: %s\n", status.getCode());
      System.out.format("\t\tMessage: %s\n", status.getMessage());
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Node.js API reference documentation.

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

/**
 * TODO(developer): Uncomment these variables before running the sample.
 * (Not necessary if passing values as arguments)
 */

// const datasetId = 'YOUR_DATASET_ID';
// const modelDisplayName = 'YOUR_MODEL_DISPLAY_NAME';
// const trainingPipelineDisplayName = 'YOUR_TRAINING_PIPELINE_DISPLAY_NAME';
// const project = 'YOUR_PROJECT_ID';
// const location = 'YOUR_PROJECT_LOCATION';
const aiplatform = require('@google-cloud/aiplatform');
const {definition} =
  aiplatform.protos.google.cloud.aiplatform.v1.schema.trainingjob;

// Imports the Google Cloud Pipeline Service Client library
const {PipelineServiceClient} = aiplatform.v1;

// Specifies the location of the api endpoint
const clientOptions = {
  apiEndpoint: 'us-central1-aiplatform.googleapis.com',
};

// Instantiates a client
const pipelineServiceClient = new PipelineServiceClient(clientOptions);

async function createTrainingPipelineTextEntityExtraction() {
  // Configure the parent resource
  const parent = `projects/${project}/locations/${location}`;

  const trainingTaskInputObj = new definition.AutoMlTextExtractionInputs({});
  const trainingTaskInputs = trainingTaskInputObj.toValue();

  const modelToUpload = {displayName: modelDisplayName};
  const inputDataConfig = {datasetId: datasetId};
  const trainingPipeline = {
    displayName: trainingPipelineDisplayName,
    trainingTaskDefinition:
      'gs://google-cloud-aiplatform/schema/trainingjob/definition/automl_text_extraction_1.0.0.yaml',
    trainingTaskInputs,
    inputDataConfig,
    modelToUpload,
  };
  const request = {
    parent,
    trainingPipeline,
  };

  // Create training pipeline request
  const [response] =
    await pipelineServiceClient.createTrainingPipeline(request);

  console.log('Create training pipeline text entity extraction response :');
  console.log(`Name : ${response.name}`);
  console.log('Raw response:');
  console.log(JSON.stringify(response, null, 2));
}
createTrainingPipelineTextEntityExtraction();

Python

To learn how to install or update the Vertex AI SDK for Python, see Install the Vertex AI SDK for Python. For more information, see the Python API reference documentation.

from typing import Optional

from google.cloud import aiplatform


def create_training_pipeline_text_entity_extraction_sample(
    project: str,
    location: str,
    display_name: str,
    dataset_id: str,
    model_display_name: Optional[str] = None,
    training_fraction_split: float = 0.8,
    validation_fraction_split: float = 0.1,
    test_fraction_split: float = 0.1,
    budget_milli_node_hours: int = 8000,
    disable_early_stopping: bool = False,
    sync: bool = True,
):
    aiplatform.init(project=project, location=location)

    job = aiplatform.AutoMLTextTrainingJob(
        display_name=display_name, prediction_type="extraction"
    )

    text_dataset = aiplatform.TextDataset(dataset_id)

    model = job.run(
        dataset=text_dataset,
        model_display_name=model_display_name,
        training_fraction_split=training_fraction_split,
        validation_fraction_split=validation_fraction_split,
        test_fraction_split=test_fraction_split,
        budget_milli_node_hours=budget_milli_node_hours,
        disable_early_stopping=disable_early_stopping,
        sync=sync,
    )

    model.wait()

    print(model.display_name)
    print(model.resource_name)
    print(model.uri)
    return model
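
A hedged usage sketch for this function; the project, region, and dataset values below are hypothetical placeholders:

# Hypothetical values; replace with your own project, region, and dataset ID.
model = create_training_pipeline_text_entity_extraction_sample(
    project="your-project-id",
    location="us-central1",
    display_name="entity-extraction-training-job",
    dataset_id="1234567890",
    model_display_name="entity-extraction-model",
)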

Control the data split using REST

You can control how your training data is split between the training, validation, and test sets. When you use the Vertex AI API, use the Split object to determine your data split. The Split object can be included in the InputConfig object as one of several object types, each of which provides a different way to split the training data. You can select only one method. A minimal Python sketch after the list below shows how these splits map to the client library types.

  • FractionSplit:
    • TRAINING_FRACTION: The fraction of the training data to use for the training set.
    • VALIDATION_FRACTION: The fraction of the training data to use for the validation set. Not used for video data.
    • TEST_FRACTION: The fraction of the training data to use for the test set.

    If you specify one of the fractions, you must specify all of them. The fractions must sum to 1.0. The default values for the fractions differ depending on your data type. Learn more.

    "fractionSplit": {
      "trainingFraction": TRAINING_FRACTION,
      "validationFraction": VALIDATION_FRACTION,
      "testFraction": TEST_FRACTION
    },
    
  • FilterSplit:
    • TRAINING_FILTER: Data items that match this filter are used for the training set.
    • VALIDATION_FILTER: Data items that match this filter are used for the validation set. Must be "-" for video data.
    • TEST_FILTER: Data items that match this filter are used for the test set.

    These filters can be used with the ml_use label, or with any other labels you apply to your data. Learn more about using the ml_use label and other labels to filter your data.

    The following example shows how to use the filterSplit object with the ml_use label, including the validation set:

    "filterSplit": {
    "trainingFilter": "labels.aiplatform.googleapis.com/ml_use=training",
    "validationFilter": "labels.aiplatform.googleapis.com/ml_use=validation",
    "testFilter": "labels.aiplatform.googleapis.com/ml_use=test"
    }
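
As referenced above, here is a minimal Python sketch, an assumption rather than part of this page, that constructs the same split objects with the Vertex AI client library types; the dataset ID is a hypothetical placeholder:

# A minimal sketch using google-cloud-aiplatform proto types. Attach exactly
# one split type to the InputDataConfig.
from google.cloud import aiplatform_v1

# Fraction-based split: fractions must sum to 1.0.
fraction_split = aiplatform_v1.FractionSplit(
    training_fraction=0.8,
    validation_fraction=0.1,
    test_fraction=0.1,
)

# Filter-based split using the ml_use label, mirroring the REST example above.
filter_split = aiplatform_v1.FilterSplit(
    training_filter="labels.aiplatform.googleapis.com/ml_use=training",
    validation_filter="labels.aiplatform.googleapis.com/ml_use=validation",
    test_filter="labels.aiplatform.googleapis.com/ml_use=test",
)

# Use one split type only, for example the fraction split:
input_data_config = aiplatform_v1.InputDataConfig(
    dataset_id="1234567890",  # hypothetical dataset ID
    fraction_split=fraction_split,
)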