Last updated: 2025-09-03 (UTC).

# Mainframe Connector reference

Set up Cloud Logging
--------------------

Mainframe Connector can send JSON-formatted log messages containing
context information to Cloud Logging. The context includes the job name, job ID,
job date, step name, and other variables provided by z/OS.

This context helps you find logs for specific jobs and create alerts. Additionally,
when you deploy Mainframe Connector on Cloud Run,
Google Kubernetes Engine, or Compute Engine, the logs are collected by the
Cloud Logging agent and appear in Logs Explorer.

To configure Mainframe Connector to write to Cloud Logging, set the
`LOG_PROJECT` and `LOG_ID` environment variables in the
JCL that launches the job. For example, `LOG_PROJECT=mainframe-connector-proj`
and `LOG_ID=my-mfc`. The log name `LOG_NAME` is derived
from `LOG_PROJECT` and `LOG_ID`. In this example, it is
`projects/mainframe-connector-proj/logs/my-mfc`.

The resource type is always *global* because the log is a user-created log
rather than a log from a cloud service. During startup,
Mainframe Connector displays a message indicating whether
Cloud Logging is configured.

Enable load statistics
----------------------

The load statistics feature logs every command that you execute using
Mainframe Connector in a SQL table. To enable the load statistics
feature, create a table using the following command and add the
flag `--stats_table `<var translate="no">TABLE_NAME</var> to the `cp`
command, where <var translate="no">TABLE_NAME</var> is the name of the SQL table.

    CREATE TABLE
      `[PROJECT_ID].[DATASET_NAME].[TABLE_NAME]` (
        timestamp TIMESTAMP,
        job_id STRING,
        job_name STRING,
        job_date DATE,
        job_time TIME,
        job_step_name STRING,
        job_type STRING,
        source STRING,
        destination STRING,
        job_json STRING,
        rows_read INT64,
        rows_written INT64,
        rows_affected INT64,
        rows_inserted INT64,
        rows_deleted INT64,
        rows_updated INT64,
        rows_unmodified INT64,
        rows_before_merge INT64,
        rows_loaded INT64,
        bq_job_id STRING,
        bq_job_project STRING,
        bq_job_location STRING,
        statement_type STRING,
        query STRING,
        execution_ms INT64,
        queued_ms INT64,
        bytes_processed INT64,
        slot_ms INT64,
        slot_utilization_rate FLOAT64,
        slot_ms_to_total_bytes_ratio FLOAT64,
        shuffle_bytes FLOAT64,
        shuffle_bytes_to_total_bytes_ratio FLOAT64,
        shuffle_spill_bytes FLOAT64,
        shuffle_spill_bytes_to_shuffle_bytes_ratio FLOAT64,
        shuffle_spill_bytes_to_total_bytes_ratio FLOAT64,
        shuffle_spill_gb FLOAT64,
        bq_stage_count INT64,
        bq_step_count INT64,
        bq_sub_step_count INT64,
        bq_stage_summary STRING)
      PARTITION BY job_date
      CLUSTER BY job_name, job_id, job_step_name
      OPTIONS (
        partition_expiration_days=1000,
        description="Log table for mainframe jobs",
        require_partition_filter=true)

Replace the following:

- <var translate="no">PROJECT_ID</var>: the ID of the project in which you want to execute the command.
- <var translate="no">DATASET_NAME</var>: the name of the dataset that contains the table.
- <var translate="no">TABLE_NAME</var>: the name of the SQL table in which you want to log the details.

Dataset names
-------------

You can use the following [dataset definition (DD) files](https://www.ibm.com/docs/en/zos-basic-skills?topic=concepts-jcl-statements-what-does-dd-statement-do) in your BQSH JCL
procedure.
Ensure that all [MVS datasets](https://www.ibm.com/docs/en/zos/3.1.0?topic=tsoe-mvs-data-sets)
referenced by a DD file use the [fixed block (FB)](https://www.ibm.com/docs/en/zos-basic-skills?topic=set-data-record-formats) record format.
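
As an illustrative sketch only (the step name, dataset, bucket, project, and
table names below are placeholders, and your site's JCL conventions may
differ), a BQSH step could set the Cloud Logging environment variables and
pass the `--stats_table` flag to the `cp` command like this:

```
//STEP01  EXEC BQSH
//INFILE  DD DSN=HLQ.SAMPLE.DATA,DISP=SHR
//STDIN   DD *
LOG_PROJECT=mainframe-connector-proj
LOG_ID=my-mfc
gsutil cp --replace gs://my-bucket/sample.orc \
  --stats_table my-project.my_dataset.mfc_log
/*
```

With this sketch, the job's log entries would be written to
`projects/mainframe-connector-proj/logs/my-mfc`, and each executed command
would be recorded in the hypothetical `mfc_log` statistics table created
with the `CREATE TABLE` statement above.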