gcloud alpha genomics pipelines run

NAME
gcloud alpha genomics pipelines run - defines and runs a pipeline
SYNOPSIS
gcloud alpha genomics pipelines run [--command-line=COMMAND_LINE] [--cpus=CPUS] [--disk-size=DISK_SIZE] [--docker-image=DOCKER_IMAGE; default="google/cloud-sdk:slim"] [--env-vars=[NAME=VALUE,…]] [--inputs=[NAME=VALUE,…]] [--inputs-from-file=[NAME=FILE,…]] [--logging=LOGGING] [--memory=MEMORY] [--outputs=[NAME=VALUE,…]] [--preemptible] [--boot-disk-size=BOOT_DISK_SIZE] [--labels=[KEY=VALUE,…]] [--network=NETWORK] [--pipeline-file=PIPELINE_FILE] [--regions=[REGION,…]] [--service-account-email=SERVICE_ACCOUNT_EMAIL; default="default"] [--service-account-scopes=[SCOPE,…]] [--subnetwork=SUBNETWORK] [--zones=[ZONE,…]] [GCLOUD_WIDE_FLAG]
DESCRIPTION
(ALPHA) A pipeline is a transformation of a set of inputs to a set of outputs. Supports docker-based commands.
COMMONLY USED FLAGS
--command-line=COMMAND_LINE
Command line to run with /bin/sh in the specified docker image. Cannot be used with --pipeline-file.
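For instance, a minimal run that echoes a message with the default image might look like the following (the bucket name and region are placeholders):
  $ gcloud alpha genomics pipelines run \
      --command-line='echo "hello world"' \
      --regions=us-central1 \
      --logging=gs://my-bucket/logs/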
--cpus=CPUS
The minimum number of CPUs to run the pipeline. Overrides any value specified in the pipeline-file.
--disk-size=DISK_SIZE
The disk size(s) in GB, specified as a comma-separated list of pairs of disk name and size. For example: --disk-size "name:size,name2:size2". Overrides any values specified in the pipeline-file.
--docker-image=DOCKER_IMAGE; default="google/cloud-sdk:slim"
A docker image to run. Requires --command-line to be specified and cannot be used with --pipeline-file.
--env-vars=[NAME=VALUE,…]
List of key-value pairs to set as environment variables.
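For example, to make two variables visible to the command (the names and values here are arbitrary):
  --env-vars=MODE=test,THREADS=4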
--inputs=[NAME=VALUE,…]
Map of input PipelineParameter names to values. Used to pass literal parameters to the pipeline, and to specify input files in Google Cloud Storage that will have a localCopy made. Specified as a comma-separated list: --inputs file=gs://my-bucket/in.txt,name=hello
--inputs-from-file=[NAME=FILE,…]
Map of input PipelineParameter names to values. Used to pass literal parameters to the pipeline where values come from local files; this can be used to send large pipeline input parameters, such as code, data, or configuration values. Specified as a comma-separated list: --inputs-from-file script=myshellscript.sh,pyfile=mypython.py
--logging=LOGGING
The location in Google Cloud Storage to which the pipeline logs will be copied. Can be specified as a fully qualified directory path, in which case logs will be output with a unique identifier as the filename in that directory, or as a fully specified path, which must end in .log, in which case that path will be used. Stdout and stderr logs from the run are also generated and output as -stdout.log and -stderr.log.
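For example, both of the following forms are valid (the bucket name is a placeholder):
  --logging=gs://my-bucket/pipeline-logs/
  --logging=gs://my-bucket/pipeline-logs/run-42.log
The first form writes a uniquely named log file into the directory; the second writes to run-42.log, with the corresponding -stdout.log and -stderr.log files alongside it.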
--memory=MEMORY
The number of GB of RAM needed to run the pipeline. Overrides any value specified in the pipeline-file.
--outputs=[NAME=VALUE,…]
Map of output PipelineParameter names to values. Used to specify output files in Google Cloud Storage that will be made from a localCopy. Specified as a comma-separated list: --outputs ref=gs://my-bucket/foo,ref2=gs://my-bucket/bar
--preemptible
Whether to use a preemptible VM for this pipeline. The "resource" section of the pipeline-file must also set preemptible to "true" for this flag to take effect.
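For example, assuming pipeline.yaml already marks the VM as preemptible in its resources section:
  $ gcloud alpha genomics pipelines run \
      --pipeline-file=pipeline.yaml \
      --preemptible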
OTHER FLAGS
--boot-disk-size=BOOT_DISK_SIZE
The size of the boot disk in GB.

The boot disk size must be large enough to accommodate all Docker images from each action in the pipeline at the same time. If not specified, a small but reasonable default value is used.

--labels=[KEY=VALUE,…]
List of label KEY=VALUE pairs to add.

Keys must start with a lowercase character and contain only hyphens (-), underscores (_), lowercase characters, and numbers. Values must contain only hyphens (-), underscores (_), lowercase characters, and numbers.
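For example, two labels that satisfy these rules:
  --labels=env=test,run-owner=analyst-1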

--network=NETWORK
The network name to attach the VM's network interface to.

The value will be prefixed with global/networks/ unless it contains a /, in which case it is assumed to be a fully specified network resource URL.

If unspecified, the global default network is used.
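For example, either of the following is accepted (my-network and my-project are placeholders):
  --network=my-network
  --network=projects/my-project/global/networks/my-network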

--pipeline-file=PIPELINE_FILE
A YAML or JSON file containing a v2alpha1 Pipeline object. See https://cloud.google.com/genomics/reference/rest/v2alpha1/pipelines#Pipeline
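A minimal sketch of such a file, assuming the layout described in the linked reference (the image, command, region, and machine type below are placeholders):
  actions:
  - imageUri: ubuntu
    commands: ['/bin/sh', '-c', 'echo "hello from the pipeline"']
  resources:
    regions:
    - us-central1
    virtualMachine:
      machineType: n1-standard-1
Such a file could then be passed as, for example, --pipeline-file=pipeline.yaml together with a --logging destination.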
--regions=[REGION,…]
List of Compute Engine regions the pipeline can run in.

If no regions are specified with the --regions flag, then the regions in the pipeline definition file will be used.

If no regions are specified in the pipeline definition, then the default region in your local client configuration is used.

If you have no default region, then at least one region or zone must be specified.

For more information on default regions, see https://cloud.google.com/compute/docs/gcloud-compute/#set_default_zone_and_region_in_your_local_client
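For example, to allow the pipeline to run in either of two regions:
  --regions=us-central1,us-east1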

--service-account-email=SERVICE_ACCOUNT_EMAIL; default="default"
The service account used to run the pipeline. If unspecified, defaults to the Compute Engine service account for your project.
--service-account-scopes=[SCOPE,…]
List of additional scopes to be made available for this service account. The following scopes are always requested:
https://www.googleapis.com/auth/devstorage.read_write
https://www.googleapis.com/auth/genomics
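For example, to additionally request BigQuery access (the scope alone is not enough; the service account also needs the matching IAM permissions):
  --service-account-scopes=https://www.googleapis.com/auth/bigquery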
--subnetwork=SUBNETWORK
The subnetwork to use on the provided network.

If the specified network is configured for custom subnet creation, the name of the subnetwork to attach the instance to must be specified here.

The value is prefixed with regions/*/subnetworks/ unless it contains a /, in which case it is assumed to be a fully specified subnetwork resource URL.

If the * character appears in the value, it is replaced with the region that the virtual machine has been allocated in.
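For example, either of the following is accepted for a custom-mode network (my-subnet and my-project are placeholders):
  --subnetwork=my-subnet
  --subnetwork=projects/my-project/regions/us-central1/subnetworks/my-subnet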

--zones=[ZONE,…]
List of Compute Engine zones the pipeline can run in.

If no zones are specified with the --zones flag, then the zones in the pipeline definition file will be used.

If no zones are specified in the pipeline definition, then the default zone in your local client configuration is used.

If you have no default zone then at least one zone or region must be specified.

For more information on default zones, see https://cloud.google.com/compute/docs/gcloud-compute/#set_default_zone_and_region_in_your_local_client
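For example, to restrict the pipeline to two specific zones:
  --zones=us-central1-a,us-central1-b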

GCLOUD WIDE FLAGS
These flags are available to all commands: --access-token-file, --account, --billing-project, --configuration, --flags-file, --flatten, --format, --help, --impersonate-service-account, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity.

Run $ gcloud help for details.

NOTES
This command is currently in alpha and might change without notice. If this command fails with API permission errors despite specifying the correct project, you might be trying to access an API with an invitation-only early access allowlist.