public interface BigQuerySourceOrBuilder extends MessageOrBuilder

Implements
MessageOrBuilder

Methods
getDataSchema()

public abstract String getDataSchema()

The schema to use when parsing the data from the source. Supported values for imports:

- user_event (default): One JSON UserEvent per line.
- document (default): One JSON Document per line. Each document must have a valid [document.id][].

`string data_schema = 6;`

| Type | Description |
| --- | --- |
| String | The dataSchema. |
getDataSchemaBytes()

public abstract ByteString getDataSchemaBytes()

The schema to use when parsing the data from the source. Supported values for imports:

- user_event (default): One JSON UserEvent per line.
- document (default): One JSON Document per line. Each document must have a valid [document.id][].

`string data_schema = 6;`

| Type | Description |
| --- | --- |
| ByteString | The bytes for dataSchema. |
getDatasetId()

public abstract String getDatasetId()

Required. The BigQuery data set to copy the data from, with a length limit of 1,024 characters.

`string dataset_id = 2 [(.google.api.field_behavior) = REQUIRED];`

| Type | Description |
| --- | --- |
| String | The datasetId. |
getDatasetIdBytes()

public abstract ByteString getDatasetIdBytes()

Required. The BigQuery data set to copy the data from, with a length limit of 1,024 characters.

`string dataset_id = 2 [(.google.api.field_behavior) = REQUIRED];`

| Type | Description |
| --- | --- |
| ByteString | The bytes for datasetId. |
getGcsStagingDir()

public abstract String getGcsStagingDir()

Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Can be specified if you want the BigQuery export to go to a specific Cloud Storage directory.

`string gcs_staging_dir = 4;`

| Type | Description |
| --- | --- |
| String | The gcsStagingDir. |
getGcsStagingDirBytes()

public abstract ByteString getGcsStagingDirBytes()

Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Can be specified if you want the BigQuery export to go to a specific Cloud Storage directory.

`string gcs_staging_dir = 4;`

| Type | Description |
| --- | --- |
| ByteString | The bytes for gcsStagingDir. |
getPartitionCase()

public abstract BigQuerySource.PartitionCase getPartitionCase()

| Type | Description |
| --- | --- |
| BigQuerySource.PartitionCase | |
getPartitionDate()

public abstract Date getPartitionDate()

BigQuery time-partitioned table's _PARTITIONDATE in YYYY-MM-DD format.

`.google.type.Date partition_date = 5;`

| Type | Description |
| --- | --- |
| com.google.type.Date | The partitionDate. |
getPartitionDateOrBuilder()

public abstract DateOrBuilder getPartitionDateOrBuilder()

BigQuery time-partitioned table's _PARTITIONDATE in YYYY-MM-DD format.

`.google.type.Date partition_date = 5;`

| Type | Description |
| --- | --- |
| com.google.type.DateOrBuilder | |
getProjectId()

public abstract String getProjectId()

The project ID (can be project # or ID) that the BigQuery source is in, with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.

`string project_id = 1;`

| Type | Description |
| --- | --- |
| String | The projectId. |
getProjectIdBytes()

public abstract ByteString getProjectIdBytes()

The project ID (can be project # or ID) that the BigQuery source is in, with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.

`string project_id = 1;`

| Type | Description |
| --- | --- |
| ByteString | The bytes for projectId. |
getTableId()

public abstract String getTableId()

Required. The BigQuery table to copy the data from, with a length limit of 1,024 characters.

`string table_id = 3 [(.google.api.field_behavior) = REQUIRED];`

| Type | Description |
| --- | --- |
| String | The tableId. |
getTableIdBytes()

public abstract ByteString getTableIdBytes()

Required. The BigQuery table to copy the data from, with a length limit of 1,024 characters.

`string table_id = 3 [(.google.api.field_behavior) = REQUIRED];`

| Type | Description |
| --- | --- |
| ByteString | The bytes for tableId. |
hasPartitionDate()

public abstract boolean hasPartitionDate()

BigQuery time-partitioned table's _PARTITIONDATE in YYYY-MM-DD format.

`.google.type.Date partition_date = 5;`

| Type | Description |
| --- | --- |
| boolean | Whether the partitionDate field is set. |
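As a sketch of how these accessors fit together, the following assumes the standard protobuf-generated pattern for this library: a concrete `BigQuerySource` class with a `newBuilder()` that implements `BigQuerySourceOrBuilder`, plus the `com.google.type.Date` message. The import paths, project/dataset/table names, and the `"document"` schema value are illustrative assumptions, not confirmed by this page; the example is not a definitive usage of the library.

```java
import com.google.type.Date;

public class BigQuerySourceExample {
  public static void main(String[] args) {
    // Build a source following the field numbers documented above.
    BigQuerySource source = BigQuerySource.newBuilder()
        .setProjectId("my-project")                 // field 1; inherits the parent request's project if unset
        .setDatasetId("my_dataset")                 // field 2; REQUIRED
        .setTableId("my_table")                     // field 3; REQUIRED
        .setGcsStagingDir("gs://my-bucket/staging") // field 4; optional intermediate directory
        .setPartitionDate(Date.newBuilder()         // field 5; the table's _PARTITIONDATE
            .setYear(2023).setMonth(6).setDay(1))
        .setDataSchema("document")                  // field 6; one JSON Document per line
        .build();

    // A built message can be read through the OrBuilder interface.
    BigQuerySourceOrBuilder reader = source;
    if (reader.hasPartitionDate()) {
      System.out.println(reader.getPartitionDate().getYear());
    }
    System.out.println(reader.getDatasetId());
  }
}
```

Note that `hasPartitionDate()` is worth checking before `getPartitionDate()`: `partition_date` is part of the `partition` oneof (see `getPartitionCase()`), so the getter returns a default `Date` instance when the field is unset.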