You can troubleshoot issues with version 2.8 (latest) of the BigQuery Connector for SAP by using both SAP LT Replication Server application logging and Google Cloud logs to review error and log messages.
BigQuery Connector for SAP sends all errors to the standard SAP LT Replication Server application logs.
You can also use SAP LT Replication Server debugging to isolate problems.
Troubleshooting overview
When you troubleshoot issues with BigQuery Connector for SAP, you might need to focus your attention on several different areas, depending on the scope of the issue that you are investigating:
- The infrastructure, such as the network, hardware, or operating system.
- The SAP software, including the source server and SAP LT Replication Server.
- BigQuery Connector for SAP.
- BigQuery, including the BigQuery API and the target table.
Involve the right teams
When you troubleshoot an issue, first determine which of the preceding areas the issue occurs in and what its scope is.
To resolve an issue, you might need to work with multiple teams, such as your Basis administrators, your SAP LT Replication Server administrators, your DBAs, or your Google Cloud security administrators.
Getting the right teams and skills involved early can help you resolve your issues more quickly.
Determine the root cause
Determine the root cause of your issue, and make sure that what looks like the cause is not actually just a symptom of a root cause that lies elsewhere.
SAP systems are tightly integrated, but can write logs and traces to files in multiple different locations. When you are troubleshooting, you need to determine the correct logs and trace files to look at.
Check software requirements and prerequisites
Make sure that all system software is running at the required minimum versions and that all BigQuery Connector for SAP prerequisites have been met.
For information about BigQuery Connector for SAP installation prerequisites, see:
If SAP LT Replication Server is running on a Compute Engine VM, see Prerequisites.
If SAP LT Replication Server is running on a host that is external to Google Cloud, see Prerequisites.
For BigQuery Connector for SAP software requirements, see Software requirements.
If you are using older ECC software, make sure that your SAP LT Replication Server version is compatible with your ECC version. For more information, see SAP Note 2577774 - Version compatibility for source and target systems - SLT.
For the SAP source system and SAP LT Replication Server, make sure that you implement all correction notes for the ABAP-based Migration and Replication Technology. For more information, see SAP Note 3016862 - DMIS Note Analyzers with separated scenarios for ABAP-based Migration and Replication Technology.
Read the SAP support documentation
If you have an SAP user account, you can find the resolution to many SAP software issues by reading the SAP Notes and SAP Knowledge Base Articles that are available in the SAP ONE Support Launchpad.
Logging
BigQuery Connector for SAP sends its log messages to SAP LT Replication Server, where you can view them in the SAP LT Replication Server application logs.
These messages include the messages that BigQuery Connector for SAP receives from the BigQuery API.
General BigQuery log messages can be viewed in the Google Cloud console.
SAP LT Replication Server application logs
All error messages are saved to the standard SAP LT Replication Server application logs. Check the application logs to analyze and troubleshoot the root cause of issues.
You can display the application logs that are specific to your SAP LT Replication Server configuration by running transaction LTRC, opening your configuration, and selecting Application Logs.
When the logs are displayed, select a table row, and then click the corresponding button to display any error messages. On the Application Logs tab for a replication, you can filter the runtime log messages that are relevant to that replication.
Messages generated by BigQuery Connector for SAP
Any errors that occur in BigQuery Connector for SAP before records are sent to BigQuery are prefixed by /GOOG/SLT.
Any errors that are returned from the BigQuery API, including any HTTP errors, are prefixed by /GOOG/MSG.
If an error is not prefixed by either of these values, then the error was issued by SAP LT Replication Server.
BigQuery logs
BigQuery writes various log entries to Cloud Logging, which you can view in the Google Cloud console.
To view BigQuery log entries:
In the Google Cloud console, open Logging.
In the Query editor, specify a BigQuery resource. For example:
resource.type="bigquery_dataset"
For more information about BigQuery logs, see Logs.
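If you prefer to retrieve these log entries programmatically, the following is a minimal Python sketch that uses the google-cloud-logging client library with the same filter syntax as the Query editor. The project ID is a placeholder, not a value from this guide.

```python
# Minimal sketch: list recent BigQuery log entries with the
# google-cloud-logging client. Replace "my-project" with your project ID.
from google.cloud import logging

client = logging.Client(project="my-project")

# Same filter syntax as the Logging query editor.
log_filter = 'resource.type="bigquery_dataset" AND severity>=ERROR'

for entry in client.list_entries(filter_=log_filter, max_results=20):
    print(entry.timestamp, entry.severity, entry.payload)
```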
HTTP trace
While troubleshooting errors, you can enable the HTTP trace in transaction SMICM or ST05.
To limit the impact on performance, disable HTTP trace as soon as you are done.
Debugging
If you have the required authorization, you can debug the Business Add-In (BAdI) code of BigQuery Connector for SAP.
To debug the BAdI code:
If you don't already have the SAP authorizations that are required for debugging BAdI code, request them from your SAP administrator.
Enable debugging by typing /h in the transaction entry field in the SAP GUI, and then pressing Enter.
In the Settings menu, select Change Debugger Profile/Settings.
Under Debug Modes, make sure that System Debugging is selected.
Set external breakpoints in the code as needed.
Monitoring
You can monitor several different points along the data path from the SAP data source to the target BigQuery table, including:
- Infrastructure - network, hardware, and operating system
- The SAP database layer
- The SAP application layer
- BigQuery Connector for SAP
- BigQuery
For more information about monitoring at each of these points, see the BigQuery Connector for SAP operations guide.
Data reconciliation
There are three points at which you can check record counts:
- The source table
- The SAP LT Replication Server load or replication statistics in transaction LTRC
- The BigQuery target table
You can use the Replication Validation tool to check and compare record counts, or you can retrieve the record counts yourself by running SQL queries.
For more information about data reconciliation, see the BigQuery Connector for SAP operations guide.
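To compare record counts programmatically, the following is a minimal Python sketch that uses the BigQuery client library to count the records in the target table. The project, dataset, and table names are placeholders; compare the result against the source table count and the LTRC statistics.

```python
# Minimal sketch: get the record count of the BigQuery target table with
# the BigQuery Python client. The project, dataset, and table names are
# placeholders for your own values.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

query = """
    SELECT COUNT(*) AS record_count
    FROM `my-project.my_dataset.my_sap_table`
"""

for row in client.query(query).result():
    print(f"BigQuery target table record count: {row.record_count}")
```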
Common configuration issues
This section contains resolutions for common issues that can occur during the initial setup and configuration of BigQuery Connector for SAP.
SAP LT Replication Server support for database data types
Depending on your SAP software versions, SAP LT Replication Server might not support some data types in a source database. For more information, see the Important Considerations section of SAP Note 1605140 - SAP Landscape Transformation Replication Server (SLT).
Issue: OS command for access token not working in SAP LT Replication Server
Issue: You created an operating system (OS) command to print the access token, but it is not working in SAP LT Replication Server.
Cause: This issue can have multiple causes, but the most likely cause is that the environment variables that are required by the OS command are not configured correctly.
Resolution: Confirm that the OS command was configured correctly. Try running the printenv command both from the OS as the SID_LCadm user and from SAP transaction SM69, and then compare the output. If the variables returned in transaction SM69 are incomplete, then try restarting SAP LT Replication Server to register the variables.
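To rule out the credentials themselves, you can verify that an access token is obtainable on the host independently of SAP. The following is a minimal Python sketch that uses the google-auth library; it assumes that Application Default Credentials (for example, through GOOGLE_APPLICATION_CREDENTIALS) are configured on the host.

```python
# Minimal sketch: verify that an access token can be obtained on the host,
# independently of SAP LT Replication Server. Assumes Application Default
# Credentials are configured.
import google.auth
from google.auth.transport.requests import Request

scopes = ["https://www.googleapis.com/auth/bigquery"]
credentials, project_id = google.auth.default(scopes=scopes)

credentials.refresh(Request())  # Fetch a fresh token.
print(f"Project: {project_id}")
print(f"Token (truncated): {credentials.token[:20]}...")
```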
Issue: /GOOG/MSG: 400 - Bad Request Request contains an invalid argument
Issue: Data transfer failed with the error message /GOOG/MSG: 400 - Bad Request Request contains an invalid argument.
Cause: For your SAP workload that is running on Google Cloud, the access scope used in the client key table /GOOG/CLIENT_KEY is not valid.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter the /GOOG/SLT_SETTINGS transaction preceded by /n: /n/GOOG/SLT_SETTINGS
For the mass transfer ID that failed in transaction LTRC, note the value of the Google Cloud Key Name field.
Enter transaction SM30, and then open the table /GOOG/CLIENT_KEY.
For the Google Cloud Key Name value that you noted in a preceding step, make sure that the value of the Scope field matches the access scope mentioned in Specify access settings in /GOOG/CLIENT_KEY. Make sure that there is no space entered in the field.
Rerun your replication.
Issue: /GOOG/MSG : 400 - ICM_HTTP_CONNECTION_FAILED
Issue: Data transfer failed with the error message /GOOG/MSG : 400 - ICM_HTTP_CONNECTION_FAILED.
Cause: In the RFC destinations that BigQuery Connector for SAP uses to connect to Google Cloud APIs, the value for the Path Prefix or the Target Host field is incorrect.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter the /GOOG/SLT_SETTINGS transaction preceded by /n: /n/GOOG/SLT_SETTINGS
For the mass transfer ID that failed in transaction LTRC, note the value of the Google Cloud Key Name field.
Enter transaction SM30, and then open the table /GOOG/SERVIC_MAP.
For the Google Cloud Key Name value that you noted in a preceding step, note the RFC destination names.
Enter transaction SM59, and then complete the following steps:
- For the RFC destination that connects to BigQuery, make sure that the Path Prefix field value is /bigquery/v2/.
- For the RFC destination that connects to BigQuery, make sure that the Target Host field value is bigquery.googleapis.com.
- For the RFC destination that connects to IAM, make sure that the Path Prefix field value is /v1/.
- For the RFC destination that connects to IAM, make sure that the Target Host field value is iamcredentials.googleapis.com.
Rerun your replication or initial load.
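To confirm that these endpoints are reachable at the network level from the SAP LT Replication Server host, the following is a minimal Python sketch that opens a TLS connection to each target host on port 443. It is a connectivity check only, not part of the connector.

```python
# Minimal sketch: confirm that the API endpoints used by the RFC
# destinations are reachable over TLS on port 443 from this host.
import socket
import ssl

for host in ("bigquery.googleapis.com", "iamcredentials.googleapis.com"):
    context = ssl.create_default_context()
    try:
        with socket.create_connection((host, 443), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                print(f"{host}: reachable, TLS {tls.version()}")
    except OSError as err:
        print(f"{host}: connection failed: {err}")
```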
Issue: /GOOG/MSG : 401 - Unauthorized Request is missing required authentication credential. Expected OAuth 2 access to ken, login coo
Issue: Data transfer failed with the error message /GOOG/MSG : 401 - Unauthorized Request is missing required authentication credential. Expected OAuth 2 access to ken, login coo.
Cause: The HTTP port configuration is missing.
Resolution: Both HTTP and HTTPS ports must be created and be active in your SAP system.
The VM metadata is stored on a metadata server, which is only accessible through an HTTP port. Therefore, you must ensure that an HTTP port along with an HTTPS port is created and active in order to access the VM metadata.
To resolve this issue, complete the following steps:
In the SAP GUI, enter transaction code SMICM.
On the menu bar, click Goto > Services.
Make sure that the HTTP and HTTPS ports are created and active. A green check in the Actv column indicates that the HTTP and HTTPS ports are active.
Rerun your replication.
For information about configuring the HTTP and HTTPS ports, see HTTP(S) Settings in ICM.
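On a Compute Engine VM, you can also confirm that the metadata server itself is reachable over plain HTTP. The following is a minimal Python sketch that queries a standard metadata endpoint; it assumes the requests library is installed on the host.

```python
# Minimal sketch: confirm that the VM metadata server is reachable over
# plain HTTP from the SAP LT Replication Server host (Compute Engine only).
import requests

url = ("http://metadata.google.internal/computeMetadata/v1/"
       "instance/service-accounts/default/email")

response = requests.get(url, headers={"Metadata-Flavor": "Google"}, timeout=5)
response.raise_for_status()
print(f"Default service account: {response.text}")
```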
Issue: /GOOG/MSG : 401 - ICM_HTTP_CONNECTION_BROKEN
Issue: During initial load or replication, in the LTRC transaction, data transfer failed with the error message /GOOG/MSG : 401 - ICM_HTTP_CONNECTION_BROKEN.
Cause: For your SAP workload that is running on Google Cloud, in the RFC destinations that BigQuery Connector for SAP uses to connect to Google Cloud APIs, SSL is not activated.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter the /GOOG/SLT_SETTINGS transaction preceded by /n: /n/GOOG/SLT_SETTINGS
For the mass transfer ID that failed in transaction LTRC, note the value of the Google Cloud Key Name field.
Enter transaction SM30, and then open the table /GOOG/SERVIC_MAP.
For the Google Cloud Key Name value that you noted in a preceding step, note the RFC destination names.
Enter transaction SM59, and then for the RFC destinations that you noted in the previous step, perform the following steps:
- Go to the Logon and Security tab.
- For the SSL Certificate field, make sure that the option DFAULT SSL Client (Standard) is selected.
- For the Service No. field, make sure that the value 443 is specified.
Rerun your replication.
Issue: /GOOG/MSG: 110 - HTTPIO_PLG_CANCELED
Issue: During initial load or replication, in the LTRC transaction, data transfer failed with the error message /GOOG/MSG: 110 - HTTPIO_PLG_CANCELED.
Cause: The HTTP port configuration is missing.
Resolution: Both HTTP and HTTPS ports must be created and be active in your SAP system.
The VM metadata is stored on a metadata server, which is only accessible through an HTTP port. Therefore, you must ensure that an HTTP port along with an HTTPS port is created and active in order to access the VM metadata.
To resolve this issue, complete the following steps:
In the SAP GUI, enter transaction code SMICM.
On the menu bar, click Goto > Services.
Make sure that the HTTP and HTTPS ports are created and active. A green check in the Actv column indicates that the HTTP and HTTPS ports are active.
Rerun your replication.
For information about configuring the HTTP and HTTPS ports, see HTTP(S) Settings in ICM.
Issue: /GOOG/MSG: 403 - SSL is required to perform this operation
Issue: Data transfer failed with the error message /GOOG/MSG: 403 - SSL is required to perform this operation.
Cause: For your SAP workload that is running on Google Cloud, in the RFC destinations that BigQuery Connector for SAP uses to connect to Google Cloud APIs, SSL is not activated.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter the /GOOG/SLT_SETTINGS transaction preceded by /n: /n/GOOG/SLT_SETTINGS
For the mass transfer ID that failed in transaction LTRC, note the value of the Google Cloud Key Name field.
Enter transaction SM30, and then open the table /GOOG/SERVIC_MAP.
For the Google Cloud Key Name value that you noted in a preceding step, note the RFC destination names.
Enter transaction SM59, and then for the RFC destinations that you noted in the previous step, perform the following steps:
- Go to the Logon and Security tab.
- For the SSL Certificate field, make sure that the option DFAULT SSL Client (Standard) is selected.
- For the Service No. field, make sure that the value 443 is specified.
Rerun your replication.
Issue: /GOOG/MSG: 403 - Request had insufficient authentication scopes
Issue: Data transfer failed with the error message /GOOG/MSG: 403 - Request had insufficient authentication scopes.
Cause: For your SAP workload that is running on Google Cloud, in the table /GOOG/CLIENT_KEY, the specified service account does not have the required scope to access BigQuery.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter the /GOOG/SLT_SETTINGS transaction preceded by /n: /n/GOOG/SLT_SETTINGS
For the mass transfer ID that failed in transaction LTRC, note the value of the Google Cloud Key Name field.
Enter transaction SM30, and then open the table /GOOG/CLIENT_KEY.
For the Google Cloud Key Name value that you noted in a preceding step, make sure that the value specified for the Service Account Name field is default.
In the Google Cloud console, go to the Compute Engine VM instances page.
Click the VM instance that hosts your SAP LT Replication Server.
Click Stop, and then follow the instructions to stop the VM instance.
Click Edit, edit the service account Access scopes to enable access to BigQuery, and then click Save.
Click Start / Resume to restart the VM instance.
Ensure that your SAP LT Replication Server is running.
Rerun your replication.
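You can confirm the scopes that are currently granted to the VM's service account by querying the metadata server. The following is a minimal Python sketch to run on the VM; it assumes the requests library is installed.

```python
# Minimal sketch: list the access scopes granted to the VM's service
# account by querying the metadata server (run on the Compute Engine VM).
import requests

url = ("http://metadata.google.internal/computeMetadata/v1/"
       "instance/service-accounts/default/scopes")

response = requests.get(url, headers={"Metadata-Flavor": "Google"}, timeout=5)
response.raise_for_status()

print("Granted access scopes:")
for scope in response.text.split():
    print(f"  {scope}")
# BigQuery access requires an appropriate scope, such as
# https://www.googleapis.com/auth/bigquery or .../auth/cloud-platform.
```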
Issue: /GOOG/MSG: 403 - Access Denied: Dataset PROJECT_ID:DATASET_NAME: Permission bigquery.tables.created denied on dataset
Issue: Data transfer failed with the error message /GOOG/MSG: 403 - Access Denied: Dataset PROJECT_ID:DATASET_NAME: Permission bigquery.tables.created denied on dataset.
Cause: For your SAP workload that is running on Google Cloud, in the table /GOOG/CLIENT_KEY, the specified service account does not have the required permissions to access the BigQuery API.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter the /GOOG/SLT_SETTINGS transaction preceded by /n: /n/GOOG/SLT_SETTINGS
For the mass transfer ID that failed in transaction LTRC, note the value of the Google Cloud Key Name field.
Enter transaction SM30, and then open the table /GOOG/CLIENT_KEY.
For the Google Cloud Key Name value that you noted in a preceding step, note the value specified for the Service Account Name field.
In the Google Cloud console, go to the Identity and Access Management Service accounts page.
Select the service account that you noted in a preceding step.
Ensure that the service account has the IAM roles that BigQuery Connector for SAP requires to access BigQuery, as given in Google Cloud Identity and Access Management.
Rerun your replication.
Issue: /GOOG/MSG: 404 - Not Found
Issue: Data transfer failed with the error message /GOOG/MSG: 404 - Not Found.
Cause: In the RFC destinations that BigQuery Connector for SAP uses to connect to Google Cloud APIs, the path prefix is not correct.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter the /GOOG/SLT_SETTINGS transaction preceded by /n: /n/GOOG/SLT_SETTINGS
For the mass transfer ID that failed in transaction LTRC, note the value of the Google Cloud Key Name field.
Enter transaction SM30, and then open the table /GOOG/SERVIC_MAP.
For the Google Cloud Key Name value that you noted in a preceding step, note the RFC destination names.
Enter transaction SM59, and then complete the following steps:
- For the RFC destination that connects to BigQuery, make sure that the Path Prefix field value is /bigquery/v2/.
- For the RFC destination that connects to IAM, make sure that the Path Prefix field value is /v1/.
Rerun your replication.
Issue: /GOOG/MSG: 404 - Table PROJECT_ID:DATASET_NAME.TABLE_NAME not found
Issue: Data transfer failed with the error message /GOOG/MSG: 404 - Table PROJECT_ID:DATASET_NAME.TABLE_NAME not found.
Cause: In the RFC destinations that BigQuery Connector for SAP uses to connect to Google Cloud APIs, the value that you have specified for the Target Host field doesn't match any DNS name in Cloud DNS.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter transaction code SE38, and then open the report /GOOG/R_SLT_SETTINGS.
For the LTRC transaction that was run, open the mass transfer ID, and then note the value in the Google Cloud Key Name column.
Enter transaction SM30, and then open the table /GOOG/CLIENT_KEY in display mode.
Search the table /GOOG/SERVIC_MAP using the Google Cloud Key Name that you noted in a preceding step, and then note the specified RFC destination names.
Enter transaction code SM59.
For the RFC destinations that you use to connect to the BigQuery and IAM APIs, note the value specified for the Target Host field.
In the Google Cloud console, go to the Network services Cloud DNS page.
Click the private zone that contains the DNS records for the Private Service Connect endpoints, which you created to allow BigQuery Connector for SAP to privately connect to the BigQuery and IAM APIs.
Ensure that there is a DNS record, with matching DNS name, for each of the target host values that you noted in a preceding step.
Rerun your replication.
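To confirm the DNS records from the SAP LT Replication Server host, the following is a minimal Python sketch that resolves each target host. With Private Service Connect, each name is expected to resolve to your endpoint's internal IP address; the host names shown are the default API endpoints and may differ in your setup.

```python
# Minimal sketch: confirm that the Target Host values from SM59 resolve
# in DNS from the SAP LT Replication Server host.
import socket

for host in ("bigquery.googleapis.com", "iamcredentials.googleapis.com"):
    try:
        infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
        addresses = sorted({info[4][0] for info in infos})
        print(f"{host} resolves to: {', '.join(addresses)}")
    except socket.gaierror as err:
        print(f"{host}: DNS resolution failed: {err}")
```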
Issue: /GOOG/MSG: 404 - Not Found Requested entity was not found
Issue: Data transfer failed with the error message /GOOG/MSG: 404 - Not Found Requested entity was not found.
Cause: For your workload that is running on Google Cloud, the service account used in the client key table /GOOG/CLIENT_KEY is not valid.
Resolution: To resolve this issue, complete the following steps:
In the SAP GUI, enter the /GOOG/SLT_SETTINGS transaction preceded by /n: /n/GOOG/SLT_SETTINGS
For the mass transfer ID that failed in transaction LTRC, note the value of the Google Cloud Key Name field.
Enter transaction SM30, and then open the table /GOOG/CLIENT_KEY.
For the Service Account Name field, make sure that the value specified is the email address of the service account that was created for BigQuery Connector for SAP in the step Create a service account.
Rerun your replication.
Issue: /GOOG/MSG: 418 - Data transfer failed with error message from SAP
Issue: Data transfer failed with an error message from SAP, for example No OS command defined for the key.
Cause: For your SAP workload that is not running on Google Cloud, this issue can be caused by the following circumstances:
OS command that prints access token is not configured correctly
Causes: This issue can be caused by the following circumstances:
- You have created an operating system (OS) command to print the access token, but have not added it to the access settings in the client key table /GOOG/CLIENT_KEY.
- The OS command that you created in transaction SM69 failed to retrieve an access token from Google Cloud.
Resolution: In the client key table /GOOG/CLIENT_KEY, make sure that the value entered in the Command name field matches the name of the command that you created to print the access token.
RFC is not configured correctly
Cause: For your SAP workload that is running on Google Cloud, in the RFC destinations that BigQuery Connector for SAP uses to connect to Google Cloud APIs, the Target Host field value is incorrect.
Resolution: To resolve this issue:
In the SAP GUI, enter transaction SE38, and then open the report /GOOG/R_SLT_SETTINGS.
For the LTRC transaction that was run, open the mass transfer ID and note the value of Google Cloud Key Name.
Enter transaction SM30, and then open the table /GOOG/CLIENT_KEY in display mode.
Search the table /GOOG/SERVIC_MAP using the Google Cloud Key Name that you noted in a preceding step, and then note the specified RFC destination names.
Enter transaction code SM59, and then open the RFC destinations that you noted in the previous step.
For the Target Host field, correct the specified URL.
Rerun your replication.
Issue: /GOOG/MSG: 413 - Request Entity Too Large
Issue: Data transfer failed with the error message /GOOG/MSG: 413 - Request Entity Too Large.
Cause: This issue occurs when the byte size of the chunk that is sent by BigQuery Connector for SAP exceeds the maximum byte size for HTTP requests that BigQuery accepts. This can occur when the size of the table records or the amount of data that the records contain causes the byte size of a chunk to grow beyond the BigQuery limit.
Resolution: Reduce the size of the chunks that are sent by BigQuery Connector for SAP for your table. You can adjust the chunk size by running the transaction /GOOG/SLT_SETTINGS, or enable dynamic chunk size to adjust the chunk size automatically. For more information, see the sections on specifying table attributes and dynamic chunk size in the BigQuery Connector for SAP documentation.
Issue: /GOOG/MSG: 503 - HTTP Communication Failure - SSL client SSL Client (Standard)
Issue: Data transfer failed with the error message /GOOG/MSG: 503 - HTTP Communication Failure - SSL client SSL Client (Standard).
Cause: For your SAP workload that is running on Google Cloud, in the RFC destinations that BigQuery Connector for SAP uses to connect to Google Cloud APIs, the Target Host field value is incorrect.
Resolution: To resolve this issue, see the resolution steps in RFC is not configured correctly.
Issue: /GOOG/MSG: 503 - HTTP Communication Failure exception occurred during the request sending
Issue: Data transfer failed with the error message /GOOG/MSG: 503 - HTTP Communication Failure exception occurred during the request sending.
Cause: This issue can be caused by connection or network issues.
Resolution: Validate your connection and make sure that your network is set up correctly, is running without errors, and is not congested.
Issue: /GOOG/MSG: 503 - HTTP Communication Failure exception occurred during the response receiving
Issue: Data transfer failed with the error message /GOOG/MSG: 503 - HTTP Communication Failure exception occurred during the response receiving.
This issue can be caused by the following circumstances:
- SSL is not activated in your RFC destinations
- SSL handshake failed
- Byte size of chunks exceeded the maximum byte size for HTTP requests that BigQuery accepts
SSL is not activated in your RFC destinations
Cause: In the RFC destinations that BigQuery Connector for SAP uses to connect to Google Cloud APIs, the security option for using SSL certificate is not activated.
Resolution: To resolve this issue, see the resolution steps in Issue: /GOOG/MSG: 403 - SSL is required to perform this operation.
SSL handshake failed
Cause: The SSL handshake failed between the SAP LT Replication Server host and the BigQuery API endpoint. This occurs when the certificate presented by the TLS server is not valid for the target hostname that is supplied by SAP LT Replication Server, possibly because client-side sending of the optional TLS extension SNI is not implemented in your NetWeaver kernel.
Resolution: In transaction SMICM, look for the return code SSLERR_SERVER_CERT_MISMATCH. If you find the return code SSLERR_SERVER_CERT_MISMATCH, then you need to enable sending of the TLS extension SNI. Also, make sure that your NetWeaver kernel implements client-side sending of the optional TLS extension SNI.
To enable sending of the TLS extension SNI, set the profile parameter icm/HTTPS/client_sni_enabled or ssl/client_sni_enabled to TRUE, depending upon your NetWeaver kernel version. For more information from SAP, see:
- SAP Note 510007 - Additional considerations for setting up SSL on Application Server ABAP
- SAP Note 2582368 - SapSSL update for client-side sending of TLS extension SNI by saphttp, sapkprotp
- SAP Note 2124480 - ICM / Web Dispatcher: TLS Extension Server Name Indication (SNI) as client
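To test the handshake outside of SAP, the following is a minimal Python sketch that performs a TLS handshake with the SNI extension, which Python sends whenever server_hostname is set. If this succeeds from the host while SAP fails with SSLERR_SERVER_CERT_MISMATCH, the NetWeaver kernel is likely not sending SNI. The host name is the default BigQuery endpoint and is an assumption.

```python
# Minimal sketch: verify that a TLS handshake with SNI succeeds and that
# the certificate matches the hostname.
import socket
import ssl

host = "bigquery.googleapis.com"
context = ssl.create_default_context()  # Verifies the cert and hostname.

with socket.create_connection((host, 443), timeout=10) as sock:
    # server_hostname makes Python send the TLS SNI extension.
    with context.wrap_socket(sock, server_hostname=host) as tls:
        subject = dict(x[0] for x in tls.getpeercert()["subject"])
        print(f"Handshake with SNI succeeded; CN={subject.get('commonName')}")
```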
Byte size of chunks exceeded the maximum byte size for HTTP requests that BigQuery accepts
Cause: The byte size of the chunk that is sent by BigQuery Connector for SAP exceeded the maximum byte size for HTTP requests that BigQuery accepts. This can occur when the size of the table records or the amount of data that the records contain causes the byte size of a chunk to grow beyond the BigQuery limit.
Resolution: Reduce the size of the chunks that are sent by BigQuery Connector for SAP for this table. You can adjust the chunk size by running the transaction /GOOG/SLT_SETTINGS, or enable dynamic chunk size to adjust the chunk size automatically. For more information, see the sections on specifying table attributes and dynamic chunk size in the BigQuery Connector for SAP documentation.
Issue: /GOOG/MSG: 404 - Not found: Dataset DATASET_NAME
Issue: When attempting to either validate Google Cloud security or to load data into a BigQuery table, you receive the message /GOOG/MSG: 404 - Not found: Dataset DATASET_NAME.
Cause: This issue can be caused by the following circumstances:
- The BigQuery dataset was not created yet.
- The dataset name is not specified correctly in the mass transfer configuration.
- The replication configuration in SAP LT Replication Server needs to be activated.
Resolution: Try the following resolutions:
- Confirm that the dataset has been created in BigQuery.
- Check that the dataset name in the mass transfer configuration is the same as the dataset name in BigQuery.
- Run the LTRC transaction, and then deactivate and reactivate the replication configuration.
Issue: Mass Transfer Key can not be found for Mass Transfer ID XXX
Issue: You receive the error /GOOG/SLT: Mass Transfer Key can not be found for Mass Transfer ID XXX.
Cause: This issue can be caused by the following circumstances:
- A mass transfer configuration does not exist for the specified mass transfer ID.
- The corresponding replication configuration is not active.
Resolution: To resolve the issue, perform one of the following actions:
- Run the /GOOG/SLT_SETTINGS transaction and confirm that the mass transfer ID is correctly specified.
- Run the LTRC transaction, and then deactivate and reactivate the replication configuration.
Issue: /GOOG/SLT : Unable to interpret VALUE as a BOOLEAN
Issue: The load or replication of a record fails with the message /GOOG/SLT : Unable to interpret VALUE as a BOOLEAN.
Cause: This issue is caused by the mapping of a field in the source table to the BigQuery data type BOOLEAN, but the data in the source field does not resolve to a boolean.
Resolution: To resolve the issue, use transaction /GOOG/SLT_SETTINGS to either change the data type that the source field is mapped to or remove the data type mapping and accept the default data type.
Issue: /GOOG/SLT: Failed to convert field SAP_FIELD_NAME value to field BIGQUERY_FIELD_NAME: ERROR_DETAILS
Issue: The load or replication of a record fails with the message /GOOG/SLT: Failed to convert field SAP_FIELD_NAME value to field BIGQUERY_FIELD_NAME: ERROR_DETAILS.
Cause: Either the source field contains an invalid value or the source field is mapped to a BigQuery data type that is not a valid mapping for the data that the source field contains.
Resolution: To resolve the issue, use transaction /GOOG/SLT_SETTINGS to either change the data type that the source field is mapped to, or remove the data type mapping and accept the default mapping for the data type.
Issue: /GOOG/MSG : Client key is not found in /GOOG/CLIENT_KEY table
Issue: A load or replication does not start, with the message /GOOG/MSG: Client key is not found in /GOOG/CLIENT_KEY table.
Cause: Either the client key does not exist or it was specified incorrectly in the mass transfer configuration of transaction /GOOG/SLT_SETTINGS.
Resolution: To resolve the issue, either use transaction SM30 to create the client key, or use transaction /GOOG/SLT_SETTINGS to correct the specification of the client key value in the mass transfer configuration.
Issue: /GOOG/MSG: DESCRIPTION_OF_ISSUE error occurred in chunk ranging START_INDEX_OF_FAILED_CHUNK - END_INDEX_OF_FAILED_CHUNK
Issue: Replication of a chunk failed with the error message /GOOG/MSG: DESCRIPTION_OF_ISSUE error occurred in chunk ranging START_INDEX_OF_FAILED_CHUNK - END_INDEX_OF_FAILED_CHUNK.
Cause: This can have more than one cause, including Invalid JSON Payload, Quota Exceeded, Request Entity Too Large, or HTTP Communication Failure.
The error message for the chunk that failed to replicate to BigQuery is shown with the start and end index of the chunk. This error message is shown if you have not set the BREAK flag in transaction /GOOG/SLT_SETTINGS. When the BREAK flag is not set, BigQuery Connector for SAP continues sending records to BigQuery by sending the next chunk even when an error is encountered.
Resolution: Try the following resolutions:
- For Quota Exceeded, Request Entity Too Large, or HTTP Communication Failure issues, follow the troubleshooting steps for the corresponding issue in this guide.
- Stop the current load, delete the target table from BigQuery, and then restart a fresh load.
- To stop sending data to BigQuery and terminate the replication job when a chunk with an error is encountered, set the BREAK flag, which is recommended in production environments.
For information about configuring the BREAK flag, see:
flag, see:- If SAP LT Replication Server is running on a Compute Engine VM, see Specify table creation and other general attributes.
- If SAP LT Replication Server is running on a host that is external to Google Cloud, see Specify table creation and other general attributes.
Issue: DESCRIPTION_OF_ISSUE while signing JWT using profile KEY_FILE_NAME.pse. Check JWT config in STRUST
Issue: You receive the error DESCRIPTION_OF_ISSUE while signing JWT using profile KEY_FILE_NAME.pse. Check JWT config in STRUST.
Cause: The JWT configuration and service account key settings are not configured correctly in STRUST.
Resolution: Confirm that the JWT configuration and service account key are configured as explained in Authentication using JWT to obtain access tokens.
Issue: Bad Request invalid_grant. Invalid JWT Signature
Issue: You receive the error Bad Request invalid_grant. Invalid JWT Signature.
Cause: The PSE or P12 key file imported into STRUST does not belong to the service account that you used for signing the JWT.
Resolution: Make sure to import the correct service account key file into STRUST. For information about importing the service account key into STRUST, see Import the service account key into STRUST.
Issue: /GOOG/MSG : 400 - Bad Request invalid_grant Invalid grant: account not found
Issue: You are not able to connect to Google Cloud APIs.
Cause: The service account used for JWT signing is either incorrect or it does not have required permissions.
Resolution: Make sure that the service account that you specified for JWT-based token retrieval is correctly maintained against the parameter JWT_SERVC_ACCT in the table /GOOG/BQ_PARAM.
For more information, see Enable JWT signing for the service account on the SAP LT Replication Server host.
Issue: OAuth RFC HTTP Destination not maintained in /GOOG/SERVIC_MAP
Issue: You receive the error OAuth RFC HTTP Destination not maintained in /GOOG/SERVIC_MAP.
Cause: The RFC destination for OAuth 2.0 is not available in the service mapping table /GOOG/SERVIC_MAP.
Resolution: Update the RFC destination for OAuth 2.0 in the service mapping table /GOOG/SERVIC_MAP, and then rerun the load. For information about specifying RFC destinations, see Specify RFC destinations in /GOOG/SERVIC_MAP.
Issue: For non-English logon languages, field descriptions are garbled when you upload data using a CSV file
Issue: For non-English logon languages, when you upload BigQuery target table field types and field descriptions using the file upload option, the field descriptions from the CSV file are not uploaded accurately. You find garbled characters and symbols in the uploaded descriptions.
Cause: For non-English logon languages, the file upload utility is not able to interpret the characters in the CSV file accurately.
Resolution: To upload BigQuery target table field types and field descriptions in a non-English language, use the UTF-8 encoding format with Byte Order Mark (BOM) for your CSV file. Save the CSV file in UTF-8 with BOM format and then upload the file.
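If you generate the CSV file programmatically, the following is a minimal Python sketch that writes the file in UTF-8 with BOM by using Python's utf-8-sig codec, which prepends the BOM automatically. The file name, field names, and descriptions are placeholders.

```python
# Minimal sketch: write the field-mapping CSV in UTF-8 with a byte order
# mark (BOM) so that non-English descriptions upload correctly.
import csv

rows = [
    ["MATNR", "STRING", "Materialnummer"],        # example German description
    ["WERKS", "STRING", "Werk / Niederlassung"],
]

with open("field_mappings.csv", "w", newline="", encoding="utf-8-sig") as f:
    writer = csv.writer(f)
    writer.writerows(rows)
```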
Common operational issues
This section contains resolutions for common issues that can occur after the initial setup of BigQuery Connector for SAP.
Issue: Empty source tables in SAP are not created in the BigQuery dataset
Issue: Empty source tables in SAP are not created in the BigQuery dataset.
Cause: For empty source tables in SAP, SAP SLT prevents the creation of target tables in BigQuery.
Resolution: To create target tables in the BigQuery dataset for empty source tables in SAP, you can use the Create Table tool. For information about how to run the Create Table tool, see Create Table tool.
Issue: Incorrect number of writes in BigQuery
Issue: The number of records that are written to BigQuery is higher than the number of records that are shown in the SAP LT Replication Server logs.
Cause: This can have more than one cause, including transitory connection issues that cause SAP LT Replication Server to send records more than once or the fact that the BigQuery table accepts only inserts, and each change to a single record in the source is inserted as a separate entry in the target table.
Resolution: If the difference in record counts is not extreme and there are not fewer records in BigQuery than in the source table, this is expected behavior and not a problem.
To accurately reconcile the number of records in BigQuery with the number of records in the source table, query the BigQuery table as described in SQL queries for record counts.
For more information about possible causes for this issue, see Special considerations for ABAP sources/targets on HANA.
Issue: /GOOG/MSG : 400 - Schema mismatch for table TABLE_NAME
Issue: You receive the error message /GOOG/MSG : 400 - Schema mismatch for table TABLE_NAME. Please delete the table from BigQuery and try again.
Cause: One of the following changes was entered for an existing BigQuery table:
- Deletion of a field
- Renaming of a field
- Change in the data type of a field
- Change in the partition type of a table
The preceding changes cannot be applied to an existing BigQuery table.
Resolution: If you need to change any of these field attributes in an existing table, then you need to delete the existing table and reload the records into a new table.
If the change was a mistake, then back out the change in SAP LT Replication Server.
For more information about configuring fields and partitions in a target BigQuery table, see BigQuery replication configurations.
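Before you delete the existing table, it can help to see exactly which schema it currently has so that you can compare it against the field mappings in /GOOG/SLT_SETTINGS. The following is a minimal Python sketch that uses the BigQuery client library; the table reference is a placeholder.

```python
# Minimal sketch: print the current schema and partitioning of the
# BigQuery target table for comparison with the mass transfer settings.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
table = client.get_table("my-project.my_dataset.my_sap_table")

print(f"Partitioning: {table.time_partitioning}")
for field in table.schema:
    print(f"{field.name}: {field.field_type} ({field.mode})")
```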
Issue: Error messages related to invalid data
Issue: In the application logs, you receive the error message /GOOG/MSG/: DESCRIPTION_OF_INVALID_DATA error occurred in FIELD_NAME in record RECORD_KEYS.
Cause: This error message is issued by BigQuery when records that contain invalid data are inserted into the target table. The data might be invalid for one of the following reasons:
- The data in the field of a particular record is not compatible with the data type in BigQuery. For example, BigQuery generates error messages when:
  - A string is maintained in a field of type DATE, INTEGER, or BOOLEAN.
  - An invalid date (00/00/0000) is maintained in a field of type DATE.
- An incorrect target data type is maintained in the field mappings in transaction /GOOG/SLT_SETTINGS.
An error message is issued by BigQuery for each record that contains a field with invalid data.
Resolution: Analyze the error message, DESCRIPTION_OF_INVALID_DATA, to understand the possible cause of the invalid data. To identify the record with the field that contains the invalid data, use RECORD_KEYS, which includes the contents of the first five fields of the record. If the table has five fields or fewer, then the contents of all fields are included in RECORD_KEYS.
- If the data in the field is not compatible with the data type in BigQuery, then correct the data in the source table.
- If the error occurred due to a mismatch between the data and the data type, then use transaction /GOOG/SLT_SETTINGS to specify an appropriate data type. For more information about data type mapping, see Data type mapping, and see the validation sketch that follows this list.
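As a quick way to test whether a suspect value can resolve to its mapped BigQuery type, the following is a minimal Python sketch. The field names, sample values, and the accepted boolean representations are placeholders for illustration, not the connector's actual conversion rules.

```python
# Minimal sketch: check whether sample source values can resolve to the
# mapped BigQuery types before replication.
from datetime import datetime

def is_valid(value: str, bq_type: str) -> bool:
    try:
        if bq_type == "INTEGER":
            int(value)
        elif bq_type == "BOOLEAN":
            # Hypothetical accepted values, including the SAP-style 'X' flag.
            if value.strip().upper() not in ("TRUE", "FALSE", "X", ""):
                return False
        elif bq_type == "DATE":
            datetime.strptime(value, "%Y-%m-%d")  # rejects 0000-00-00 etc.
        return True
    except ValueError:
        return False

samples = {("ERDAT", "DATE"): "0000-00-00", ("MENGE", "INTEGER"): "12"}
for (field, bq_type), value in samples.items():
    print(f"{field} -> {bq_type}: {value!r} valid={is_valid(value, bq_type)}")
```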
Issue: The value of a field displayed in transaction SE16 or SE16N is different from the value that is shown in BigQuery
Issue: In some cases, the value of a field displayed in transaction SE16 or SE16N is different from the value that is shown in BigQuery.
Cause: In the SAP S/4HANA source system, tables such as MARD, MARC, MBEW, and MBEWH have compatibility views that display values in transaction SE16 or SE16N. For such tables, the values displayed in transaction SE16 or SE16N are calculated within the compatibility views using join conditions, which include several other underlying tables.
When a table with a compatibility view is configured in SLT for replication to BigQuery, SLT does not replicate the data from the compatibility view. Instead, SLT replicates data from each of the underlying tables separately, so some fields might have different values in SE16 or SE16N than what is shown in BigQuery. This is standard SLT behavior.
Resolution: To resolve the issue, complete the following steps:
- In the SAP GUI, enter transaction code LTRS.
. - Select the mass transfer settings for BigQuery replication.
- Go to Advanced Replication Settings > Table Settings.
- Select the table that uses compatibility views. If the table that you need isn't listed, then add the table.
In the Processing Settings section, enter values for the following fields:
- View for Initial Load: the compatibility view for the table, as displayed in SE16 or SE16N.
- View for Replication: the compatibility view for the table, as displayed in SE16 or SE16N.
Save the settings.
Alternatively, replicate all underlying tables of the compatibility view to BigQuery. In BigQuery, join these tables using the same join conditions as the compatibility view.
For more information from SAP about some well known tables and their compatibility views, see SAP Note 2595627 - Accessing table from SE16/SE16N shows different results to SAP HANA database.
Issue: Failed to create proxy table TARGET_TABLE_NAME for object SOURCE_TABLE_NAME
Issue: While transferring data from SAP HANA to BigQuery by using BigQuery Connector for SAP, table loading failed with the error message Failed to create proxy table TARGET_TABLE_NAME for object SOURCE_TABLE_NAME.
Cause: The number of characters in one or more fields of the source table exceeds the 30-character restriction set by SLT.
Resolution: Try the following resolutions:
- If the number of characters in the key fields of a table exceeds 30, then you cannot load or replicate such tables. This is a known limitation of SLT.
- If the number of characters in other fields of a table exceeds 30, then create a view to map the field names that exceed 30 characters to shorter field names.
For more information, see SAP Note 1768805 - SAP Landscape Transformation Replication Server (SLT): Non ABAP-based Sources.
Get support
If you need help resolving problems with replication and the BigQuery Connector for SAP, then collect all available diagnostic information and contact Cloud Customer Care.
For more information about contacting Cloud Customer Care, see Getting support for SAP on Google Cloud.