Connect to an SAP Ariba Batch Source

This page describes how to connect your data pipeline to an SAP Ariba Source and a BigQuery Sink. You can configure and execute bulk data transfers from Ariba without any coding using the SAP Ariba Batch Source plugin from the Cloud Data Fusion Hub.

The plugin extracts data from the reporting facts provided in the SAP Ariba Source. Each fact corresponds to an SAP Ariba document type. Facts are exposed as view templates, which are accessed through the Analytical Reporting API.

For more information, see the SAP Ariba Batch Source reference.

Before you begin

  • Create an instance in Cloud Data Fusion version 6.5.1 or later. If your instance uses an earlier version, upgrade your Cloud Data Fusion environment.

  • An SAP Ariba user must do the following:

    • Create an application and generate the OAuth credentials.
    • Grant access to the Analytical Reporting API in the Ariba developer portal.
  • Retrieve the name of the reporting view template from the SAP Ariba Analytical Reporting - View Management API by sending a GET request (see the sketch after this list). See Identifying Analytical Reporting API view templates.

  • Optional: To prevent pipeline failures due to rate limits, identify the expected record count. The plugin extracts data from facts and dimensions through SAP Ariba's Analytical Reporting API, where rate limits apply. For more information, see Manage rate limits.
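
If it helps to script the view template lookup, the following Python sketch shows one way to obtain an OAuth access token and send the GET request that lists view templates. The endpoint URLs, header names, and parameter names are assumptions based on common SAP Ariba developer portal conventions, not values confirmed by this page; verify them against the API documentation for your realm.

```python
"""Minimal sketch: list Analytical Reporting view templates.

Endpoint paths, parameter names, and header names are assumptions;
confirm them against your realm's SAP Ariba API documentation.
"""
import requests

TOKEN_URL = "https://api.ariba.com/v2/oauth/token"  # assumed OAuth token endpoint
VIEW_MGMT_URL = (
    "https://openapi.ariba.com/api/analytics-reporting-view/"
    "v1/prod/viewTemplates"  # assumed View Management API path
)
CLIENT_ID = "your-oauth-client-id"          # from the application you created
CLIENT_SECRET = "your-oauth-client-secret"
API_KEY = "your-application-key"            # issued with the developer portal application
REALM = "your-realm"                        # your SAP Ariba realm name


def get_token() -> str:
    """Request an access token with the OAuth client-credentials grant."""
    resp = requests.post(
        TOKEN_URL,
        auth=(CLIENT_ID, CLIENT_SECRET),    # HTTP Basic auth with the OAuth credentials
        data={"grant_type": "client_credentials"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def list_view_templates() -> dict:
    """Send the GET request that returns the reporting view templates."""
    resp = requests.get(
        VIEW_MGMT_URL,
        params={"realm": REALM},
        headers={
            "Authorization": f"Bearer {get_token()}",
            "apiKey": API_KEY,              # header name is an assumption
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    print(list_view_templates())
```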

Deploy and configure the plugin

  1. Deploy the SAP Ariba Batch Source plugin from the Hub's SAP tab. For more information, see Deploy a plugin from the Hub.

  2. On the Cloud Data Fusion Studio page, open your pipeline and select Data Pipeline - Batch. The plugin doesn't support real-time pipelines.

  3. In the source menu, click SAP Ariba. The SAP Ariba Batch Source tile appears in the pipeline.

  4. Go to the tile and click Properties. An Ariba Properties window opens.

  5. Configure the properties. For a sense of the values involved, see the sketch after these steps.

  6. Click Validate and resolve any errors.

  7. Click Close.
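
As a reference point, the following sketch shows the kind of values the Ariba Properties window expects, expressed as a Python dict like you might find in an exported pipeline definition. The property names and sample values are illustrative assumptions, not the plugin's authoritative schema; the SAP Ariba Batch Source reference lists the exact fields.

```python
# Illustrative only: property names and values are assumptions based on
# the fields described in the SAP Ariba Batch Source reference.
ariba_source_properties = {
    "referenceName": "SAPAribaSource",             # identifies the source for lineage
    "baseURL": "https://openapi.ariba.com",        # Ariba API endpoint for your environment
    "systemType": "prod",                          # production or test realm
    "realm": "your-realm",                         # SAP Ariba realm name
    "viewTemplateName": "RequisitionLineItemFact", # hypothetical view template name
    "clientId": "your-oauth-client-id",
    "clientSecret": "your-oauth-client-secret",
    "apiKey": "your-application-key",
    "fromDate": "2023-01-01T00:00:00Z",            # optional date-range filters
    "toDate": "2023-01-31T00:00:00Z",
}
```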

Optional: Connect the plugin to a BigQuery Sink

  1. On the Cloud Data Fusion Studio page, go to the Sink menu and click BigQuery.

    The BigQuery Sink tile appears in the pipeline.

  2. Configure the sink's required properties (see the sketch after these steps).

  3. Click Validate and resolve any errors.

  4. Click Close.
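
For orientation, a minimal sink configuration might look like the following. This is a hedged sketch: the property names follow common Cloud Data Fusion BigQuery sink fields, and the dataset and table values are hypothetical; verify the required fields in the sink's Properties window.

```python
# Minimal BigQuery sink configuration sketch; names and values are
# assumptions to illustrate the shape of the configuration.
bigquery_sink_properties = {
    "referenceName": "BigQuerySink",
    "project": "your-gcp-project-id",    # often defaults to the instance's project
    "dataset": "ariba_reporting",        # hypothetical dataset name
    "table": "requisition_line_items",   # hypothetical table name
}
```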

Optional: Manage rate limits

To check the record count for a specific date range in SAP Ariba, see Date-related filters for the Analytical Reporting API.
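
One way to gauge how many records (and therefore API calls) a run would need is to query the view with date-related filters before running the pipeline. In the sketch below, the filter names createdDateFrom and createdDateTo follow SAP Ariba's documented date-related filters, but the endpoint path, the view template name, and the TotalRecords response field are hypothetical placeholders; check your API's actual request and response shape.

```python
# Sketch: query a view with date-related filters to estimate record count.
import json

import requests

VIEW_URL = (
    "https://openapi.ariba.com/api/analytics-reporting-details/"
    "v1/prod/views/RequisitionLineItemFact"  # assumed path, hypothetical template
)


def count_records(token: str, api_key: str, realm: str) -> int:
    """Return the record count for a date range (field name is assumed)."""
    filters = {
        "createdDateFrom": "2023-01-01T00:00:00Z",
        "createdDateTo": "2023-01-31T00:00:00Z",
    }
    resp = requests.get(
        VIEW_URL,
        params={"realm": realm, "filters": json.dumps(filters)},
        headers={"Authorization": f"Bearer {token}", "apiKey": api_key},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("TotalRecords", 0)  # hypothetical field name
```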

For more information, see Limits for the plugin.

The following examples describe ways to troubleshoot rate-limit issues; the sketch after them shows the underlying arithmetic.

Extract data from one view template for a specific date range:

  • Pipeline 1: 2,020,000 records, 41 API calls. Remaining daily limit: -1 of 40.
    Troubleshooting: the API calls required for this date range and record count exceed the daily limit (40). To reduce the number of calls, select a smaller date range that decreases the record count.

Extract data from multiple view templates for a specific date range:

  • Pipeline 1: 50,001 records, 2 API calls. Remaining daily limit: 38 of 40.
  • Pipeline 2: 100,000 records, 2 API calls. Remaining daily limit: 36 of 40.
  • Pipeline 3: 100 records, 1 API call. Remaining daily limit: 35 of 40.
  • Pipeline 4: 1,000,000 records, 20 API calls. Remaining daily limit: 15 of 40.
  • Pipeline 5: 500,000 records, 10 API calls. Remaining daily limit: 5 of 40.
  • Pipeline 6: 500,000 records, 10 API calls. Remaining daily limit: -5 of 40.
    Troubleshooting: Pipeline 6 exceeds the daily limit for API calls. To prevent errors, run the extraction a day later or change the date range.
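
In these examples, each API call returns up to 50,000 records and the daily limit is 40 calls; both figures are inferred from the numbers above rather than stated here, so confirm them against the plugin's documented limits. A minimal sketch of the arithmetic, assuming those values:

```python
import math

PAGE_SIZE = 50_000   # records per API call, inferred from the examples above
DAILY_LIMIT = 40     # daily API-call limit, inferred from the examples above


def required_calls(record_count: int) -> int:
    """API calls needed to page through record_count records."""
    return math.ceil(record_count / PAGE_SIZE)


# Walk the multiple-view-template example and track the remaining limit.
remaining = DAILY_LIMIT
for pipeline, records in enumerate(
    [50_001, 100_000, 100, 1_000_000, 500_000, 500_000], start=1
):
    remaining -= required_calls(records)
    print(f"Pipeline {pipeline}: {records:,} records -> {remaining} of {DAILY_LIMIT} left")
# Pipeline 6 ends at -5 of 40: it exceeds the daily limit, so run it a day
# later or narrow its date range.
```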

What's next