Data Mapping task



The Data Mapping task lets you perform variable assignments in your integration, get and set properties of JSON objects, and apply nested transform functions to values. The variables can be integration variables or task variables.

For example, you can assign values from an integration variable X to a task variable Y or from a task variable Y to an integration variable X. For more information about variables in Apigee Integration, see Using variables in Apigee Integration.

An example of data mapping is shown in the following image:

image showing data mapping of string variables
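For instance, a mapping row that uppercases one string variable and writes the result to another might look like the following, where firstName and taskFirstName are hypothetical variable names and the exact rendering in the editor can differ:

Input: firstName.TO_UPPERCASE()
Output: taskFirstName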

Configure the Data Mapping task

To configure a Data Mapping task, perform the following steps:

  1. In the Apigee UI, select your Apigee Organization.
  2. Click Develop > Integrations.
  3. Select an existing integration or create a new integration by clicking CREATE INTEGRATION.

    If you are creating a new integration:

    1. Enter a name and description in the Create Integration dialog.
    2. Select a Region for the integration from the list of supported regions.
    3. Click Create.

    This opens the integration in the integration designer.

  4. In the integration designer navigation bar, click +Add a task/trigger > Tasks to view the list of available tasks.
  5. Click and place the Data Mapping element in the integration designer.
  6. Click the Data Mapping element on the designer to view the Data Mapping task configuration pane.
  7. Click Open Data Mapping Editor to view the default values of the task and to add a new mapping. For detailed steps about adding a mapping, see Add a mapping.

    The following image shows a sample layout of the Data Mapping Editor. For more information about the data mapping editor, see Data mapping overview.

    image showing data mapping editor

Add a mapping

To add a mapping, perform the following steps:

  1. In the Data Mapping task configuration pane, click Open Data Mapping Editor.
  2. Configure the Input and Output fields:
    1. Drag a variable or its subfield from the Variables list into the Input field. You can click (Expand) to expand each variable in the Variables list and view the available subfields of that variable.
    2. Alternatively, click Variable or Value to add a variable, an initial value, or a function into the Input field.
      • Select Variable to search and use an existing variable. To create a new variable, click + Add new variable, and enter the name and data type of the new variable.
      • Select Value to enter an initial value (a literal or reference value of type string, integer, double, or Boolean).
      • Select Function to search and use a base function.

        A base function retrieves or generates values during the execution of an integration, for example, generating a random UUID or retrieving the current integration region. For information about the supported base functions, see Supported base functions.

      • Click Save.
    3. Click + (Add a function) on any input variable, value, or base function in the Input field to add a mapping function from the list of available mapping functions.

      A mapping function can have one or more parameters. Each parameter can itself take a value, a variable, or a base function, followed by a chain of mapping functions (nested functions). Click + (Add a function) next to a function parameter to add a nested function. Similarly, to remove the most recently added function, click - (Delete previous function). For an illustrative chained expression, see the example that follows these steps.

      After you define a data mapping function, a colored indicator appears next to each function parameter. The color indicates the resultant data type of that parameter. To learn what each color indicates, see Format of an integration variable.

      For information about the supported mapping functions, see Supported data types and mapping functions.

    4. Drag a variable from the Variables list into the Output field. If the variable is not available, then click create a new one to configure the name and data type of the new variable. Optionally, you can click the output variable and select whether to use that variable as an output of the integration, or as an input to another integration.
  3. To remove a variable, click (Clear) on the variable field.
  4. To delete a mapping row, click (Delete this mapping).
  5. Close the Data Mapping Editor once your mapping is complete. Any changes will be autosaved.
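As an illustration of function chaining, the following mapping expression takes a hypothetical JSON variable named orderJson, extracts a property, converts it to a string, and uppercases the result; each function in the chain operates on the output of the previous one, and the exact parameter syntax for each function is described in the Data Mapping Functions Reference:

Input: orderJson.GET_PROPERTY("customerName").TO_STRING().TO_UPPERCASE()
Output: customerNameUpper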

The completed data mapping is available to view from the Data Mapping task configuration pane, as shown in the following image:

image showing data mapping editor

Supported data types and mapping functions

Apigee Integration supports the following data types for variables in the Data Mapping task:

  • Strings and String arrays
  • Integers and Integer arrays
  • Doubles and Double arrays
  • Booleans and Boolean arrays
  • JSON
The following list shows the data mapping functions available for each data type:

  • Boolean: AND, EQUALS, NAND, NOR, NOT, OR, TO_JSON, TO_STRING, XNOR, XOR
  • Boolean array: APPEND, APPEND_ALL, CONTAINS, FILTER, FOR_EACH, GET, REMOVE, REMOVE_AT, SET, SIZE, TO_JSON, TO_SET
  • Double: ADD, DIVIDE, CEIL, EQUALS, EXPONENT, GREATER_THAN, GREATER_THAN_EQUAL_TO, FLOOR, LESS_THAN, LESS_THAN_EQUAL_TO, MOD, MULTIPLY, ROUND, SUBTRACT, TO_JSON, TO_STRING
  • Double array: APPEND, APPEND_ALL, AVG, CONTAINS, FILTER, FOR_EACH, GET, MAX, MIN, REMOVE, REMOVE_AT, SET, SIZE, SUM, TO_JSON, TO_SET
  • Integer: ADD, DIVIDE, EPOCH_TO_HUMAN_READABLE_TIME, EQUALS, EXPONENT, GREATER_THAN, GREATER_THAN_EQUAL_TO, LESS_THAN, LESS_THAN_EQUAL_TO, MOD, MULTIPLY, SUBTRACT, TO_DOUBLE, TO_JSON, TO_STRING
  • Integer array: APPEND, APPEND_ALL, AVG, CONTAINS, FILTER, FOR_EACH, GET, MAX, MIN, REMOVE, REMOVE_AT, SET, SIZE, SUM, TO_JSON, TO_SET
  • JSON: APPEND_ELEMENT, FLATTEN, FILTER, FOR_EACH, GET_ELEMENT, GET_PROPERTY, MERGE, REMOVE_PROPERTY, RESOLVE_TEMPLATE, SET_PROPERTY, SIZE, TO_BOOLEAN, TO_BOOLEAN_ARRAY, TO_DOUBLE, TO_DOUBLE_ARRAY, TO_INT, TO_INT_ARRAY, TO_STRING, TO_STRING_ARRAY
  • String: CONCAT, CONTAINS, EQUALS, EQUALS_IGNORE_CASE, LENGTH, REPLACE_ALL, RESOLVE_TEMPLATE, SPLIT, SUBSTRING, TO_BASE_64, TO_BOOLEAN, TO_DOUBLE, TO_INT, TO_JSON, TO_LOWERCASE, TO_UPPERCASE
  • String array: APPEND, APPEND_ALL, CONTAINS, FILTER, FOR_EACH, GET, REMOVE, REMOVE_AT, SET, SIZE, TO_JSON, TO_SET
For more information about each of the data mapping functions, see Data Mapping Functions Reference.
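For example, the following illustrative expressions apply some of the listed functions to hypothetical variables of the corresponding data types (these particular functions take no parameters):

itemPrices.AVG()           (Double array)
isActive.NOT()             (Boolean)
userName.TO_UPPERCASE()    (String)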

Supported base functions

The following list shows the available data mapping base functions for each data type:

  • Integer: NOW_IN_MILLIS
  • Integer array: INT_LIST
  • String: GENERATE_UUID, GET_EXECUTION_ID, GET_INTEGRATION_NAME, GET_INTEGRATION_REGION, GET_PROJECT_ID
For more information about each of the data mapping base functions, see Data Mapping Functions Reference.
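For example, an input that is built from a base function might look like one of the following illustrative expressions. GENERATE_UUID produces a random UUID string, and NOW_IN_MILLIS produces the current time in milliseconds as an integer, which can then be transformed further with mapping functions:

GENERATE_UUID()
NOW_IN_MILLIS().TO_STRING()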

Retry on failure

You can configure various retry strategies to handle errors in a task. The retry strategies allow you to specify how to rerun the task or integration in case of an error. For more information, see Error handling strategies.

Best practices

Using the Data Mapping task can be a powerful way to transform and pass key variables to various tasks in your integration. Here are a few tips to keep in mind as you build your integration:

  • Mappings run in sequence from top to bottom. For example, if input variable A is mapped to output variable B in the first row, B is available as an input in subsequent rows.
  • In each row, the data type of the Input field must match the data type of the Output field. To cast between types, use transformation functions such as TO_STRING and TO_INT, as shown in the example after this list.
  • There is no limit on the length of a transformation chain. However, debugging large chained transformations can be difficult, so keep input transformations readable and split complex transformations into multiple mappings.
  • Set a fallback value for a mapping if the input value or transformation can return null; otherwise, the mapping returns an error.
  • When deleting a variable, make sure to delete any mapping that contains it.
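For example, if an integer input is mapped to a string output (or the reverse), casting the input keeps the data types of the row aligned; the variable names below are illustrative:

retryCount.TO_STRING()    (Integer input mapped to a String output)
userAge.TO_INT()          (String input mapped to an Integer output)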

Considerations

For JSON variables that are based on a JSON schema, Apigee Integration interprets a child property of the variable as a JSON type in the following cases:

  • If the child property's type specifies null. For example:
    {
      "properties": {
        "PersonID": {
          "type": [
            "number",
            "null"
          ],
          "readOnly": false
        }
      }
    }
    
  • If the child property's type specifies multiple data types. For example:
    {
      "properties": {
        "PersonID": {
          "type": [
            "integer",
            "string",
            "number"
          ],
          "readOnly": false
        }
      }
    }
    

In these cases, you must use the data mapping functions to explicitly convert the child variables to the desired type.

The following examples illustrate the various types of property declarations and show how to use the data mapping functions to get the desired type:

Example 1

{
  "type": "object",
  "properties": {
    "dbIntegers": {
      "type": "array",
      "items": {
        "type": [
          "integer",
          "null"
        ]
      }
    },
    "dbBooleans": {
      "type": [
        "array"
      ],
      "items": {
        "type": "boolean"
      }
    }
  }
}
Apigee Integration interprets dbIntegers as a JSON type, and dbBooleans as a BOOLEAN_ARRAY type.

To convert dbIntegers to an integer array, use:

dbIntegers.TO_INT_ARRAY()

Example 2

{
  "type": "object",
  "properties": {
    "dbId": {
      "type": [
        "number",
        "null"
      ],
      "readOnly": false
    },
    "dbStatus": {
      "type": [
        "boolean",
        "null"
      ],
      "readOnly": false
    }
  }
}
Apigee Integration recognizes both dbId and dbStatus as JSON types. However, dbId can take a single double value or a null value, and dbStatus can take a single boolean value or a null value.

To convert dbId and dbStatus, use:

dbId.TO_DOUBLE()
dbStatus.TO_BOOLEAN()

Example 3

{
  "type": "object",
  "properties": {
    "dbString": {
      "type": [
        "string"
      ],
      "readOnly": false
    }
  }
}
Apigee Integration recognizes dbString as a string type, so no explicit conversion is required.