Confluent Platform

Synopsis

This section shows how to launch the IDoc Connector on Confluent Platform running locally in a Docker environment.

Preliminary Setup

  1. Install Docker Engine and Docker Compose on the machine where you plan to run the Confluent Platform.
    1. If you’re using macOS or Windows, Docker Desktop includes both.
  2. Download and launch a ready-to-go Confluent Platform Docker image as described in the Confluent Platform Quickstart Guide.
  3. Make sure that the machine running the Confluent Platform is connected to the SAP® source.
  4. Ensure you have a licensed version of the SAP® Java Connector 3.1 SDK before proceeding.

Connector Installation

The connector can be installed either manually or through the Confluent Hub Client.

Note

In both scenarios, it is beneficial to use a volume so that the connector files can be transferred easily into the Kafka Connect service container. If you run Docker on a Windows machine, make sure to add a new system variable COMPOSE_CONVERT_WINDOWS_PATHS and set it to 1.
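
As a sketch, such a volume mapping in the docker-compose.yml could look like the fragment below. The service name connect and the target path /usr/share/confluent-hub-components are assumptions based on common Confluent compose files and images; adjust them to match your setup.

```yaml
# Sketch: mount a local plugin directory into the Kafka Connect container.
# Service name and target path are assumptions; adjust to your compose file.
services:
  connect:
    volumes:
      - ./connect-plugins:/usr/share/confluent-hub-components
```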

Manual Installation

To install the IDoc Connector, follow these steps:

  1. Unzip the downloaded file, init-kafka-connect-idoc-x.x.x.zip.
  2. Get a copy of the SAP® Java Connector v3.1.11 SDK and move it to the lib/ folder inside the unzipped connector folder. SAP® JCo consists of sapjco3.jar and a platform-specific native library, e.g. sapjco3.dll for Windows or sapjco3.so for Unix. Place the native library, e.g. sapjco3.so, next to sapjco3.jar.
  3. Get a copy of the SAP® Java IDoc library v3.1.3 (sapidoc3.jar) and copy it to the plugins directory.
  4. Move the unzipped connector folder into the configured CONNECT_PLUGIN_PATH of the Kafka Connect service.
  5. From the directory containing the docker-compose.yml of the Confluent Platform, start the platform using Docker Compose.

    docker-compose up -d
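
Steps 1–4 can be sketched as shell commands. The files below are placeholders standing in for the real downloads, and the connect-plugins directory stands in for your configured CONNECT_PLUGIN_PATH:

```shell
# Sketch of the manual installation steps with placeholder artifacts;
# substitute the real downloads and your CONNECT_PLUGIN_PATH.
set -e
WORKDIR=$(mktemp -d)
cd "$WORKDIR"
PLUGIN_PATH="$WORKDIR/connect-plugins"   # stands in for CONNECT_PLUGIN_PATH
mkdir -p "$PLUGIN_PATH"

# Placeholder files standing in for the unzipped connector and libraries:
mkdir -p init-kafka-connect-idoc-x.x.x/lib
touch sapjco3.jar sapjco3.so sapidoc3.jar

# Step 2: the SAP JCo jar plus the matching native library go into lib/
cp sapjco3.jar sapjco3.so init-kafka-connect-idoc-x.x.x/lib/

# Step 3: the SAP Java IDoc library goes into the plugins directory
cp sapidoc3.jar "$PLUGIN_PATH"/

# Step 4: move the connector folder into the configured plugin path
mv init-kafka-connect-idoc-x.x.x "$PLUGIN_PATH"/

ls "$PLUGIN_PATH"/init-kafka-connect-idoc-x.x.x/lib
```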
    

Confluent CLI

Install the zipped connector init-kafka-connect-idoc-x.x.x.zip using the Confluent Hub Client from outside the Kafka Connect Docker container.

    confluent connect plugin install {PATH_TO_ZIPPED_CONNECTOR}/init-kafka-connect-idoc-x.x.x.zip

Note

Further information on the Confluent CLI can be found in the Confluent CLI Command Reference.

Connector Configuration

The connector can be configured and launched using the control-center service of the Confluent Platform.

  1. In Control Center (default: localhost:9021) select a Connect cluster in the Connect tab.
    Choose a Connect Cluster.
  2. Click the “Add connector” button and select the IDocSourceConnector or IDocSinkConnector.
    Add a connector.
  3. Enter a unique name for the connector instance and provide any other required configuration.

IDoc Source Connector

To extract data from an SAP® source with the IDoc Source Connector, follow these steps:

  1. Enter the following minimal configuration properties in the Confluent Control Center user interface. Remember to include your license key.

    name = idoc-source-connector
    connector.class = org.init.ohja.kafka.connect.idoc.source.IDocSourceConnector
    tasks.max = 1
    sap.idoc.license.key = "Your license key here"
    sap.idoc#00.file.path = /usr/sap/data/out
    sap.idoc#00.exec-period = 900
    sap.idoc#00.topic = DEBMAS
    sap.idoc#00.file.type = 0
    sap.idoc#00.file.filter.regex = *.xml
    jco.client.ashost = 127.0.0.1
    jco.client.sysnr = 20
    jco.client.client = 100
    jco.client.user = user
    jco.client.passwd = password
    jco.client.lang = EN
    
  2. Launch the source connector.
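
The key = value properties above carry the same information as the JSON configuration used in the Starting a Connector via REST Call section. As a sketch, an abbreviated properties list can be converted into the corresponding JSON body; the values are the same placeholders, and a real configuration should contain all required entries, including the license key:

```shell
# Sketch: convert an abbreviated properties list into the JSON body used
# by the Kafka Connect REST API.
cat > source.properties <<'EOF'
name = idoc-source-connector
connector.class = org.init.ohja.kafka.connect.idoc.source.IDocSourceConnector
tasks.max = 1
sap.idoc#00.topic = DEBMAS
EOF

# python3 is used here only as a portable JSON serializer.
python3 - <<'EOF'
import json

config = {}
for line in open("source.properties"):
    key, _, value = line.partition("=")
    if key.strip():
        config[key.strip()] = value.strip()

body = {"name": config.pop("name"), "config": config}
with open("source.idoc.json", "w") as f:
    json.dump(body, f, indent=2)
print(json.dumps(body, indent=2))
EOF
```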

IDoc Sink Connector

To import data into the SAP® system using the IDoc Sink Connector, follow these steps:

  1. Enter the following properties in the Confluent Control Center user interface. Remember to include your license key.

    name = idoc-sink-connector
    connector.class = org.init.ohja.kafka.connect.idoc.sink.IDocSinkConnector
    tasks.max = 1
    sap.idoc.license.key = "Your license key here"
    topics = DEBMAS08
    # The remaining configs are specific to the IDoc Sink Connector.
    # The following configuration entries will be applied to every sink configured:
    # See SAP transaction WE20
    sap.idoc.sender.partner.number =
    sap.idoc.sender.partner.type =
    # See SAP transaction WE21 -> Transactional RFC
    sap.idoc.sender.port =
    # See SAP transaction WE20
    sap.idoc.receiver.partner.number =
    sap.idoc.receiver.partner.type =
    # SAP system identification SAP<SID>
    sap.idoc.receiver.port =
    sap.idoc#00.topic = DEBMAS08
    sap.idoc#00.basic-type = DEBMAS08
    sap.idoc#00.package.size = 1
    jco.client.ashost = 127.0.0.1
    jco.client.sysnr = 20
    jco.client.client = 100
    jco.client.user = user
    jco.client.passwd = password
    jco.client.lang = EN
    
  2. Launch the sink connector.
  3. To test the sink connector, you need sample data. You can import sample data into Kafka using the IDoc Source Connector.

Starting a Connector via REST Call

  1. Save the example configuration JSON file into a local directory, e.g. as source.idoc.json. Remember to include your license key.

    {
        "name": "idoc-source-connector",
        "config": {
            "connector.class": "org.init.ohja.kafka.connect.idoc.source.IDocSourceConnector",
            "sap.idoc.license.key": "Your license key here",
            "sap.idoc#00.file.path": "/usr/sap/data/out",
            "sap.idoc#00.exec-period": "900",
            "sap.idoc#00.topic": "DEBMAS",
            "sap.idoc#00.file.type": "0",
            "sap.idoc#00.file.filter.regex": "*.xml",
            "jco.client.ashost": "127.0.0.1",
            "jco.client.sysnr": "20",
            "jco.client.client": "100",
            "jco.client.user": "user",
            "jco.client.passwd": "password"
        }
    }
    
  2. Once the configuration JSON is prepared, start the connector by sending it to the Kafka Connect REST API. Use the following command to send a POST request:

    curl -X POST http://localhost:8083/connectors \
    -H "Content-Type:application/json" \
    -H "Accept:application/json" \
    -d @source.idoc.json
    
  3. If the connector starts successfully, the Kafka Connect REST API returns a JSON response with details about the connector, including its status and any potential errors. You can verify that the connector is running by checking its status:

    curl -X GET http://localhost:8083/connectors/idoc-source-connector/status
    

    This will return a JSON object indicating whether the connector is running, its tasks, and any associated errors.
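
The connector state can also be pulled out of that JSON programmatically. The payload below is an abbreviated, illustrative sample of the response shape (values such as the worker_id are made up); in practice, the input is the body returned by the GET request above:

```shell
# Illustrative (abbreviated) status payload standing in for the real response.
STATUS='{"name":"idoc-source-connector","connector":{"state":"RUNNING","worker_id":"connect:8083"},"tasks":[{"id":0,"state":"RUNNING","worker_id":"connect:8083"}],"type":"source"}'

# Extract the overall connector state; python3 serves as a portable JSON parser.
echo "$STATUS" | python3 -c 'import json, sys; print(json.load(sys.stdin)["connector"]["state"])'
# prints: RUNNING
```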

Additional Notes
  • Entering user credentials: To prevent SAP® user lockout during connector configuration, enter the jco.client.user and jco.client.passwd properties in the JCo Destination configuration block with caution: Control Center validates the configuration on every user input, which includes attempting to establish a connection to the SAP® system.
  • Get recommended values automatically: If you enter the properties in the JCo Destination configuration block first, value recommendations for some other properties will be loaded directly from the SAP® source system.
  • Display advanced JCo configuration properties: You can display the advanced JCo configuration properties in the Confluent Control Center UI by setting the configuration property Display advanced properties to 1.
  • Configure multiple IDoc sources: Since one connector instance can push data to multiple IDoc sources, an additional IDoc sources configuration block appears once you have provided the required information for the first one.
  • Selection limits in Confluent Control Center UI: There’s a limit to the number of IDoc sources that can be configured in the Confluent Control Center UI for one connector instance. If you need to configure additional sources beyond this limit, you can do so in the UI without any recommendations by using the Additional Properties section. The same applies to the number of selection conditions in full initialization mode.