Quickstart guide

This section contains two guides: one for running the connector locally in standalone mode using Apache Kafka, and one for operating it on a Confluent Platform inside Docker.

Note

To use the full functionality of the connector, we recommend the Confluent Platform.

Kafka Connect standalone

Synopsis

This quickstart shows how to set up the ODP source connector against an existing SAP NetWeaver® system and run it locally in Kafka Connect standalone mode using Apache Kafka. We will use an unencrypted communication channel with basic SAP® username/password authentication. In production environments, it is recommended to use an SAP® Secure Network Communication (SNC) setup with optional Single Sign-On (SSO).

Preliminary setup

  1. Download and extract Apache Kafka
  2. Copy the ODP source connector jar into the Kafka Connect plugins directory
  3. Get a copy of SAP® JCo v3.1.8 (sapjco3.jar and the native library sapjco3.dll or sapjco3.so) and copy both files to the plugins directory (see the layout sketch after this list)
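
After these steps the plugins directory should contain the connector jar next to the SAP® JCo files. A rough sketch of the expected layout, with the directory name and version numbers as examples only:

    /kafka_2.12-2.0.0/plugins
        init-kafka-connect-odp-x.x.x.jar
        sapjco3.jar
        sapjco3.dll (Windows) or sapjco3.so (Unix)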

Configuration

  1. Edit the contents of the file /config/connect-standalone.properties as follows:

    bootstrap.servers=localhost:9092
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable=true
    value.converter.schemas.enable=true
    offset.storage.file.filename=/tmp/connect.offsets
    offset.flush.interval.ms=10000
    plugin.path=/kafka_2.12-2.0.0/plugins
    
    Note

    Make sure that the plugin path exists.

  2. Extract the properties template for the ODP source connector, which is located at etc/ohja-odp-source-connector.properties within the connector package. Then, copy this template to /config/ohja-odp-source-connector.properties.
  3. Contact your administration team for the connection properties of your SAP NetWeaver® installation and maintain the following minimum connector properties:

    # SAP NetWeaver application host DNS or IP
    jco.client.ashost = 127.0.0.1
    # SAP system number
    jco.client.sysnr = 20
    # SAP client number
    jco.client.client = 100
    # SAP RFC user
    jco.client.user = user
    # SAP user password
    jco.client.passwd = password
    
    Note

    Make sure the SAP® user has enough privileges to access RFC-enabled function modules and the SAP NetWeaver® Gateway is configured to accept RFC connections from your host.

  4. Maintain the output Kafka topic name and the following connector configurations according to the delta-enabled ODP data source of your choice in SAP® (a complete example properties file is sketched after this list):

    sap.odp#00.name = Test
    sap.odp#00.context = SAPI
    # Delta-Initialization without data transfer (0=false,1=true(default))
    sap.odp#00.init-simulation = 0
    # Kafka output topic name
    sap.odp#00.topic = ODPSAPITEST
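
Putting steps 3 and 4 together, a minimal /config/ohja-odp-source-connector.properties could look like the sketch below. The connector.class value and the tasks.max setting are assumptions based on common Kafka Connect conventions; check the properties template shipped with the connector for the exact class name.

    name = ohja-odp-source-connector
    # exact connector class name to be taken from the shipped template (assumption)
    connector.class = ODPSourceConnector
    tasks.max = 1
    jco.client.ashost = 127.0.0.1
    jco.client.sysnr = 20
    jco.client.client = 100
    jco.client.user = user
    jco.client.passwd = password
    sap.odp#00.name = Test
    sap.odp#00.context = SAPI
    sap.odp#00.init-simulation = 0
    sap.odp#00.topic = ODPSAPITEST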
    

Execution

  1. Start a local ZooKeeper instance from a shell, e.g. on Windows type:

    cd <KAFKA_ROOT>
    set KAFKA_LOG4J_OPTS=-Dlog4j.configuration=file:<KAFKA_ROOT>/config/tools-log4j.properties
    bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties
    
  2. Start a local Kafka server instance from a separate shell, e.g. on Windows type:

    bin\windows\kafka-server-start.bat .\config\server.properties
    
  3. Start a simple Kafka consumer from a separate shell, e.g. on Windows type:

    bin\windows\kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic ODPSAPITEST --from-beginning
    
  4. Start a local standalone Kafka Connect instance and run the ODP source connector from a separate shell, e.g. on Windows type:

    bin\windows\connect-standalone.bat .\config\connect-standalone.properties .\config\ohja-odp-source-connector.properties > log.txt 2>&1
    
    The log output will be written to the file log.txt.

Switch to the Kafka consumer shell to view the JSON representation of the ODP data messages along with their schema. If everything is set up correctly, the messages are printed to the console in JSON format.
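
For illustration, with schemas enabled the JsonConverter wraps every record in a schema/payload envelope similar to the sketch below; the field names and values are purely hypothetical and depend on the selected ODP source:

    {
      "schema": {
        "type": "struct",
        "fields": [
          {"type": "string", "optional": true, "field": "FIELD1"},
          {"type": "int32", "optional": true, "field": "FIELD2"}
        ],
        "optional": false
      },
      "payload": {
        "FIELD1": "some value",
        "FIELD2": 42
      }
    }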

Logging

Check the log output by opening log.txt in an editor of your choice, or on Windows simply type:

type log.txt
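
On Unix-like systems you can filter the log for problems instead, e.g.:

    grep -i error log.txt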

Confluent Platform

Synopsis

This section shows how to launch the ODP source connector on a Confluent Platform running locally within a Docker environment.

Preliminary setup

  1. Install Docker Engine and Docker Compose on the machine where you plan to run the Confluent Platform (a quick version check is shown after this list).
    1. If you’re using a Mac or Windows, Docker Desktop includes both.
  2. Download and launch a ready-to-go Confluent Platform Docker image as described in the Confluent Platform Quickstart Guide.
  3. Make sure that the machine running the Confluent Platform is connected to the SAP® source.
  4. Make sure you have a licensed version of the SAP® Java Connector 3.1 SDK before proceeding.
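
Before moving on, you can verify from a shell that the tools required above are available:

    docker --version
    docker-compose --version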

Connector Installation

The connector can either be installed manually or through the Confluent Hub Client.

Note

In both scenarios, it is beneficial to use a volume to easily transfer the connector file into the Kafka Connect service container. If running Docker on a Windows machine, make sure to add a new system variable COMPOSE_CONVERT_WINDOWS_PATHS and set it to 1.
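
For illustration, the relevant part of such a volume mapping for the Kafka Connect service in the docker-compose.yml could look like the sketch below; the service name connect, the host folder ./plugins and the container path /usr/share/confluent-hub-components are assumptions and have to match the CONNECT_PLUGIN_PATH of your setup:

    services:
      connect:
        # image, environment and ports as provided by the Confluent Platform quickstart
        volumes:
          # host folder containing the unzipped connector (assumed to be ./plugins)
          - ./plugins:/usr/share/confluent-hub-components/init-kafka-connect-odp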

Manual Installation

To install the ODP Source Connector, follow these steps:

  1. Unzip the downloaded file, init-kafka-connect-odp-x.x.x.zip.
  2. Get a copy of the SAP® Java Connector v3.1.8 SDK and move it to the lib folder inside the unzipped connector folder. SAP® JCo consists of sapjco3.jar and a native library such as sapjco3.dll for Windows or sapjco3.so for Unix; place the native library next to sapjco3.jar.
  3. Move the unzipped connector folder into the configured CONNECT_PLUGIN_PATH of the Kafka Connect service.
  4. From the directory where the docker-compose.yml of the Confluent Platform is located, start the Confluent Platform using Docker Compose (a check that the plugin was loaded is shown after this step):

    docker-compose up -d
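
Once the services are running, you can verify that Kafka Connect picked up the plugin by querying the Kafka Connect REST API; port 8083 is the usual default of the Confluent Platform quickstart setup and may differ in your environment:

    curl -s http://localhost:8083/connector-plugins | grep -i odp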
    

Confluent CLI

Install the zipped connector init-kafka-connect-odp-x.x.x.zip using the Confluent Hub Client from outside the Kafka Connect Docker container:

docker-compose exec connect confluent-hub install {PATH_TO_ZIPPED_CONNECTOR}/init-kafka-connect-odp-x.x.x.zip
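
Depending on the setup, the Kafka Connect worker may need to be restarted so that the newly installed plugin is picked up, e.g.:

    docker-compose restart connect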

Further information on the Confluent CLI can be found in the Confluent CLI Command Reference.

Configuration

The connector can be configured and launched using the control-center service of the Confluent Platform.

  1. In the control-center (default: localhost:9021), select a Connect cluster in the Connect tab.
  2. Click the “Add connector” button and select the ODPSourceConnector.
  3. Enter a unique name for the connector instance and provide any other required configuration.
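
As an alternative to the Control Center UI, a connector instance can also be created via the standard Kafka Connect REST API. The following is only a sketch: the short connector class name, port 8083 and the property values are assumptions and have to be adapted to your installation (use the fully qualified class name from the properties template if the short alias is not accepted):

    curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
      "name": "odp-source-quickstart",
      "config": {
        "connector.class": "ODPSourceConnector",
        "tasks.max": "1",
        "jco.client.ashost": "127.0.0.1",
        "jco.client.sysnr": "20",
        "jco.client.client": "100",
        "jco.client.user": "user",
        "jco.client.passwd": "password",
        "sap.odp#00.name": "Test",
        "sap.odp#00.context": "SAPI",
        "sap.odp#00.topic": "ODPSAPITEST"
      }
    }'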

Hints

  • To prevent SAP® user lockout during the connector configuration process, enter the jco.client.user and jco.client.passwd properties in the JCo Destination configuration block with caution: the control-center validates the configuration on every user input, which includes trying to establish a connection to the SAP® system.
  • If you enter the properties in the JCo Destination configuration block first, value recommendations for some other properties will be loaded directly from the SAP® source system.
  • You can display the advanced JCo configuration properties in the Confluent Control Center UI by setting the configuration property Display advanced properties to 1.
  • Since one connector instance can extract data from multiple ODP sources, an additional ODP sources configuration block will appear once you have provided the required information for the first one (see the example after this list).
  • There is a limit to the number of ODP sources that can be configured for one connector instance in the Confluent Control Center UI. If you need to configure sources beyond this limit, you can still add them in the Additional Properties section of the UI, albeit without value recommendations. The same applies to the number of selection conditions in full initialization mode.
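
For example, based on the sap.odp#00 naming pattern used in the standalone quickstart, a second ODP source could be added with properties like the following (the incrementing #01 index and the values are assumptions):

    sap.odp#01.name = Test2
    sap.odp#01.context = SAPI
    sap.odp#01.topic = ODPSAPITEST2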