Quickstart Guide

This section includes two guides: one for running the connector locally in standalone mode using Apache Kafka, and another for running it on a Confluent Platform inside Docker.

Note

To use the full functionality of the connector, we recommend the Confluent Platform.

Kafka Connect standalone

Synopsis

This quickstart shows how to set up the Webservice Data Source Sink Connector against an existing SAP NetWeaver system and run it locally in Kafka Connect standalone mode using Apache Kafka. We will use an unencrypted communication channel with basic SAP username/password authentication. In productive environments, it is recommended to use an SAP® Secure Network Communication (SNC) setup with optional Single Sign-On (SSO).

Preliminary setup

  1. Download and extract Apache Kafka
  2. Copy the WebDS sink connector jar into the Kafka Connect plugins directory
  3. Get a copy of SAP JCo v3.1.8 (sapjco3.jar plus the native library, sapjco3.dll for Windows OS or sapjco3.so for Unix) and copy it to the plugins directory, as sketched below
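
A minimal sketch of these steps for a Unix-like shell (Kafka 2.0.0 for Scala 2.12 and the jar file names are examples only, adapt them to your actual downloads):

    # extract Apache Kafka and create the Kafka Connect plugins directory
    tar -xzf kafka_2.12-2.0.0.tgz
    mkdir kafka_2.12-2.0.0/plugins
    # copy the WebDS sink connector jar and SAP JCo (jar plus native library) into it
    cp init-kafka-connect-webds-x.x.x.jar kafka_2.12-2.0.0/plugins/
    cp sapjco3.jar sapjco3.so kafka_2.12-2.0.0/plugins/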

Configuration

  1. Edit the contents of the file <KAFKA_ROOT>/config/connect-standalone.properties like this:

    bootstrap.servers=localhost:9092
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable=true
    value.converter.schemas.enable=true
    offset.storage.file.filename=/tmp/connect.offsets
    offset.flush.interval.ms=10000
    plugin.path=/kafka_2.12-2.0.0/plugins
    
    Note

    Make sure that the plugin path exists.

  2. Extract the WebDS sink connector properties template etc/ohja-webds-sink-connector.properties from the connector package and copy it to <KAFKA_ROOT>/config/ohja-webds-sink-connector.properties
  3. Contact your administration team for the connection properties of your SAP NetWeaver installation and maintain the following minimum connector properties:

    # SAP Netweaver application host DNS or IP
    jco.client.ashost = 127.0.0.1
    # SAP system number
    jco.client.sysnr = 20
    # SAP client number
    jco.client.client = 100
    # SAP RFC user
    jco.client.user = user
    # SAP user password
    jco.client.passwd = password
    
    Note

    Make sure the SAP user has enough privileges to access RFC-enabled function modules and that the SAP NetWeaver Gateway is configured to accept RFC connections from your host.

  4. Maintain the input Kafka topic name and the following connector configs according to the webservice data source of your choice in SAP; the input topic itself can be created up front, as shown after this list:

    sap.webds#00.system=WEBDS
    sap.webds#00.name=ZTEST
    # Kafka input topic name
    sap.webds#00.topic=WEBDSTEST
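
Once Zookeeper and Kafka are running (see Execution below), you can optionally create the input topic explicitly instead of relying on topic auto-creation, e.g. for Windows OS type:

    bin\windows\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic WEBDSTEST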
    

Execution

  1. Start a local Zookeeper instance from the shell, e.g. for Windows OS type:

    cd <KAFKA_ROOT>
    set KAFKA_LOG4J_OPTS=-Dlog4j.configuration=file:<KAFKA_ROOT>/config/tools-log4j.properties
    bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties
    
  2. Start a local Kafka server instance from another shell, e.g. for Windows OS type:

    bin\windows\kafka-server-start.bat .\config\server.properties
    
  3. Start a local standalone Kafka Connect instance and execute the WebDS sink connector from another shell, e.g. for Windows OS type:

    bin\windows\connect-standalone.bat .\config\connect-standalone.properties .\config\ohja-webds-sink-connector.properties > log.txt 2>&1
    
    The logging output will be written to the file log.txt.
  4. Start a simple Kafka console producer from another shell, e.g. for Windows OS type:

    bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic WEBDSTEST
    
  5. Now you can enter messages that will be written to topic WEBDSTEST, the source topic for the connector. You only have to enter the value part of each message, e.g.

    {"schema":{"type":"struct","fields":[{"type":"string","optional":false,"doc":"ZCONTRACT","field":"_BIC_ZCONTRACT"},{"type":"int32","optional":true,"name":"org.apache.kafka.connect.data.Date","version":1,"doc":"ZKEYDATE","field":"_BIC_ZKEYDATE"},{"type":"bytes","optional":false,"name":"org.apache.kafka.connect.data.Decimal","version":1,"doc":"ZCONTRVAL","field":"_BIC_ZCONTRVAL"},{"type":"string","optional":false,"doc":"Currency Key","field":"CURRENCY"}],"name":"ES_DATA"},"payload":{"_BIC_ZCONTRACT":"100","_BIC_ZKEYDATE":16647,"_BIC_ZCONTRVAL":4923.8,"CURRENCY":"EUR"}}
    
  6. Switch to the SAP GUI and monitor the webservice data source to check for new data, see Monitoring. If no data arrives at SAP, check log.txt for errors or verify the topic contents as shown below.
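
To verify independently of the connector that your test messages actually reached the topic, you can consume them again with the Kafka console consumer from another shell, e.g. for Windows OS type:

    bin\windows\kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic WEBDSTEST --from-beginning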

Logging

Check the log output by opening the file log.txt in an editor of your choice, or for Windows OS just type:

type log.txt
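
Kafka Connect also exposes a REST interface, by default on port 8083 (this applies to standalone mode as well), which you can query for the connector and task status. The connector name is the name property maintained in ohja-webds-sink-connector.properties:

    curl http://localhost:8083/connectors
    curl http://localhost:8083/connectors/<CONNECTOR_NAME>/status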

Confluent Platform

Synopsis

This section shows how to launch the Webservice Data Source Sink Connector on a Confluent Platform running locally within a Docker environment.

Preliminary setup

  1. Make sure to have Docker Engine and Docker Compose installed on the machine where you want to run the Confluent Platform
    1. Docker Desktop, available for Mac and Windows, includes both
  2. Download and launch a ready-to-go Confluent Platform Docker image as described in the Confluent Platform Quickstart Guide; a quick health check is shown after this list
  3. Ensure that the machine the Confluent Platform is running on has a connection to the SAP system.
  4. Ensure that you have a licensed version of the SAP Java Connector 3.1 SDK available
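
After launching the platform, a quick way to verify that all services are up and to inspect the Kafka Connect logs (the service name connect matches the Confluent quickstart docker-compose.yml, as used further below):

    docker-compose ps
    docker-compose logs -f connect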

Connector Installation

The connector can either be installed manually or through the Confluent Hub Client.

Note

In both scenarios it is beneficial to use a volume to easily transfer the connector file into the Kafka Connect service container. If running Docker on a Windows machine, make sure to add a new system variable COMPOSE_CONVERT_WINDOWS_PATHS and set it to 1.

Manual Installation

  1. Unzip the connector package init-kafka-connect-webds-x.x.x.zip
  2. Get a copy of the SAP Java Connector v3.1.8 SDK and move it to the lib folder inside the unzipped connector folder. SAP JCo consists of sapjco3.jar and the native libraries, e.g. sapjco3.dll for Windows OS or sapjco3.so for Unix. Place the native library, e.g. sapjco3.so, next to sapjco3.jar
  3. Move the unzipped connector folder into the configured CONNECT_PLUGIN_PATH of the Kafka Connect service
  4. Within the directory where the docker-compose.yml of the Confluent Platform is located, you can start the Confluent Platform using Docker Compose (after startup you can verify that the plugin was loaded, as shown below)

    docker-compose up -d
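
    Once the services are up, you can check that the connector plugin was picked up by querying the Kafka Connect REST interface (port 8083, as mapped in the Confluent Platform quickstart setup):

    curl -s http://localhost:8083/connector-plugins | grep -i webds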
    

Confluent Hub Client

Install the zipped connector init-kafka-connect-webds-x.x.x.zip using the Confluent Hub Client from outside the Kafka Connect Docker container:

docker-compose exec connect confluent-hub install {PATH_TO_ZIPPED_CONNECTOR}/init-kafka-connect-webds-x.x.x.zip
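
Since Kafka Connect scans its plugin path only at startup, restart the Kafka Connect service after the installation so that the newly installed connector is loaded:

docker-compose restart connect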

Further information on the Confluent CLI can be found in the Confluent CLI Command Reference.

Configuration

The connector can be configured and launched using the control-center service of the Confluent Platform, or alternatively through the Kafka Connect REST interface as sketched after the following steps.

  1. In the control-center (default: localhost:9021) select a Connect Cluster in the Connect tab
  2. Click the button “Add connector” and select WebDSSinkConnector
  3. Enter a name and further required configuration for the connector
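
Alternatively to the Control Center UI, the connector can be created through the Kafka Connect REST interface. The following is a minimal sketch, assuming port 8083, the short class name WebDSSinkConnector (query localhost:8083/connector-plugins for the fully qualified class name), and the example values from the standalone guide above; the topics property is required by the Kafka Connect framework for every sink connector:

    curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
      "name": "webds-sink",
      "config": {
        "connector.class": "WebDSSinkConnector",
        "topics": "WEBDSTEST",
        "jco.client.ashost": "127.0.0.1",
        "jco.client.sysnr": "20",
        "jco.client.client": "100",
        "jco.client.user": "user",
        "jco.client.passwd": "password",
        "sap.webds#00.system": "WEBDS",
        "sap.webds#00.name": "ZTEST",
        "sap.webds#00.topic": "WEBDSTEST"
      }
    }'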

Hints

  • To prevent the SAP® user from being locked out during the connector configuration process, be cautious when entering the jco.client.user and jco.client.passwd properties in the JCo Destination configuration block: the control-center validates the configuration with every user input, which includes trying to establish a connection to the SAP® system
  • If you enter the properties in the JCo Destination configuration block first, value recommendations for some other properties will be loaded directly from the SAP source system
  • To display the advanced JCo configuration properties in the Confluent Control Center UI, you can set the configuration property Display advanced properties to 1.
  • Since you can push data to multiple webservice data sources in one connector instance, an additional Webservice Data Source configuration block will appear once you have provided the required information for the first one; see the example after this list
  • The number of configurable webservice data sources in the Confluent Control Center UI for one connector instance is restricted.
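
For example, a second webservice data source can be added to the same connector instance by incrementing the block index (the data source and topic names below are placeholders):

    sap.webds#00.system=WEBDS
    sap.webds#00.name=ZTEST
    sap.webds#00.topic=WEBDSTEST
    sap.webds#01.system=WEBDS
    sap.webds#01.name=ZTEST2
    sap.webds#01.topic=WEBDSTEST2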