Kafka Connect Standalone

Synopsis

This quickstart shows how to set up the aDSO Sink Connector against an existing SAP NetWeaver® system and run it locally in Kafka Connect standalone mode using Apache Kafka. We will use an unencrypted communication channel with basic SAP® username/password authentication. In productive environments, it is recommended to use an SAP Secure Network Communication (SNC) setup with optional Single Sign-On (SSO).
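For reference, an SNC setup is configured through additional JCo client properties along the following lines. This is only a sketch with placeholder values; the property names are standard SAP® JCo destination parameters, and your SAP® administration team provides the actual SNC names and library path:

    # enable Secure Network Communication (0 = off, 1 = on)
    jco.client.snc_mode = 1
    # quality of protection (1-9, 9 = maximum)
    jco.client.snc_qop = 9
    # SNC name of the SAP(R) application server (placeholder value)
    jco.client.snc_partnername = p:CN=SAP01, O=MyCompany, C=DE
    # path to the SNC library, e.g. the SAP(R) Cryptographic Library (placeholder value)
    jco.client.snc_lib = /usr/sap/snc/libsapcrypto.so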

Preliminary Setup

  1. Download and extract Apache Kafka.
  2. Copy the aDSO sink connector jar into the Kafka Connect plugins directory.
  3. Get a copy of SAP® JCo v3.1.11 (sapjco3.jar and the native library sapjco3.dll on Windows or sapjco3.so on Linux) and copy both to the plugins directory.
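
After these steps, the plugins directory should look roughly like this (the connector jar file name and version are assumptions and will differ per release):

    <KAFKA_ROOT>/plugins/
        ohja-adso-sink-connector-<version>.jar
        sapjco3.jar
        sapjco3.so   (sapjco3.dll on Windows)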

Connector Configuration

  1. Edit the contents of the file <KAFKA_ROOT>/config/connect-standalone.properties as follows:

    bootstrap.servers=localhost:9092
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable=true
    value.converter.schemas.enable=true
    offset.storage.file.filename=/tmp/connect.offsets
    offset.flush.interval.ms=10000
    plugin.path=<KAFKA_ROOT>/plugins
    
    Note

    Make sure that the plugin path exists.

  2. Create a properties file with the following minimal configuration for the aDSO Sink Connector and save it as <KAFKA_ROOT>/config/ohja-adso-sink-connector.properties. Remember to add your license key.

    # The first few settings are required for all connectors: a name, the connector class to run, and the maximum number of tasks to create:
    name = adso-sink-connector
    connector.class = org.init.ohja.kafka.connect.adso.sink.ADSOSinkConnector
    tasks.max = 1
    sap.adso.license.key = "Your license key here"
    # list of topics to listen on
    topics=ZADSOTEST
    
  3. Contact your SAP® administration team for the connection properties of your SAP NetWeaver® installation and maintain the following minimum connector properties.

    # SAP NetWeaver application server host (DNS name or IP address)
    jco.client.ashost = 127.0.0.1
    # SAP system number
    jco.client.sysnr = 20
    # SAP client number
    jco.client.client = 100
    # SAP RFC user
    jco.client.user = user
    # SAP user password
    jco.client.passwd = password
    
    Note

    Ensure the SAP® user has enough privileges to access RFC-enabled function modules and the SAP NetWeaver® Gateway is configured to accept RFC connections from your host.

  4. Maintain the Kafka input topic name and the technical name of the target aDSO in SAP®.

    # technical name of the aDSO
    sap.adso#00.name=ZADSOTEST
    # Kafka input topic name
    sap.adso#00.topic=ZADSOTEST
    
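Putting the pieces from steps 2 to 4 together, a complete minimal <KAFKA_ROOT>/config/ohja-adso-sink-connector.properties looks like this:

    name = adso-sink-connector
    connector.class = org.init.ohja.kafka.connect.adso.sink.ADSOSinkConnector
    tasks.max = 1
    sap.adso.license.key = "Your license key here"
    topics = ZADSOTEST
    jco.client.ashost = 127.0.0.1
    jco.client.sysnr = 20
    jco.client.client = 100
    jco.client.user = user
    jco.client.passwd = password
    sap.adso#00.name = ZADSOTEST
    sap.adso#00.topic = ZADSOTEST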

Execution

The following steps are intended for users running Windows OS. If you are using a different operating system, adapt the paths and commands accordingly.

  1. Use a custom logging configuration file located at <KAFKA_ROOT>/config/tools-log4j.properties.

    cd <KAFKA_ROOT>
    set KAFKA_LOG4J_OPTS=-Dlog4j.configuration=file:<KAFKA_ROOT>/config/tools-log4j.properties
    
  2. Generate a random cluster ID and use it to format the Kafka storage directory, replacing <KAFKA_CLUSTER_ID> with the generated value.

    bin\windows\kafka-storage.bat random-uuid
    
    bin\windows\kafka-storage.bat format --standalone -t <KAFKA_CLUSTER_ID> -c config/kraft/server.properties
    
  3. Start a local Kafka server instance.

    bin\windows\kafka-server-start.bat config\kraft\server.properties

  4. In another shell start a local standalone Kafka Connect instance and execute the aDSO Sink Connector.

    bin\windows\connect-standalone.bat config\connect-standalone.properties config\ohja-adso-sink-connector.properties > log.txt 2>&1
    
    The log output will be written to the file log.txt.
  5. Start a simple Kafka producer from another shell.

    bin\windows\kafka-console-producer.bat --bootstrap-server localhost:9092 --topic ZADSOTEST
    
  6. Now you can enter messages that will be written to topic ZADSOTEST, the source topic of the connector. Enter only the value part of each message, e.g.

    {"schema":{"type":"struct","fields":[{"type":"string","optional":false,"doc":"ZCONTRACT","field":"_BIC_ZCONTRACT"},{"type":"int32","optional":true,"name":"org.apache.kafka.connect.data.Date","version":1,"doc":"ZKEYDATE","field":"_BIC_ZKEYDATE"},{"type":"bytes","optional":false,"name":"org.apache.kafka.connect.data.Decimal","version":1,"doc":"ZCONTRVAL","field":"_BIC_ZCONTRVAL"},{"type":"string","optional":false,"doc":"Currency Key","field":"CURRENCY"}],"name":"ES_DATA"},"payload":{"_BIC_ZCONTRACT":"100","_BIC_ZKEYDATE":16647,"_BIC_ZCONTRVAL":4923.8,"CURRENCY":"EUR"}}
    
  7. Switch to the SAP GUI and monitor the aDSO to check for new data (see Monitoring). If no data arrives in SAP®, check log.txt for errors.
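
As an alternative to typing the record from step 6 interactively, you can save it to a file and pipe that file into the console producer; the console producer sends each line of its input as one record. A minimal sketch (the file name record.json is an assumption):

```shell
# Save the sample record to a file. The quoted heredoc delimiter ('EOF')
# prevents any shell expansion inside the JSON.
cat > record.json <<'EOF'
{"schema":{"type":"struct","fields":[{"type":"string","optional":false,"doc":"ZCONTRACT","field":"_BIC_ZCONTRACT"},{"type":"int32","optional":true,"name":"org.apache.kafka.connect.data.Date","version":1,"doc":"ZKEYDATE","field":"_BIC_ZKEYDATE"},{"type":"bytes","optional":false,"name":"org.apache.kafka.connect.data.Decimal","version":1,"doc":"ZCONTRVAL","field":"_BIC_ZCONTRVAL"},{"type":"string","optional":false,"doc":"Currency Key","field":"CURRENCY"}],"name":"ES_DATA"},"payload":{"_BIC_ZCONTRACT":"100","_BIC_ZKEYDATE":16647,"_BIC_ZCONTRVAL":4923.8,"CURRENCY":"EUR"}}
EOF
# Sanity check: the file must contain exactly one line, i.e. one record.
wc -l < record.json
```

The file can then be fed to the producer non-interactively, e.g. bin\windows\kafka-console-producer.bat --bootstrap-server localhost:9092 --topic ZADSOTEST < record.json.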

Logging

Check the log output by opening the file log.txt in an editor of your choice. On Windows OS just type:

type log.txt