Confluent Platform

Synopsis

This section provides instructions on how to deploy the OData V4 Connectors on a Confluent Platform instance using a Docker environment.

Preliminary Setup

  1. Make sure Docker Engine and Docker Compose are installed on the machine where you want to run the Confluent Platform.
    1. Docker Desktop, available for Mac and Windows, includes both.
  2. Download and launch a ready-to-go Confluent Platform Docker image as described in the Confluent Platform Quickstart Guide.
  3. Ensure that the machine where the Confluent Platform is running has a network connection to the publicly available (read/write) TripPin service.

Connector Installation

The OData V4 Connectors can be installed either manually or through the Confluent Hub Client.

Note

In both scenarios, it is beneficial to use a volume to easily transfer the connector file into the Kafka Connect service container. If you're running Docker on a Windows machine, make sure to add a new system variable COMPOSE_CONVERT_WINDOWS_PATHS and set it to 1.
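For example, a volume mapping in the docker-compose.yml can expose a local folder to the Kafka Connect container. This is a sketch: the service name connect, the local folder name, and the mount path are assumptions based on Confluent's quickstart compose file; adjust them to your setup.

```yaml
services:
  connect:
    # ...
    volumes:
      # Local folder holding the unzipped connector, mounted into the
      # container at a path listed in CONNECT_PLUGIN_PATH.
      - ./connect-plugins:/usr/share/confluent-hub-components
    environment:
      CONNECT_PLUGIN_PATH: /usr/share/java,/usr/share/confluent-hub-components
```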

Manual Installation

  1. Unzip the zipped connector package init-kafka-connect-odatav4-x.x.x.zip.
  2. Move the unzipped connector folder into the configured CONNECT_PLUGIN_PATH of the Kafka Connect service.
  3. From the directory containing the docker-compose.yml of the Confluent Platform, start the platform using Docker Compose:

    docker-compose up -d
    

Confluent CLI

Install the zipped connector package init-kafka-connect-odatav4-x.x.x.zip using the Confluent Hub Client from outside the Kafka Connect Docker container:

confluent connect plugin install {PATH_TO_ZIPPED_CONNECTOR}/init-kafka-connect-odatav4-x.x.x.zip

Note

Further information on the Confluent CLI can be found in the Confluent CLI Command Reference.
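Whichever installation path you choose, you can confirm that the plugin was registered by querying the Kafka Connect REST API. This is a sketch; port 8083 is the quickstart default and an assumption.

```shell
# Kafka Connect REST endpoint (quickstart default port; an assumption).
CONNECT_URL="http://localhost:8083"
# List installed connector plugins and filter for the OData V4 classes.
curl -s --max-time 5 "$CONNECT_URL/connector-plugins" | grep -o '"[^"]*OData4[^"]*"' || echo "OData V4 plugins not listed yet"
```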

Connector Configuration

The OData V4 Connectors can be configured and launched using the control-center service of the Confluent Platform.

  1. In the control-center (default: localhost:9091) select a Connect Cluster in the Connect tab.
    Choose a Connect Cluster.
  2. Click the button “Add connector” and select OData4SourceConnector or OData4SinkConnector.
    Add a connector.
  3. Enter a name and further required configuration for the connector.

OData V4 Source Connector

For the OData V4 Source Connector to extract data from the TripPin Service, you need to follow these steps:

  1. To get a URL for a temporary service instance that allows read operations, open the TripPin service in a browser. The S(readwrite) part of the URL is automatically replaced with a temporary service ID.
    Copy the temporary service ID.
  2. Transfer the following minimal configuration properties to the control-center user interface. Remember to include the temporary service ID in the service path and to include your license key.
    Insert the properties.
    name = odata-source-connector
    connector.class = org.init.ohja.kafka.connect.odatav4.source.OData4SourceConnector
    tasks.max = 1
    sap.odata.license.key = "Your license key here"
    sap.odata.host.address = services.odata.org
    sap.odata.host.port = 443
    sap.odata.host.protocol = https
    sap.odata#00.service = /V4/{TEMPORARY_SERVICE_ID}/TripPinServiceRW
    sap.odata#00.entityset = People
    sap.odata#00.topic = People
    
  3. Launch the source connector.
    Launch the connector.
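To confirm that the source connector is producing, you can read a few records from the target topic. This is a sketch: the Kafka service name broker and the bootstrap address are assumptions based on Confluent's quickstart compose file.

```shell
# Topic configured in sap.odata#00.topic above.
TOPIC="People"
# Read a few records to confirm the source connector is producing
# ("broker" is the quickstart's Kafka service name; adjust to yours).
docker-compose exec broker kafka-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic "$TOPIC" \
  --from-beginning \
  --max-messages 3 \
  --timeout-ms 10000 || true
```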

OData V4 Sink Connector

For the OData V4 Sink Connector to export data to the TripPin service, you need to follow these steps:

  1. To get a URL for a temporary service instance that allows write operations, open the TripPin service in a browser. The S(readwrite) part of the URL is automatically replaced with a temporary service ID.
  2. Transfer the following minimal configuration properties to the control-center user interface. Remember to include the temporary service ID in the service path and to include your license key.
    Insert the properties.
    name = odata-sink-connector
    connector.class = org.init.ohja.kafka.connect.odatav4.sink.OData4SinkConnector
    topics = People
    sap.odata.license.key = "Your license key here"
    tasks.max = 1
    sap.odata.host.address = services.odata.org
    sap.odata.host.port = 443
    sap.odata.host.protocol = https
    sap.odata#00.service = /V4/{TEMPORARY_SERVICE_ID}/TripPinServiceRW
    sap.odata#00.entityset = People
    sap.odata#00.topic = People
    
  3. Launch the sink connector.
  4. To test the sink connector, sample data is required. The TripPin service instance itself provides sample data for all entities, which can be imported into Kafka using the OData V4 Source Connector.
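To confirm that the sink connector's writes reached the service, you can query the People entity set directly with a standard OData $top query. This is a sketch; replace the placeholder service ID with the temporary one obtained in step 1.

```shell
# Placeholder from the TripPin URL; replace with your temporary ID.
TEMPORARY_SERVICE_ID="S(readwrite)"
SERVICE_URL="https://services.odata.org/V4/${TEMPORARY_SERVICE_ID}/TripPinServiceRW"
# $top=1 is standard OData query syntax: fetch a single People entity.
curl -s --max-time 10 "$SERVICE_URL/People?\$top=1" || true
```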

Starting a Connector via REST Call

  1. Save the example configuration JSON into a local file, e.g. source.odatav4.json. To get a URL for a temporary service instance that allows read operations, open the TripPin service in a browser. The S(readwrite) part of the URL is automatically replaced with a temporary service ID. Remember to include your license key.

    {
        "name": "odata-source-connector",
        "config": {
            "connector.class": "org.init.ohja.kafka.connect.odatav4.source.OData4SourceConnector",
            "tasks.max": "1",
            "sap.odata.license.key": "Your license key here",
            "sap.odata.host.address": "services.odata.org",
            "sap.odata.host.port": "443",
            "sap.odata.host.protocol": "https",
            "sap.odata#00.service": "/V4/{TEMPORARY_SERVICE_ID}/TripPinServiceRW",
            "sap.odata#00.entityset": "People",
            "sap.odata#00.topic": "People"
        }
    }
    
  2. Once the configuration JSON is prepared, you can start the connector by sending it via a REST call to the Kafka Connect REST API. Use the following command to send a POST request:

    curl -X POST http://localhost:8083/connectors \
    -H "Content-Type:application/json" \
    -H "Accept:application/json" \
    -d @source.odatav4.json
    
  3. Once the connector is started successfully, the Kafka Connect REST API will return a response in JSON format with details about the connector, including its status and any potential errors. You can verify that the connector is running by checking its status:

    curl -X GET http://localhost:8083/connectors/odata-source-connector/status
    

    This will return a JSON object indicating whether the connector is running, its tasks, and any associated errors.
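The same REST API also manages the connector's lifecycle. Pause, resume, and delete are standard Kafka Connect endpoints; the port and connector name below match the configuration used above.

```shell
CONNECT_URL="http://localhost:8083"
NAME="odata-source-connector"
# Temporarily stop the connector's tasks.
curl -s -X PUT "$CONNECT_URL/connectors/$NAME/pause" || true
# Resume a paused connector.
curl -s -X PUT "$CONNECT_URL/connectors/$NAME/resume" || true
# Remove the connector and its tasks entirely.
curl -s -X DELETE "$CONNECT_URL/connectors/$NAME" || true
```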

Additional Notes
  • Configure multiple OData Sources: When you use a single connector instance to poll data from multiple service entity sets, an additional configuration block for an OData Source appears once you have provided the necessary information for the previous OData Source block.
  • Selection limits in Confluent Control Center UI: There is a limit to the number of OData Sources that can be configured for one connector instance in the Confluent Control Center UI. If you need to configure additional sources beyond this limit, you can add them in the Additional Properties section of the UI, which does not provide field recommendations.
  • The same limit applies to the number of query filter conditions and OData Sinks.
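For example, a second OData Source block in the source connector configuration would mirror the #00 prefix with the next index. This is a sketch: the #01 index is assumed from the #00 naming pattern above, and Airlines is another entity set exposed by the TripPin service.

```
sap.odata#01.service = /V4/{TEMPORARY_SERVICE_ID}/TripPinServiceRW
sap.odata#01.entityset = Airlines
sap.odata#01.topic = Airlines
```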