Data Management

Supported Data Types and Formats

The OData V4 Connectors support the Confluent JSON Converter as well as the Avro Converter.
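
As a minimal sketch, converters are selected per connector instance in its configuration, for example as shown below (this assumes the plain Kafka JSON converter class org.apache.kafka.connect.json.JsonConverter is the one meant; only the converter settings of the connector configuration are shown):

  # Converter selection (assuming org.apache.kafka.connect.json.JsonConverter is meant)
  key.converter = org.apache.kafka.connect.json.JsonConverter
  value.converter = org.apache.kafka.connect.json.JsonConverter
  # Embed the Connect schema in each JSON message
  value.converter.schemas.enable = true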

Data Type Mapping Source Connector

The following table gives an overview of the data type mapping from OData V4 data types to Kafka Connect data types applied by the source connector:

OData V4           Kafka Connect Schema Type  Java Data Type
EdmBinary          BYTES                      java.nio.ByteBuffer
EdmBoolean         BOOLEAN                    java.lang.Boolean
EdmByte            INT16                      java.lang.Short
EdmDate            DATE                       java.util.Date
EdmDateTimeOffset  TIMESTAMP                  java.util.Date
EdmDecimal         DECIMAL                    java.math.BigDecimal
EdmDouble          FLOAT64                    java.lang.Double
EdmDuration        DECIMAL                    java.math.BigDecimal
EdmGuid            STRING                     java.lang.String
EdmInt16           INT16                      java.lang.Short
EdmInt32           INT32                      java.lang.Integer
EdmInt64           INT64                      java.lang.Long
EdmSByte           INT8                       java.lang.Byte
EdmSingle          FLOAT32                    java.lang.Float
EdmStream          STRING                     java.lang.String
EdmString          STRING                     java.lang.String
EdmTimeOfDay       TIME                       java.util.Date
EdmEnumType        STRING                     java.lang.String
EdmStructuredType  STRUCT                     java.util.Map<String, Object>
Note
  • The scale value floating for properties of type EdmDecimal is not supported.
  • The scale value variable for properties of type EdmDecimal is converted to the value of the precision facet. If the precision facet is unspecified, a scale of 6 with rounding mode half even is used for the conversion. The fallback scale and the rounding mode can be overridden using sap.odata.decimal.scale.fallback and sap.odata.decimal.rounding.mode, as sketched below.
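
A minimal sketch of overriding the fallback in the connector configuration (the values are illustrative, and the accepted rounding-mode names are assumed to follow java.math.RoundingMode):

  # Illustrative values; the defaults are a scale of 6 and rounding mode half even.
  # Rounding-mode names are assumed to follow java.math.RoundingMode.
  sap.odata.decimal.scale.fallback = 9
  sap.odata.decimal.rounding.mode = HALF_UP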

Data Type Mapping Sink Connector

The following table gives an overview of the data type mapping from Kafka Connect data types to OData V4 data types applied by the sink connector. The default conversion type is the first one listed in the Kafka Connect Schema Types column.

OData V4           Kafka Connect Schema Types
EdmBinary          BYTES
EdmBoolean         BOOLEAN
EdmByte            INT8, INT16, INT32, INT64
EdmDate            DATE, TIMESTAMP, INT64
EdmDateTimeOffset  DATE, TIMESTAMP, INT64
EdmDecimal         DECIMAL, FLOAT32, FLOAT64, INT8, INT16, INT32, INT64
EdmDouble          INT8, INT16, INT32, INT64, FLOAT32, FLOAT64, DECIMAL
EdmDuration        DECIMAL, INT8, INT16, INT32, INT64
EdmGuid            STRING
EdmInt16           INT8, INT16, INT32, INT64
EdmInt32           INT8, INT16, INT32, INT64
EdmInt64           INT8, INT16, INT32, INT64
EdmSByte           INT8, INT16, INT32, INT64
EdmSingle          INT8, INT16, INT32, INT64, FLOAT32, FLOAT64, DECIMAL
EdmStream          STRING
EdmString          STRING
EdmTimeOfDay       TIME, DATE, TIMESTAMP, INT64
EdmEnumType        STRING
EdmStructuredType  STRUCT
Note
  • Type conversions into OData types from Kafka Connect schema types other than the default conversion type might result in inaccurate data or loss of information. For instance, some integer values that can be represented as INT32 may not be accurately represented as EdmInt16.
  • The sink connector does not actively use ETags for concurrency control. When issuing a PUT or DELETE request, the sink connector uses the value * in the If-Match HTTP request header.

Kafka Headers

The OData V4 source connector supports inserting metadata into the Kafka message headers by setting sap.odata.headers.enable = 1 (default: 0); a configuration sketch follows the table. The following header fields are supported:

Header field                Description                          Value type
absolut URL                 The absolute URL currently in use    String
record Index                Index identifying the message        Integer
odata.timestamp             Response timestamp                   String
odata.version               The OData version in use             String
odata.oauth.client.id       OAuth client ID                      String
odata.oauth.redirect        OAuth redirect link                  String
odata.oauth.token.endpoint  OAuth token endpoint                 String
connector version           The version of the connector         String
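
A minimal configuration sketch for enabling these headers (only the relevant property is shown):

  # 1 enables the metadata headers listed above, 0 (the default) disables them
  sap.odata.headers.enable = 1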

Optional Data Conversion Settings

You can optionally transform DECIMAL types into other appropriate data types. Setting decimal.mapping = primitive transforms decimals to double or long, depending on the scale.
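
A minimal sketch of this setting in a connector configuration:

  # Optional: map DECIMAL values to primitive double or long instead of the
  # Connect Decimal logical type; the target type depends on the scale
  decimal.mapping = primitive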

Single Message Transforms (SMTs)

Single Message Transforms (SMTs) allow for lightweight, real-time modifications of data as it passes through Kafka Connect. SMTs can be applied in source connectors before writing data to Kafka topics or in sink connectors before sending data to external systems.

Use Cases for SMTs

  1. Data Filtering: Remove unnecessary fields or records based on conditions, focusing only on relevant data.
  2. Field Manipulation: Modify fields by renaming, masking sensitive data, or changing formats, ensuring consistency and compatibility.
  3. Field Enrichment: Add metadata or default values to provide more context to messages.
  4. Transformation Chains: Combine multiple SMTs for complex transformations.
  5. Routing and Partitioning: Dynamically change the Kafka topic or partition for records based on their contents.

The OData V4 Connectors support chaining multiple SMTs, allowing for flexible and powerful data processing in Kafka Connect pipelines.
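
As a sketch, two stock Kafka Connect SMTs could be chained in the connector configuration to mask a (hypothetical) sensitive field and then re-route records to a prefixed topic:

  # SMTs are applied in the order listed in the transforms property
  transforms = mask,route

  # Mask a sensitive field; "CreditCardNumber" is a hypothetical field name
  transforms.mask.type = org.apache.kafka.connect.transforms.MaskField$Value
  transforms.mask.fields = CreditCardNumber

  # Route every record to a topic prefixed with "odata-"
  transforms.route.type = org.apache.kafka.connect.transforms.RegexRouter
  transforms.route.regex = (.*)
  transforms.route.replacement = odata-$1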

Schema Registry Integration

The OData V4 Connectors are fully compatible with Confluent Schema Registry, enabling seamless integration of data with Kafka topics using Avro, JSON Schema, or Protobuf formats. This compatibility ensures that all schema definitions are centrally stored and managed, simplifying the development and maintenance of applications that consume the data.
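
For instance, Avro serialization backed by Schema Registry might be configured as follows (the URL is an example value):

  # Confluent Avro converter backed by Schema Registry
  value.converter = io.confluent.connect.avro.AvroConverter
  value.converter.schema.registry.url = http://schema-registry:8081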

Advantages of Schema Registry Compatibility

  • Data consistency: Enforces a well-defined schema, preventing issues like missing fields or type mismatches.
  • Schema evolution: Supports backward and forward schema compatibility, allowing updates to schemas without breaking existing consumers.
  • Reduced data size: Avro and Protobuf serialization minimize payload size, improving data transmission efficiency.
  • Centralized schema management: Simplifies handling multiple message formats by storing all schema versions in one place.

This integration enhances data governance and ensures reliable handling of messages in Kafka.