# Kafka Destination

## Destination Info

* Accepts [Page](/docs/segment/connections/spec/page), [Alias](/docs/segment/connections/spec/alias), [Group](/docs/segment/connections/spec/group), [Identify](/docs/segment/connections/spec/identify), [Track](/docs/segment/connections/spec/track) calls.
* Refer to it as **Kafka** in the [Integrations object.](/docs/segment/guides/filtering-data/#filtering-with-the-integrations-object)
* This destination is **Beta.**

[Kafka](https://kafka.apache.org/?utm_source=segmentio\&utm_medium=docs\&utm_campaign=partners) is a highly scalable, fault-tolerant messaging system that enables real-time stream processing at scale. When integrated with Segment, Kafka serves as a powerful backbone for managing and processing the event data Segment collects, letting businesses efficiently ingest, route, and analyze data across applications and systems in real time.

## Getting started

### Create the Kafka Destination

1. From your workspace's [Destination catalog page](https://app.segment.com/goto-my-workspace/destinations/catalog), search for "Kafka".
2. Select the "Kafka" tile and click **Add Destination**.
3. Select an existing Source to connect to Kafka.
4. Enter a name for your Kafka destination.

### Configure the Kafka Destination

How you've configured your Kafka cluster determines the authentication and encryption settings you need to apply to the Segment Kafka destination. You may need help from a technical team member to provide values for the following settings:

1. On the Settings tab, enter values into the **Client ID**, **Brokers**, and **Authentication Mechanism** fields.
2. Populate fields based on the value you selected from the **Authentication Mechanism** field:
   * **Plain** or **SCRAM-SHA-256 / 512** authentication: provide values for **Username** and **Password** fields.
   * **Client Certificate** authentication: provide values for the **SSL Client Key** and **SSL Client Certificate** fields.
3. Populate the **SSL Certificate Authority** field, if necessary.
4. Save your changes and proceed to [Configure the Send Action](#configure-the-send-action).
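These settings correspond to standard Kafka client configuration. The sketch below is a hypothetical illustration (not Segment's internal implementation) of how the **Brokers**, **Client ID**, and SASL settings roughly map onto a typical producer config; the function name and dict keys are assumptions following common Kafka client conventions.

```python
def build_producer_config(brokers: str, client_id: str,
                          mechanism: str, username: str = "",
                          password: str = "") -> dict:
    """Illustrative mapping of Segment Kafka settings to a client config."""
    config = {
        # "Brokers" accepts a comma-delimited string of host:port entries
        "bootstrap_servers": [b.strip() for b in brokers.split(",")],
        # "Client ID" defaults to segment-actions-kafka-producer
        "client_id": client_id or "segment-actions-kafka-producer",
    }
    if mechanism in ("PLAIN", "SCRAM-SHA-256", "SCRAM-SHA-512"):
        # These mechanisms require the Username and Password settings
        config.update({
            "sasl_mechanism": mechanism,
            "sasl_plain_username": username,
            "sasl_plain_password": password,
        })
    return config

cfg = build_producer_config("localhost:9092, broker2:9092",
                            "", "SCRAM-SHA-256", "user", "secret")
```

This only illustrates how the settings relate to each other; Segment manages the actual producer for you.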

### Configure the "Send" Action

1. Select the Mappings tab and add a new **Send** mapping.
2. Select a Topic to send data to. This field should auto-populate based on the credentials you provided in the Settings tab.
3. Map your payload using the **Payload** field.\
   *(Optional)*: Specify partitioning preferences, Headers and Message Key values.
4. Save and enable the Action, then navigate back to the Kafka destination's Settings tab to enable and save the Destination.
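The **Payload** field controls what ends up in the Kafka message body. As a sketch, mapping the whole event would produce a JSON-encoded message like the following (the event shape is simplified for illustration):

```python
import json

# Illustrative Segment Track event (simplified) and the JSON message body
# the Send action would produce when the Payload field maps the full event.
event = {
    "type": "track",
    "event": "Order Completed",
    "userId": "user-123",
    "properties": {"revenue": 42.0, "currency": "USD"},
}
message_value = json.dumps(event).encode("utf-8")  # Kafka message bytes
```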

## Destination Settings

| Field                                           | Description                                                                                                                                                                                               | Required | Type     |
| ----------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- | -------- |
| AWS Access Key ID                               | The Access Key ID for your AWS IAM instance. Must be populated if using the AWS IAM Authentication Mechanism.                                                                                              | No       | string   |
| AWS Authorization Identity                      | The AWS IAM role ARN used for authorization. Populate this field only if using the AWS IAM Authentication Mechanism.                                                                                       | No       | string   |
| Brokers                                         | The brokers for your Kafka instance, in the format `host:port`. For example, `localhost:9092`. Accepts a comma-delimited string.                                                                           | Yes      | string   |
| Client ID                                       | The client ID for your Kafka instance. Defaults to `segment-actions-kafka-producer`.                                                                                                                       | Yes      | string   |
| Authentication Mechanism                        | The Authentication Mechanism to use. For SCRAM or PLAIN, populate the **Username** and **Password** fields. For Client Certificate, populate the **SSL Client Key** and **SSL Client Certificate** fields. | Yes      | select   |
| Password                                        | The password for your Kafka instance. Populate only if using the PLAIN or SCRAM Authentication Mechanisms.                                                                                                 | No       | password |
| AWS Secret Key                                  | The Secret Key for your AWS IAM instance. Must be populated if using the AWS IAM Authentication Mechanism.                                                                                                 | No       | password |
| SSL Certificate Authority                       | The Certificate Authority for your Kafka instance. Exclude the first and last lines from the file, i.e. `-----BEGIN CERTIFICATE-----` and `-----END CERTIFICATE-----`.                                     | No       | string   |
| SSL Client Certificate                          | The Client Certificate for your Kafka instance. Exclude the first and last lines from the file, i.e. `-----BEGIN CERTIFICATE-----` and `-----END CERTIFICATE-----`.                                        | No       | string   |
| SSL Enabled                                     | Indicates if SSL should be enabled.                                                                                                                                                                       | No       | boolean  |
| SSL Client Key                                  | The Client Key for your Kafka instance. Exclude the first and last lines from the file, i.e. `-----BEGIN PRIVATE KEY-----` and `-----END PRIVATE KEY-----`.                                                | No       | password |
| SSL - Reject Unauthorized Certificate Authority | Whether to reject connections with certificates that aren't signed by a trusted CA. Disabling this can be useful when testing, but is not advised in production.                                           | No       | boolean  |
| Username                                        | The username for your Kafka instance. Populate only if using the PLAIN or SCRAM Authentication Mechanisms.                                                                                                 | No       | string   |
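The SSL fields above expect PEM content with the `-----BEGIN ...-----` and `-----END ...-----` marker lines removed. A small helper (hypothetical, not part of Segment) that strips those markers from a standard PEM file's contents:

```python
def strip_pem_markers(pem: str) -> str:
    """Drop the BEGIN/END marker lines, keeping only the base64 body."""
    lines = [ln for ln in pem.strip().splitlines()
             if not ln.startswith("-----")]
    return "\n".join(lines)

pem = """-----BEGIN CERTIFICATE-----
MIIBbase64payload
morebase64
-----END CERTIFICATE-----"""
body = strip_pem_markers(pem)
```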

## Available Actions

Build your own Mappings. Combine supported [triggers](/docs/segment/connections/destinations/actions/#components-of-a-destination-action) with the following Kafka-supported actions:

> [!NOTE]
>
> Individual destination instances support a maximum of 50 mappings.

* [Send](#send-3)

### Send

Send data to a Kafka topic

Send is a **Cloud** action. The default Trigger is `type = "track" or type = "identify" or type = "page" or type = "screen" or type = "group"`

| Field             | Description                                                                                                                                                                                                                        | Required | Type    |
| ----------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- | ------- |
| Topic             | The Kafka topic to send messages to. This field auto-populates from your Kafka instance.                                                                                                                                           | Yes      | STRING  |
| Payload           | The data to send to Kafka.                                                                                                                                                                                                         | Yes      | OBJECT  |
| Headers           | Header data to send to Kafka, as key-value pairs.                                                                                                                                                                                  | No       | OBJECT  |
| Partition         | The partition to send the message to.                                                                                                                                                                                              | No       | INTEGER |
| Default Partition | The default partition to send the message to when no partition is specified.                                                                                                                                                       | No       | INTEGER |
| Message Key       | The key for the message.                                                                                                                                                                                                           | No       | STRING  |
| Batch Bytes       | Specifies the maximum number of bytes to batch before sending. The default is 1 MB, though the maximum allowed depends on the Kafka cluster. Smaller batch sizes result in more frequent requests to the cluster. Minimum is 5 KB. | No       | NUMBER  |
| Batch Size        | The maximum batch size to send to Kafka.                                                                                                                                                                                           | No       | NUMBER  |

## FAQ

### Which Kafka Platforms are supported?

The Kafka Destination can send data to Topics on self-hosted Kafka Clusters, or to Clusters hosted on Managed Service platforms like **Confluent Cloud** and **Aiven**.

### Which data formats are supported?

Segment sends data to Kafka in JSON format only. Segment does not yet support other formats, like Avro or Protobuf.
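Because messages arrive as JSON only, downstream consumers should deserialize the message value as UTF-8 JSON. A minimal consumer-side sketch (consumer wiring omitted; the raw bytes stand in for a received message value):

```python
import json

# A message value as it might arrive from the Segment Kafka destination.
raw = b'{"type": "track", "event": "Signed Up", "userId": "u-1"}'
event = json.loads(raw.decode("utf-8"))  # plain JSON, no Avro/Protobuf schema
```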

### Which authentication mechanisms are supported?

The Authentication Mechanism is controlled with the **Authentication Mechanism** Setting field.

Segment supports the following SASL-based authentication methods:

* Plain
* SCRAM-SHA-256
* SCRAM-SHA-512
* AWS

Segment also supports **Client Certificate** authentication.
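Summarizing the settings table above, each mechanism requires a different set of fields. The lookup below is purely illustrative (not a Segment API), restating which settings go with which mechanism:

```python
# Which Destination settings each Authentication Mechanism requires,
# per the Destination Settings table (illustrative summary only).
REQUIRED_SETTINGS = {
    "Plain":              ["Username", "Password"],
    "SCRAM-SHA-256":      ["Username", "Password"],
    "SCRAM-SHA-512":      ["Username", "Password"],
    "AWS":                ["AWS Access Key ID", "AWS Secret Key"],
    "Client Certificate": ["SSL Client Key", "SSL Client Certificate"],
}
```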

### How is partitioning controlled?

The **Send** Action provides multiple ways to specify which Partition an event should be sent to.

* **Partition**: Use this field to specify the partition Segment should send events to.
* **Default Partition**: Use this field to specify a default Partition. Segment uses this when you don't provide a value in the **Partition** field.
* **Message Key**: Segment uses a hash of this field's value to determine which Partition should receive an event. If you don't provide a Message Key, Segment uses a round robin algorithm to select the partition to send the event to.
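The selection rules above can be sketched as follows. This is an illustration only: the hash function (`crc32`) and the precedence among the fields are assumptions for the sketch, not Segment's actual implementation.

```python
import zlib

def choose_partition(num_partitions: int, partition=None,
                     default_partition=None, message_key=None,
                     round_robin_counter=0) -> int:
    """Sketch of the partitioning rules; precedence is assumed."""
    if partition is not None:            # explicit Partition wins
        return partition
    if message_key is not None:          # hash of the Message Key
        return zlib.crc32(message_key.encode()) % num_partitions
    if default_partition is not None:    # fallback Default Partition
        return default_partition
    # no key or partition given: round robin across partitions
    return round_robin_counter % num_partitions
```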

### What is the "SSL - Reject Unauthorized Certificate Authority" field for?

This field specifies if Segment should reject server connections when a certificate is not signed by a trusted Certificate Authority (CA). This can be useful for testing purposes or when using a self-signed certificate.
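To make the trade-off concrete, here is what "reject unauthorized CA" means at the TLS layer, sketched with Python's standard `ssl` module (Segment's implementation is not Python-specific; this only illustrates the two verification modes):

```python
import ssl

# Strict: connections with certificates not signed by a trusted CA
# are rejected (the safe, production setting).
strict = ssl.create_default_context()

# Lenient: accepts any certificate, e.g. self-signed ones. Useful for
# testing only; not advised in production.
lenient = ssl.create_default_context()
lenient.check_hostname = False       # must be disabled before CERT_NONE
lenient.verify_mode = ssl.CERT_NONE
```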

## Engage

You can send computed traits and audiences generated using [Engage](/docs/segment/engage) to this destination as a **user property**. To learn more about Engage, schedule a [demo](https://segment.com/contact/demo).

For user-property destinations, an [identify](/docs/segment/connections/spec/identify/) call is sent to the destination for each user being added and removed. The property name is the snake\_cased version of the audience name, with a true/false value to indicate membership. For example, when a user first completes an order in the last 30 days, Engage sends an Identify call with the property `order_completed_last_30days: true`. When the user no longer satisfies this condition (for example, it's been more than 30 days since their last order), Engage sets that value to `false`.
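The naming convention above can be sketched as a small helper (hypothetical, for illustration only): the audience name is snake_cased and paired with a boolean membership value.

```python
import re

def audience_property(audience_name: str, is_member: bool) -> dict:
    """Snake_case the audience name and pair it with membership."""
    key = re.sub(r"[^a-z0-9]+", "_", audience_name.lower()).strip("_")
    return {key: is_member}

traits = audience_property("Order Completed Last 30Days", True)
```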

When you first create an audience, Engage sends an Identify call for every user in that audience. Later audience syncs only send updates for users whose membership has changed since the last sync.

> [!NOTE]
>
> Real-time audience syncs to Kafka may take six or more hours for the initial sync to complete. After the initial sync, expect a sync frequency of two to three hours.

