API How-Tos

Quick Links

Endpoints

  • Frankfurt, Germany: https://kafka.de-fra.ionos.com

  • Berlin, Germany: https://kafka.de-txl.ionos.com

Retrieve a Kafka Cluster

Allows you to retrieve the details of a specific Kafka cluster based on its ID.

Endpoint

GET /clusters/{clusterId}

The GET /clusters/{clusterId} endpoint retrieves detailed information about a specific Kafka cluster identified by its unique UUID (clusterId). This endpoint returns metadata, including creation and modification dates, ownership details, and current operational state. The properties section provides specific details such as the cluster name, version, size, and connection information, including broker addresses and network configurations.

Use this endpoint to fetch comprehensive details about a Kafka cluster within your environment, facilitating effective management and monitoring of Kafka resources.

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

200 Successful operation

Create a Kafka Cluster

You can create a new Kafka cluster with specified configurations.

Note:

  • Only contract administrators, owners, and users with Access and manage Event Streams for Apache Kafka privileges can create and manage Kafka clusters.

  • After creating the cluster, you can use it via the corresponding LAN and certificates.

The data center must be provided as a UUID. The easiest way to retrieve the UUID is through the Cloud API.
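Because every identifier in this API is a UUID, a quick pre-flight check can catch malformed IDs before a request is sent. A minimal sketch (the is_uuid helper is hypothetical, not part of the API):

```shell
# Hypothetical pre-flight check: verify a value is a well-formed UUID
# before using it as datacenterId or clusterId in a request.
is_uuid() {
  printf '%s\n' "$1" | grep -Eq '^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$'
}

is_uuid "5a029f4a-72e5-11ec-90d6-0242ac120003" && echo "valid"
```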

Endpoint

POST /clusters

The POST /clusters endpoint allows you to create a new Kafka cluster with specified properties. The name, version, size, and connection fields are required. The response includes the newly created cluster's ID, metadata, and properties, along with its current state and broker addresses.

Use this endpoint to provision a Kafka cluster tailored to your application's requirements, ensuring seamless integration and efficient data management.

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Content-Type (required, string): Set this to application/json.
  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

200 Successful operation

Delete a Kafka Cluster

Allows you to delete a Kafka cluster based on its ID.

Endpoint

DELETE /clusters/{clusterId}

The DELETE /clusters/{clusterId} endpoint initiates the deletion of a Kafka cluster identified by its unique UUID (clusterId). Upon successful deletion request, the endpoint returns a 200 Successful operation status code, indicating that the cluster deletion process has been initiated.

This action permanently removes the specified Kafka cluster and all associated resources. Use caution when invoking this endpoint as it cannot be undone.

Use this endpoint to manage and decommission Kafka clusters within your environment, ensuring efficient resource utilization and lifecycle management.

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

200 Successful operation

The request to delete the cluster was successful.

Delete a Kafka Topic

Allows you to delete a specific Kafka topic from a specified Kafka cluster.

Endpoint

DELETE /clusters/{clusterId}/topics/{topicId}

The DELETE /clusters/{clusterId}/topics/{topicId} endpoint deletes the Kafka topic specified by topicId from the Kafka cluster identified by clusterId. Upon successful deletion request, the endpoint returns a 202 Accepted status code, indicating that the topic deletion process has been initiated.

Use this endpoint carefully as it permanently removes the Kafka topic and its associated data. Ensure appropriate permissions and safeguards are in place before executing this operation.

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

202 Accepted

The request to delete the topic was successful.

List all Kafka Clusters

With this endpoint, you can retrieve a list of Kafka clusters.

Endpoint

GET /clusters

The GET /clusters endpoint retrieves a collection of Kafka clusters.

This endpoint provides essential information about each cluster, including its ID, metadata, properties, and connections. Use the response data to manage and monitor Kafka clusters within your environment effectively.


Path Parameters:

  • clusterId (required, string): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.


Path Parameters:

  • clusterId (required, string): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
  • topicId (required, string): The UUID of the Kafka topic. Example: ae085c4c-3626-5f1d-b4bc-cc53ae8267ce

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

200 Successful operation

Example request to retrieve a Kafka cluster:

curl -X 'GET' \
  'https://kafka.de-txl.ionos.com/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token"

Sample response:

{
  "id": "e69b22a5-8fee-56b1-b6fb-4a07e4205ead",
  "type": "cluster",
  "href": "/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead",
  "metadata": {
    "createdDate": "2020-12-10T13:37:50+01:00",
    "createdBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "createdByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedDate": "2020-12-11T13:37:50+01:00",
    "lastModifiedBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "resourceURN": "ionos:<product>:<location>:<contract>:<resource-path>",
    "state": "AVAILABLE",
    "brokerAddresses": [
      "192.168.1.101:9093",
      "192.168.1.102:9093",
      "192.168.1.103:9093"
    ]
  },
  "properties": {
    "name": "my-kafka-cluster",
    "version": "3.7.0",
    "size": "S",
    "connections": [
      {
        "datacenterId": "5a029f4a-72e5-11ec-90d6-0242ac120003",
        "lanId": "2",
        "brokerAddresses": [
          "192.168.1.101/24",
          "192.168.1.102/24",
          "192.168.1.103/24"
        ]
      }
    ]
  }
}
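If you save a response like the one above to a file, the broker addresses can be pulled out with a couple of lines of shell; the snippet below uses python3 for the JSON parsing and a trimmed-down copy of the sample response (the file name cluster.json is just an example):

```shell
# Extract metadata.brokerAddresses from a saved cluster response.
# The JSON here is a trimmed copy of the sample response above.
cat > cluster.json <<'EOF'
{"metadata": {"brokerAddresses": ["192.168.1.101:9093", "192.168.1.102:9093", "192.168.1.103:9093"]}}
EOF

python3 -c 'import json; print("\n".join(json.load(open("cluster.json"))["metadata"]["brokerAddresses"]))'
```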
Example request to delete a Kafka cluster:

curl -X 'DELETE' \
  'https://kafka.de-txl.ionos.com/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token"
Example request to delete a Kafka topic:

curl -X 'DELETE' \
  'https://kafka.de-txl.ionos.com/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead/topics/ae085c4c-3626-5f1d-b4bc-cc53ae8267ce' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token"
Example request to list all Kafka clusters:

curl -X 'GET' \
  'https://kafka.de-txl.ionos.com/clusters' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token"

Sample response:

{
  "id": "ed17eb1f-ac43-5670-9e63-8be33c475449",
  "type": "collection",
  "href": "/clusters",
  "items": [
    {
      "id": "e69b22a5-8fee-56b1-b6fb-4a07e4205ead",
      "type": "cluster",
      "href": "/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead",
      "metadata": {
        "createdDate": "2020-12-10T13:37:50+01:00",
        "createdBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "createdByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "lastModifiedDate": "2020-12-11T13:37:50+01:00",
        "lastModifiedBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "lastModifiedByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "resourceURN": "ionos:<product>:<location>:<contract>:<resource-path>",
        "state": "AVAILABLE",
        "brokerAddresses": [
          "192.168.1.101:9093",
          "192.168.1.102:9093",
          "192.168.1.103:9093"
        ]
      },
      "properties": {
        "name": "my-kafka-cluster",
        "version": "3.7.0",
        "size": "S",
        "connections": [
          {
            "datacenterId": "5a029f4a-72e5-11ec-90d6-0242ac120003",
            "lanId": "2",
            "brokerAddresses": [
              "192.168.1.101/24",
              "192.168.1.102/24",
              "192.168.1.103/24"
            ]
          }
        ]
      }
    }
  ]
}

Below is the list of mandatory body parameters:

Body Parameters:

  • name (string): The name of the Kafka cluster. Example: my-kafka-cluster
  • version (string): The version of Kafka to use for the cluster. Example: 3.7.0

Header Parameters:

  • Content-Type (required, string): Set this to application/json.
  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Create a Kafka Topic

Allows you to create a new Kafka topic within a specified Kafka cluster.

Endpoint

POST /clusters/{clusterId}/topics

The POST /clusters/{clusterId}/topics endpoint creates a new Kafka topic within the specified Kafka cluster (clusterId). The request body must include the topic's name; the other parameters are optional.

Upon successful creation, the endpoint returns detailed information about the newly created topic, including its ID (id), metadata, and properties. Use this endpoint to dynamically manage Kafka topics within your environment, ensuring efficient data distribution and retention policies.


Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Content-Type (required, string): Set this to application/json.
  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

201 Successful operation

Retrieve a Kafka Topic

With this endpoint, you can retrieve the details of a specific Kafka topic within a specified Kafka cluster.

Endpoint

GET /clusters/{clusterId}/topics/{topicId}

The GET /clusters/{clusterId}/topics/{topicId} endpoint retrieves detailed information about a specific Kafka topic identified by topicId within the Kafka cluster specified by clusterId. The response includes metadata such as creation and modification dates, ownership details, and current operational state. Additionally, topic properties such as name, replicationFactor, numberOfPartitions, and logRetention settings are provided.

Use this endpoint to fetch specific details of Kafka topics, facilitating effective monitoring and management of individual topics within your Kafka cluster.

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

200 Successful operation

List all Kafka Users

This endpoint allows you to retrieve all users associated with a specified Kafka cluster, supporting pagination and optional filters.

Endpoint

GET /clusters/{clusterId}/users

The GET /clusters/{clusterId}/users endpoint retrieves a collection of users associated with the Kafka cluster specified by clusterId. The response includes a paginated list of user objects, each containing metadata such as creation and modification details, ownership information, and current operational state. Use this endpoint to manage and monitor users efficiently within your Kafka cluster.
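This excerpt does not name the pagination query parameters, so the limit and offset names below are assumptions for illustration only; check the API reference for the actual parameter names:

```shell
# Build a paginated users-request URL. 'limit' and 'offset' are assumed
# parameter names, not confirmed by this document.
CLUSTER_ID="e69b22a5-8fee-56b1-b6fb-4a07e4205ead"
echo "https://kafka.de-txl.ionos.com/clusters/${CLUSTER_ID}/users?limit=10&offset=0"
```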

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

200 Successful operation

Retrieve Kafka Cluster User Credentials

The endpoint retrieves the credentials of a specific user of the Kafka cluster. The response metadata includes the relevant access certificates and private key.

Endpoint

GET /clusters/{clusterId}/users/{userId}/access

The GET /clusters/{clusterId}/users/{userId}/access endpoint retrieves the credentials needed to configure access to your Kafka cluster. The credentials belong to the Kafka administrator user and grant administrative access to the cluster. The response metadata includes creation and modification timestamps, ownership information, and the current operational state, together with the access credentials themselves: the certificate authority, private key, and certificate used for secure communication with the Kafka cluster. Use this endpoint to obtain and manage the Kafka admin user credentials within your Kafka infrastructure.

Example request to create a Kafka cluster:

curl -X 'POST' \
  'https://kafka.de-txl.ionos.com/clusters' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token" \
--data '{
  "metadata": {},
  "properties": {
    "name": "my-kafka-cluster",
    "version": "3.7.0",
    "size": "S",
    "connections": [
      {
        "datacenterId": "5a029f4a-72e5-11ec-90d6-0242ac120003",
        "lanId": "2",
        "brokerAddresses": [
          "192.168.1.101/24",
          "192.168.1.102/24",
          "192.168.1.103/24"
        ]
      }
    ]
  }
}'
Sample response:

{
  "id": "e69b22a5-8fee-56b1-b6fb-4a07e4205ead",
  "type": "cluster",
  "href": "/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead",
  "metadata": {
    "createdDate": "2020-12-10T13:37:50+01:00",
    "createdBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "createdByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedDate": "2020-12-11T13:37:50+01:00",
    "lastModifiedBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "resourceURN": "ionos:<product>:<location>:<contract>:<resource-path>",
    "state": "AVAILABLE",
    "brokerAddresses": [
      "192.168.1.101:9093",
      "192.168.1.102:9093",
      "192.168.1.103:9093"
    ]
  },
  "properties": {
    "name": "my-kafka-cluster",
    "version": "3.7.0",
    "size": "S",
    "connections": [
      {
        "datacenterId": "5a029f4a-72e5-11ec-90d6-0242ac120003",
        "lanId": "2",
        "brokerAddresses": [
          "192.168.1.101/24",
          "192.168.1.102/24",
          "192.168.1.103/24"
        ]
      }
    ]
  }
}

The remaining body parameters describe the cluster size and connection:

  • size (string): The size of the Kafka cluster. Example: S
  • datacenterId (string): The UUID of the data center where the cluster will be created. Example: 5a029f4a-72e5-11ec-90d6-0242ac120003
  • lanId (string): The LAN ID where the cluster will be connected. Example: 2
  • brokerAddresses (array): List of broker addresses for the cluster. Example: ["192.168.1.101/24","192.168.1.102/24","192.168.1.103/24"]

Path Parameters:

  • clusterId (required, string): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
  • topicId (required, string): The UUID of the Kafka topic. Example: ae085c4c-3626-5f1d-b4bc-cc53ae8267ce

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Path Parameters:

  • clusterId (required, string): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Example request to retrieve a Kafka topic:

curl -X 'GET' \
  'https://kafka.de-txl.ionos.com/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead/topics/ae085c4c-3626-5f1d-b4bc-cc53ae8267ce' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token"

Sample response:

{
  "id": "ae085c4c-3626-5f1d-b4bc-cc53ae8267ce",
  "type": "topic",
  "href": "/clusters/{clusterId}/topics/ae085c4c-3626-5f1d-b4bc-cc53ae8267ce",
  "metadata": {
    "createdDate": "2020-12-10T13:37:50+01:00",
    "createdBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "createdByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedDate": "2020-12-11T13:37:50+01:00",
    "lastModifiedBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "resourceURN": "ionos:<product>:<location>:<contract>:<resource-path>",
    "state": "AVAILABLE"
  },
  "properties": {
    "name": "my-kafka-cluster-topic",
    "replicationFactor": 3,
    "numberOfPartitions": 3,
    "logRetention": {
      "retentionTime": 604800000,
      "segmentBytes": 1073741824
    }
  }
}
Example request to list all Kafka users:

curl -X 'GET' \
  'https://kafka.de-txl.ionos.com/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead/users' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token"

Sample response:

{
  "id": "434e0c83-031d-5f5a-be82-63ee54dda025",
  "type": "collection",
  "href": "/clusters/{clusterId}/users",
  "items": [
    {
      "id": "d11db12c-2625-5664-afd4-a3599731b5af",
      "type": "user",
      "href": "/clusters/{clusterId}/users/d11db12c-2625-5664-afd4-a3599731b5af",
      "metadata": {
        "createdDate": "2020-12-10T13:37:50+01:00",
        "createdBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "createdByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "lastModifiedDate": "2020-12-11T13:37:50+01:00",
        "lastModifiedBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "lastModifiedByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "resourceURN": "ionos:<product>:<location>:<contract>:<resource-path>",
        "state": "AVAILABLE"
      },
      "properties": {
        "name": "admin"
      }
    }
  ]
}


Path Parameters:

  • clusterId (required, string): The UUID of the Kafka cluster where the topic will be created. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead

Body Parameters (the topic's name is required; the rest are optional):

  • name (string): The name of the Kafka topic. Example: my-kafka-cluster-topic
  • replicationFactor (number): The number of replicas for the topic. This determines the fault tolerance. Example: 3

Header Parameters:

  • Content-Type (required, string): Set this to application/json.
  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Path Parameters:

  • clusterId (required, string): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
  • userId (required, string): The UUID of the Kafka user. Example: d11db12c-2625-5664-afd4-a3599731b5af

Response

List all Kafka Topics

This endpoint lets you fetch a list of all Kafka topics within a specified Kafka cluster.

Endpoint

GET /clusters/{clusterId}/topics

The GET /clusters/{clusterId}/topics endpoint retrieves a collection of all Kafka topics within the specified Kafka cluster identified by clusterId. Each topic includes detailed metadata such as creation and modification dates, ownership details, and current operational state. Topic properties like name, replicationFactor, numberOfPartitions, and logRetention settings are also provided.

Use this endpoint to fetch and monitor all Kafka topics within your environment, enabling efficient management and monitoring of data streams and event processing.

Request

To make authenticated requests to the API, the following fields are mandatory in the request header:

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Response

200 Successful operation

Example request to create a Kafka topic:

curl -X 'POST' \
  'https://kafka.de-txl.ionos.com/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead/topics' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token" \
--data '{
  "metadata": {},
  "properties": {
    "name": "my-kafka-cluster-topic",
    "replicationFactor": 3,
    "numberOfPartitions": 3,
    "logRetention": {
      "retentionTime": 604800000,
      "segmentBytes": 1073741824
    }
  }
}'
Sample response:

{
  "id": "ae085c4c-3626-5f1d-b4bc-cc53ae8267ce",
  "type": "topic",
  "href": "/clusters/{clusterId}/topics/ae085c4c-3626-5f1d-b4bc-cc53ae8267ce",
  "metadata": {
    "createdDate": "2020-12-10T13:37:50+01:00",
    "createdBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "createdByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedDate": "2020-12-11T13:37:50+01:00",
    "lastModifiedBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "resourceURN": "ionos:<product>:<location>:<contract>:<resource-path>",
    "state": "AVAILABLE",
    "message": "In progress."
  },
  "properties": {
    "name": "my-kafka-cluster-topic",
    "replicationFactor": 3,
    "numberOfPartitions": 3,
    "logRetention": {
      "retentionTime": 86400000,
      "segmentBytes": 1073741824
    }
  }
}
Example request to retrieve Kafka cluster user credentials:

curl -X 'GET' \
  'https://kafka.de-txl.ionos.com/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead/users/d11db12c-2625-5664-afd4-a3599731b5af/access' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token"

Sample response:

{
  "id": "d11db12c-2625-5664-afd4-a3599731b5af",
  "type": "user",
  "href": "/clusters/{clusterId}/users/d11db12c-2625-5664-afd4-a3599731b5af",
  "metadata": {
    "createdDate": "2020-12-10T13:37:50+01:00",
    "createdBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "createdByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedDate": "2020-12-11T13:37:50+01:00",
    "lastModifiedBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "lastModifiedByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
    "resourceURN": "ionos:<product>:<location>:<contract>:<resource-path>",
    "state": "AVAILABLE",
    "certificateAuthority": "-----BEGIN CERTIFICATE ...",
    "privateKey": "-----BEGIN PRIVATE KEY ...",
    "certificate": "-----BEGIN CERTIFICATE ..."
  },
  "properties": {
    "name": "admin"
  }
}

The remaining body parameters are optional:

  • numberOfPartitions (number): The number of partitions for the topic. This affects the parallelism and throughput. Example: 3
  • retentionTime (number): The retention time for logs in milliseconds. Defaults to 604800000 (7 days). Example: 604800000
  • segmentBytes (number): The maximum size of a log segment in bytes before a new segment is rolled. Defaults to 1073741824 (1 GB). Example: 1073741824

Path Parameters:

  • clusterId (required, string): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
  • userId (required, string): The UUID of the Kafka user. Example: d11db12c-2625-5664-afd4-a3599731b5af
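The documented log-retention defaults are easy to sanity-check: 604800000 ms is exactly 7 days, and 1073741824 bytes is exactly 1 GiB:

```shell
# 7 days expressed in milliseconds, and 1 GiB expressed in bytes.
week_ms=$((7 * 24 * 60 * 60 * 1000))
gib_bytes=$((1024 * 1024 * 1024))
echo "$week_ms"     # 604800000
echo "$gib_bytes"   # 1073741824
```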

Path Parameters:

  • clusterId (required, string): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead

Header Parameters:

  • Accept (required, string): Set this to application/json.
  • Authorization (required, string): Provide the header value as Bearer followed by your token.

Example request to list all Kafka topics:

curl -X 'GET' \
  'https://kafka.de-txl.ionos.com/clusters/e69b22a5-8fee-56b1-b6fb-4a07e4205ead/topics' \
--header 'Accept: application/json' \
--header "Authorization: Bearer $Token"

Sample response:

{
  "id": "7c1fe82d-a1ea-55fc-a744-12fad4180eef",
  "type": "collection",
  "href": "/clusters/{clusterId}/topics",
  "items": [
    {
      "id": "ae085c4c-3626-5f1d-b4bc-cc53ae8267ce",
      "type": "topic",
      "href": "/clusters/{clusterId}/topics/ae085c4c-3626-5f1d-b4bc-cc53ae8267ce",
      "metadata": {
        "createdDate": "2020-12-10T13:37:50+01:00",
        "createdBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "createdByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "lastModifiedDate": "2020-12-11T13:37:50+01:00",
        "lastModifiedBy": "ionos:identity:::users/87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "lastModifiedByUserId": "87f9a82e-b28d-49ed-9d04-fba2c0459cd3",
        "resourceURN": "ionos:<product>:<location>:<contract>:<resource-path>",
        "state": "AVAILABLE"
      },
      "properties": {
        "name": "my-kafka-cluster-topic",
        "replicationFactor": 3,
        "numberOfPartitions": 3,
        "logRetention": {
          "retentionTime": 604800000,
          "segmentBytes": 1073741824
        }
      }
    }
  ]
}

Configure Access to Kafka Cluster

The following information describes how to use credentials to configure access to the Kafka cluster.

Kafka mTLS authentication

Communication with your Kafka cluster is TLS secured, and authentication is mutual (mTLS): the client authenticates the server by verifying the server's certificate, and the server authenticates the client by verifying the client's certificate. Because the Kafka cluster does not use publicly signed certificates, you must validate them against the cluster's certificate authority. Accordingly, your cluster maintains a client certificate authority that signs authenticated user certificates.
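The trust relationship can be demonstrated locally with a throwaway CA. All file names below are hypothetical; this only illustrates the chain-of-trust verification that the client and broker perform, and does not touch your cluster:

```shell
# Create a throwaway CA, sign a client certificate with it, and verify the
# client certificate against the CA -- the same chain-of-trust check used
# in the cluster's mTLS handshake.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout toy-ca-key.pem -out toy-ca-cert.pem -subj "/CN=toy-cluster-ca"
openssl req -newkey rsa:2048 -nodes \
  -keyout toy-client-key.pem -out toy-client.csr -subj "/CN=toy-admin"
openssl x509 -req -days 1 -in toy-client.csr \
  -CA toy-ca-cert.pem -CAkey toy-ca-key.pem -CAcreateserial \
  -out toy-client-cert.pem
openssl verify -CAfile toy-ca-cert.pem toy-client-cert.pem
```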

Get certificates and key

To connect and authenticate to your Kafka cluster, you must fetch two certificates and a key from the user's access endpoint. Below are the steps to get them with curl commands, shown for a cluster created in the Frankfurt (de-fra) region.

Convert certificates & key

You will need different file formats for the certificates depending on the consumer/producer's implementation. The following sections show how to create and use them with the Kafka Command-Line Interface (CLI) Tools.

PKCS#12 (.p12 / .pfx)

Your admin.properties files should look like this:

Java KeyStore (JKS)

Your admin.properties files should look similar to the following:

PKCS#8 PEM

Your admin.properties files should look similar to the following:

# Get the cluster's CA certificate
curl --location https://kafka.de-fra.ionos.com/clusters/${clusterId}/users/${userId}/access --header "Authorization: Bearer ${personalToken}" |  yq -r '.metadata.certificateAuthority' > ca-cert.pem
# verify
openssl x509 -in ca-cert.pem -text -noout

# Get the (admin) users client certificate
curl --location https://kafka.de-fra.ionos.com/clusters/${clusterId}/users/${userId}/access --header "Authorization: Bearer ${personalToken}" |  yq -r '.metadata.certificate' > admin-cert.pem
# verify
openssl x509 -in admin-cert.pem -text -noout

# Get the (admin) users client key
curl --location https://kafka.de-fra.ionos.com/clusters/${clusterId}/users/${userId}/access --header "Authorization: Bearer ${personalToken}" |  yq -r '.metadata.privateKey' > admin-key.pem
# verify
openssl rsa -in admin-key.pem -check
# Create a ca-cert.p12 (with openssl >3.2 )
openssl pkcs12 -export -nokeys -in ca-cert.pem -out ca-cert.p12 -passout "pass:changeit" -jdktrust anyExtendedKeyUsage
# Create a ca-cert.p12 (with keytool)
keytool -importcert -storetype PKCS12 -keystore ca-cert.p12 -storepass changeit -alias cluster-ca -file ca-cert.pem -noprompt
# verify
openssl pkcs12 -info -in ca-cert.p12

# Create an admin.p12
openssl pkcs12 -export -in admin-cert.pem -inkey admin-key.pem -out admin.p12 -passout "pass:admin_p12_pass"
# verify
openssl pkcs12 -info -nodes -in admin.p12
# admin.properties for PKCS#12
security.protocol=SSL
ssl.truststore.type=PKCS12
ssl.truststore.location=ca-cert.p12
ssl.truststore.password=changeit
ssl.endpoint.identification.algorithm=

ssl.keystore.type=PKCS12
ssl.keystore.location=admin.p12
ssl.keystore.password=admin_p12_pass
bin/kafka-topics.sh --list --bootstrap-server=clusterIp:Port --command-config admin.properties
# Create a Java Truststore
keytool -import -alias cluster-ca -file ca-cert.pem -keystore truststore.jks -storepass changeit -noprompt
# verify
keytool -list -keystore truststore.jks -rfc -storepass changeit

# Create a Java Keystore
openssl pkcs12 -export -in admin-cert.pem -inkey admin-key.pem -out admin.p12 -passout "pass:admin_p12_pass"
keytool -importkeystore -srckeystore admin.p12 -srcstorepass admin_p12_pass -destkeystore admin.ks -storepass admin_jks_pass
# verify
keytool -list -keystore admin.ks -rfc -storepass admin_jks_pass
# verify including the key
keytool -importkeystore -srckeystore admin.ks -srcstorepass admin_jks_pass -deststoretype PKCS12 -destkeystore filename.p12 -storepass p12_pass; openssl pkcs12 -info -nodes -in filename.p12 -passin "pass:p12_pass"; rm -f filename.p12
# admin.properties for JKS
security.protocol=SSL
ssl.truststore.location=truststore.jks
ssl.truststore.password=changeit
ssl.endpoint.identification.algorithm=

ssl.keystore.location=admin.ks
ssl.keystore.password=admin_jks_pass
bin/kafka-topics.sh --list --bootstrap-server=clusterIp:Port --command-config admin.properties
# No need to do anything with the ca-cert.pem it can be used without any modification
# verify
openssl x509 -in ca-cert.pem -text -noout

# Create a admin.pem containing key and cert
# as the Kafka CLI tool requires the key in PKCS#8 and to be secured with a passphrase we need to convert it first
openssl pkcs8 -in admin-key.pem -passout "pass:admin_pem_pass" -topk8 -v1 PBE-SHA1-3DES -out admin.pem
cat admin-cert.pem >> admin.pem
# verify
openssl x509 -in admin.pem -text -noout
openssl pkey -in admin.pem -check
# admin.properties for PEM
security.protocol=SSL
ssl.truststore.type=PEM
ssl.truststore.location=ca-cert.pem
ssl.endpoint.identification.algorithm=

ssl.keystore.type=PEM
ssl.keystore.location=admin.pem
ssl.key.password=admin_pem_pass
bin/kafka-topics.sh --list --bootstrap-server=clusterIp:Port --command-config admin.properties