You can create a new Kafka cluster with specified configurations.
Note: Only contract administrators, owners, and users with Access and manage Event Streams for Apache Kafka privileges can create and manage Kafka clusters.
After creating the cluster, you can use it via the corresponding LAN and certificates.
The data center must be provided as a UUID. The easiest way to retrieve the UUID is through the Cloud API.
POST /clusters
The POST /clusters endpoint allows you to create a new Kafka cluster with specified properties. The name, version, size, and connection fields are required. The response includes the newly created cluster's ID, metadata, and properties, along with its current state and broker addresses.
Use this endpoint to provision a Kafka cluster tailored to your application's requirements, ensuring seamless integration and efficient data management.
To make authenticated requests to the API, the following fields are mandatory in the request header:
Content-Type (string, required): Set this to application/json.
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
Below is the list of mandatory body parameters:
name (string): The name of the Kafka cluster. Example: my-kafka-cluster
version (string): The version of Kafka to use for the cluster. Example: 3.7.0
size (string): The size of the Kafka cluster. Example: S
datacenterId (string): The UUID of the data center where the cluster will be created. Example: 5a029f4a-72e5-11ec-90d6-0242ac120003
lanId (string): The LAN ID where the cluster will be connected. Example: 2
brokerAddresses (array): List of broker addresses for the cluster. Example: ["192.168.1.101/24","192.168.1.102/24","192.168.1.103/24"]
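The headers and body parameters above can be combined into a request like the following Python sketch. The base URL and token are placeholders (use the real endpoint from your API reference), and the flat body layout simply mirrors the table above; the connection fields may need to be nested differently in your environment:

```python
import json
import urllib.request

API_BASE = "https://api.example.com"  # placeholder; use your real API endpoint
TOKEN = "your-bearer-token"           # placeholder bearer token

headers = {
    "Content-Type": "application/json",
    "Accept": "application/json",
    "Authorization": f"Bearer {TOKEN}",
}

# Mandatory body parameters, mirroring the table above.
payload = {
    "name": "my-kafka-cluster",
    "version": "3.7.0",
    "size": "S",
    "datacenterId": "5a029f4a-72e5-11ec-90d6-0242ac120003",
    "lanId": "2",
    "brokerAddresses": [
        "192.168.1.101/24",
        "192.168.1.102/24",
        "192.168.1.103/24",
    ],
}

request = urllib.request.Request(
    f"{API_BASE}/clusters",
    data=json.dumps(payload).encode("utf-8"),
    headers=headers,
    method="POST",
)
# response = urllib.request.urlopen(request)  # uncomment to send the request
```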
200 Successful operation
Allows you to retrieve the details of a specific Kafka cluster based on its ID.
GET /clusters/{clusterId}
The GET /clusters/{clusterId} endpoint retrieves detailed information about a specific Kafka cluster identified by its unique UUID (clusterId). This endpoint returns metadata, including creation and modification dates, ownership details, and current operational state. The properties section provides specific details such as the cluster name, version, size, and connection information, including broker addresses and network configurations.
Use this endpoint to fetch comprehensive details about a Kafka cluster within your environment, facilitating effective management and monitoring of Kafka resources.
To make authenticated requests to the API, the following fields are mandatory in the request header:
200 Successful operation
With this endpoint, you can retrieve a list of Kafka clusters.
GET /clusters
The GET /clusters endpoint retrieves a collection of Kafka clusters.
This endpoint provides essential information about each cluster, including its ID, metadata, properties, and connections. Use the response data to manage and monitor Kafka clusters within your environment effectively.
To make authenticated requests to the API, the following fields are mandatory in the request header:
200 Successful operation
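As a sketch, listing clusters is a plain authenticated GET. The base URL and token below are placeholders, not the real endpoint:

```python
import json
import urllib.request

API_BASE = "https://api.example.com"  # placeholder; use your real API endpoint
TOKEN = "your-bearer-token"           # placeholder bearer token

# Build the authenticated list request; urllib defaults to GET when no body is set.
request = urllib.request.Request(
    f"{API_BASE}/clusters",
    headers={
        "Accept": "application/json",
        "Authorization": f"Bearer {TOKEN}",
    },
)
# with urllib.request.urlopen(request) as response:  # uncomment to send
#     clusters = json.load(response)
```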
Allows you to delete a Kafka cluster based on its ID.
DELETE /clusters/{clusterId}
The DELETE /clusters/{clusterId} endpoint initiates the deletion of a Kafka cluster identified by its unique UUID (clusterId). Upon a successful deletion request, the endpoint returns a 200 Successful operation status code, indicating that the cluster deletion process has been initiated.
This action permanently removes the specified Kafka cluster and all associated resources. Use caution when invoking this endpoint as it cannot be undone.
Use this endpoint to manage and decommission Kafka clusters within your environment, ensuring efficient resource utilization and lifecycle management.
To make authenticated requests to the API, the following fields are mandatory in the request header:
200 Successful operation
The request to delete the cluster was successful.
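A minimal deletion sketch in Python; the base URL and token are placeholders, and the cluster ID is the example UUID from the parameter list:

```python
import urllib.request

API_BASE = "https://api.example.com"  # placeholder; use your real API endpoint
TOKEN = "your-bearer-token"           # placeholder bearer token
CLUSTER_ID = "e69b22a5-8fee-56b1-b6fb-4a07e4205ead"  # cluster to delete

request = urllib.request.Request(
    f"{API_BASE}/clusters/{CLUSTER_ID}",
    headers={
        "Accept": "application/json",
        "Authorization": f"Bearer {TOKEN}",
    },
    method="DELETE",
)
# urllib.request.urlopen(request)  # uncomment to send; deletion cannot be undone
```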
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
clusterId (string, required): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
clusterId (string, required): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
Allows you to create a new Kafka topic within a specified Kafka cluster.
POST /clusters/{clusterId}/topics
The POST /clusters/{clusterId}/topics endpoint creates a new Kafka topic within the specified Kafka cluster (clusterId). The request body must include the topic's name; the other parameters are optional.
Upon successful creation, the endpoint returns detailed information about the newly created topic, including its ID (id), metadata, and properties.
Use this endpoint to dynamically manage Kafka topics within your environment, ensuring efficient data distribution and retention policies.
To make authenticated requests to the API, the following fields are mandatory in the request header:
Content-Type (string, required): Set this to application/json.
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
clusterId (string, required): The UUID of the Kafka cluster where the topic will be created. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
Below is the list of body parameters (only name is mandatory):
name (string, required): The name of the Kafka topic. Example: my-kafka-cluster-topic
replicationFactor (number): The number of replicas for the topic. This determines the fault tolerance. Example: 3
numberOfPartitions (number): The number of partitions for the topic. This affects the parallelism and throughput. Example: 3
retentionTime (number): The retention time for logs in milliseconds. Defaults to 604800000 (7 days). Example: 604800000
segmentBytes (number): The maximum size of a log segment in bytes before a new segment is rolled. Defaults to 1073741824 (1 GB). Example: 1073741824
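A topic-creation sketch combining the parameters above; the base URL and token are placeholders, and only name is required, so the optional fields are shown with the example values from the table:

```python
import json
import urllib.request

API_BASE = "https://api.example.com"  # placeholder; use your real API endpoint
TOKEN = "your-bearer-token"           # placeholder bearer token
CLUSTER_ID = "e69b22a5-8fee-56b1-b6fb-4a07e4205ead"  # target cluster

# Only "name" is mandatory; the rest fall back to server defaults if omitted.
payload = {
    "name": "my-kafka-cluster-topic",
    "replicationFactor": 3,
    "numberOfPartitions": 3,
    "retentionTime": 604800000,   # 7 days in milliseconds
    "segmentBytes": 1073741824,   # 1 GB
}

request = urllib.request.Request(
    f"{API_BASE}/clusters/{CLUSTER_ID}/topics",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json",
        "Authorization": f"Bearer {TOKEN}",
    },
    method="POST",
)
# response = urllib.request.urlopen(request)  # uncomment to send the request
```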
201 Successful operation
This endpoint lets you fetch a list of all Kafka topics within a specified Kafka cluster.
GET /clusters/{clusterId}/topics
The GET /clusters/{clusterId}/topics endpoint retrieves a collection of all Kafka topics within the specified Kafka cluster identified by clusterId. Each topic includes detailed metadata such as creation and modification dates, ownership details, and current operational state. Topic properties like name, replicationFactor, numberOfPartitions, and logRetention settings are also provided.
Use this endpoint to fetch and monitor all Kafka topics within your environment, enabling efficient management and monitoring of data streams and event processing.
To make authenticated requests to the API, the following fields are mandatory in the request header:
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
clusterId (string, required): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
200 Successful operation
With this endpoint you can retrieve details of a specific Kafka topic within a specified Kafka cluster.
GET /clusters/{clusterId}/topics/{topicId}
The GET /clusters/{clusterId}/topics/{topicId} endpoint retrieves detailed information about a specific Kafka topic identified by topicId within the Kafka cluster specified by clusterId. The response includes metadata such as creation and modification dates, ownership details, and current operational state. Additionally, topic properties such as name, replicationFactor, numberOfPartitions, and logRetention settings are provided.
Use this endpoint to fetch specific details of Kafka topics, facilitating effective monitoring and management of individual topics within your Kafka cluster.
To make authenticated requests to the API, the following fields are mandatory in the request header:
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
clusterId (string, required): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
topicId (string, required): The UUID of the Kafka topic. Example: ae085c4c-3626-5f1d-b4bc-cc53ae8267ce
200 Successful operation
This endpoint allows you to retrieve all users associated with a specified Kafka cluster, supporting pagination and optional filters.
GET /clusters/{clusterId}/users
The GET /clusters/{clusterId}/users endpoint retrieves a collection of users associated with the Kafka cluster specified by clusterId. The response includes a paginated list of user objects, each containing metadata such as creation and modification details, ownership information, and current operational state. Use this endpoint to manage and monitor users efficiently within your Kafka cluster.
To make authenticated requests to the API, the following fields are mandatory in the request header:
200 Successful operation
The endpoint retrieves the credentials of a specific user of the Kafka cluster. It includes relevant access certificates and a key found within the metadata.
GET /clusters/{clusterId}/users/{userId}/access
The GET /clusters/{clusterId}/users/{userId}/access endpoint retrieves the credentials needed to configure access to your Kafka cluster. These credentials belong to the Kafka administrator user and grant administrative access to the cluster. The response includes detailed metadata about the admin user's access credentials, including creation and modification timestamps, ownership information, and current operational state. The certificate authority, private key, and certificate are provided to facilitate secure communication with the Kafka cluster. Use this endpoint to obtain and manage Kafka admin user credentials within your Kafka infrastructure.
To make authenticated requests to the API, the following fields are mandatory in the request header:
Allows you to delete a specific Kafka topic from a specified Kafka cluster.
DELETE /clusters/{clusterId}/topics/{topicId}
The DELETE /clusters/{clusterId}/topics/{topicId} endpoint deletes the Kafka topic specified by topicId from the Kafka cluster identified by clusterId. Upon a successful deletion request, the endpoint returns a 202 Accepted status code, indicating that the topic deletion process has been initiated.
Use this endpoint carefully as it permanently removes the Kafka topic and its associated data. Ensure appropriate permissions and safeguards are in place before executing this operation.
To make authenticated requests to the API, the following fields are mandatory in the request header:
202 Accepted
The request to delete the topic was successful.
The following information describes how to use credentials to configure access to the Kafka cluster.
Communication with your Kafka cluster is TLS secured, meaning both the client and the Kafka cluster authenticate each other. The client authenticates the server by verifying the server's certificate, and the server authenticates the client by verifying the client's certificate. As the Kafka cluster does not have publicly signed certificates, you must validate them with the cluster's certificate authority. Authentication happens via mutual TLS (mTLS). Therefore, your cluster maintains a client certificate authority to sign authenticated user certificates.
To connect and authenticate to your Kafka cluster, you must fetch the two required certificates and a key from the user's API endpoint. Below are the steps to get the required certificates and key with curl commands for a cluster created in the Frankfurt (de-fra) region.
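The same fetch can be sketched in Python instead of curl. The base URL, token, and the metadata field names used below are assumptions (placeholders), so verify them against the actual response of your user's access endpoint before relying on them:

```python
import json
import urllib.request

API_BASE = "https://api.example.com"  # placeholder; use your region's endpoint (e.g. de-fra)
TOKEN = "your-bearer-token"           # placeholder bearer token
CLUSTER_ID = "e69b22a5-8fee-56b1-b6fb-4a07e4205ead"
USER_ID = "d11db12c-2625-5664-afd4-a3599731b5af"

request = urllib.request.Request(
    f"{API_BASE}/clusters/{CLUSTER_ID}/users/{USER_ID}/access",
    headers={
        "Accept": "application/json",
        "Authorization": f"Bearer {TOKEN}",
    },
)

def write_credentials(user: dict) -> None:
    """Write the CA certificate, client certificate, and private key to
    PEM files. The metadata field names here are illustrative; check your
    API response for the exact keys."""
    metadata = user["metadata"]
    for field, path in [
        ("certificateAuthority", "ca.crt"),
        ("certificate", "user.crt"),
        ("privateKey", "user.key"),
    ]:
        with open(path, "w") as fh:
            fh.write(metadata[field])

# with urllib.request.urlopen(request) as response:  # uncomment to fetch
#     write_credentials(json.load(response))
```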
You will need different file formats for the certificates depending on the consumer/producer's implementation. The following sections show how to create and use them with the Kafka Command-Line Interface (CLI) Tools.
Your admin.properties file should look similar to the following:
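A hypothetical example using the standard Kafka client SSL properties; it assumes you have imported the fetched CA certificate into a truststore and the client certificate and key into a keystore, and the paths and passwords are placeholders:

```properties
# Placeholder paths and passwords -- substitute your own stores.
security.protocol=SSL
ssl.truststore.location=/path/to/kafka.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/path/to/kafka.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
```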
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
clusterId (string, required): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
clusterId (string, required): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
userId (string, required): The UUID of the Kafka user. Example: d11db12c-2625-5664-afd4-a3599731b5af
Accept (string, required): Set this to application/json.
Authorization (string, required): Provide a header value as Bearer followed by your token.
clusterId (string, required): The UUID of the Kafka cluster. Example: e69b22a5-8fee-56b1-b6fb-4a07e4205ead
topicId (string, required): The UUID of the Kafka topic. Example: ae085c4c-3626-5f1d-b4bc-cc53ae8267ce