A key is necessary to send logs over the logging pipeline. For more information about the purpose of a pipeline key, see Log Security.
Warning:
IONOS follows a Share Once policy when generating a key: the key is displayed only once, and there is no way to retrieve it later if it is lost. Hence, we recommend that you store the generated key securely.
The previous key is instantly revoked when you generate a new key for a specific pipeline.
To get a new key for a pipeline, you can use the following request. Remember to replace the {pipelineID} with a valid ID of the pipeline whose key you want to access.
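Such a request can be sketched with Python's standard library. The key route (`/pipelines/{pipelineID}/key` under the regional endpoint shown later in this guide) and the placeholder pipeline ID and token are assumptions for illustration; verify them against the API reference before use:

```python
import urllib.request

# Assumed key-generation route under the regional endpoint -- verify before use.
pipeline_id = "11111111-2222-3333-4444-555555555555"  # placeholder pipeline ID
url = f"https://logging.de-txl.ionos.com/pipelines/{pipeline_id}/key"

req = urllib.request.Request(
    url,
    method="POST",
    headers={
        "Authorization": "Bearer $TOKEN",  # replace with a real token
        "Content-Type": "application/json",
    },
)

# The request is only constructed here, not sent; calling
# urllib.request.urlopen(req) would execute it against the live API.
print(req.get_method(), req.get_full_url())
```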
The response contains the key necessary to configure the Shared_Key in Fluent Bit.
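One common place for the returned key is the Shared_Key parameter of a Fluent Bit forward output. The host, port, and TLS settings below are placeholders, not values from this guide; substitute your pipeline's log server address:

```ini
[OUTPUT]
    Name          forward
    Match         *
    Host          <your TCP log server address>   # placeholder
    Port          443                             # placeholder
    tls           on
    tls.verify    on
    Shared_Key    <key returned by the request above>
```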
It is necessary to create an instance of the logging pipeline before sending log data to the Logging Service platform. For more information, see Log Pipelines.
When sending a request to create a logging pipeline, you can specify a unique tag of your choice for each log source. For more information about the complete list of available sources, see Log Sources.
The following request creates an instance of a logging pipeline with two log streams: docker and kubernetes.
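The body of such a request can be sketched as follows. The payload shape (a properties object with a logs array) and the tag values are assumptions for illustration; check the API reference for the exact field names:

```python
import json

# Illustrative payload for a pipeline with two log streams -- field names
# are assumptions; verify against the Logging Service API reference.
payload = {
    "properties": {
        "name": "demo-pipeline",  # placeholder pipeline name
        "logs": [
            {"source": "docker", "tag": "my-docker-tag", "protocol": "http"},
            {"source": "kubernetes", "tag": "my-k8s-tag", "protocol": "http"},
        ],
    }
}

body = json.dumps(payload)
print(body)
```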
The following is a sample response. The values returned by each response differ based on the request.
You may notice that the pipeline's status is temporarily set to the PROVISIONING state while provisioning is in progress. A GET request retrieves information about the pipeline and its status. For more information, see Retrieve logging pipeline information.
Log sources like Kubernetes, Docker, and Linux Systemd collect and offer relevant labels. You can use these labels to analyze reports and query the dashboard. However, if you want to label additional fields from the log sources, you can define custom labels as follows when you create a pipeline:
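Custom labels could be attached per log stream roughly as follows. The labels field name and the surrounding payload shape are assumptions for illustration:

```python
import json

# Hypothetical log-stream entry that labels additional fields from the source.
log_stream = {
    "source": "kubernetes",
    "tag": "my-k8s-tag",             # placeholder tag
    "protocol": "http",
    "labels": ["app", "namespace"],  # assumed field name for custom labels
}

payload = {"properties": {"name": "demo-pipeline", "logs": [log_stream]}}
print(json.dumps(payload, indent=2))
```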
The Logging Service offers regional APIs that enable programmatic interaction with the platform. These APIs serve various purposes: task automation, system integration, and platform functionality extension. Additionally, the APIs allow you to filter logs based on different criteria, such as the date range, log level, and source.
A regional endpoint is necessary to interact with the Logging Service REST API endpoints. Currently, IONOS supports only the following endpoint for the Berlin region:
https://logging.de-txl.ionos.com/pipelines
To interact with the Logging Service REST API endpoints, the header must contain the following values:
Header | Required | Type | Description |
---|---|---|---|
Authorization | yes | string | A Bearer $TOKEN is a string tied to your account. For information on generating tokens, see Create New Tokens. |
Content-Type | yes | string | Set this to application/json. |
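In code, these headers form a simple mapping; the token value below is a placeholder:

```python
# Required headers for every Logging Service REST call.
token = "$TOKEN"  # placeholder -- substitute a real bearer token
headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json",
}
print(headers)
```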
Here are some of the most common API How-Tos for the Logging Service:
We recommend you refer to the following after creating an instance via the API:
To retrieve your logging pipeline information, you need the ID of the respective pipeline.
The following is a sample request. Remember to replace the {pipelineID} with a valid ID of the specific pipeline whose information you want to access.
If your request is successful, you will receive the relevant information on a Ready pipeline.
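A sketch of this GET request using Python's standard library; the pipeline ID and token are placeholders, and the path is assumed from the regional endpoint above:

```python
import urllib.request

pipeline_id = "11111111-2222-3333-4444-555555555555"  # placeholder pipeline ID
url = f"https://logging.de-txl.ionos.com/pipelines/{pipeline_id}"

req = urllib.request.Request(
    url,
    method="GET",
    headers={"Authorization": "Bearer $TOKEN"},  # replace with a real token
)

# Constructed only, not sent; urllib.request.urlopen(req) would send it.
print(req.get_method(), req.get_full_url())
```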
To access logs using Fluent Bit, you need the following information:
Field | Usage |
---|---|
TCP log server address | Set the TCP log server address during Fluent Bit configuration. |
HTTP log server address | Set the HTTP log server address during Fluent Bit configuration. |
Tag | Set the tag during Fluent Bit configuration. Remember to use the same tag you defined while creating the pipeline. |
Note: A key is necessary to send logs through the newly created logging pipeline.
Each log stream in your pipeline is initially assigned a default data retention policy of 30 days. The logs for each log stream are retained for the specified number of days. However, you can define a custom retention policy for each log stream. The available options for retention periods include 7, 14, and 30 days.
Note: You can alternatively use the PATCH request to update the retention policy of an existing pipeline.
The following is an example:
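A sketch of such a retention payload; the destinations and retentionInDays field names are assumptions for illustration, and the retention value must be one of 7, 14, or 30:

```python
import json

# Hypothetical body setting a 7-day retention for one log stream.
payload = {
    "properties": {
        "logs": [
            {
                "source": "docker",
                "tag": "my-docker-tag",  # placeholder tag
                "destinations": [
                    {"type": "loki", "retentionInDays": 7}  # assumed field names
                ],
            }
        ]
    }
}
print(json.dumps(payload))
```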
You can modify your logging pipeline by sending a PATCH request with a specific pipeline ID.
Note: To modify a logging pipeline, you can use the same payload that you use in the POST request for creating a logging pipeline. For more information, see Set Up a Logging Pipeline Instance.
The following is a sample request. Remember to replace the {pipelineID} with a valid ID of the respective logging pipeline.
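A sketch of the PATCH request with Python's standard library; the pipeline ID, token, and request body below are placeholders for illustration:

```python
import json
import urllib.request

pipeline_id = "11111111-2222-3333-4444-555555555555"  # placeholder pipeline ID
url = f"https://logging.de-txl.ionos.com/pipelines/{pipeline_id}"

# Reuse the same payload shape as the POST that created the pipeline;
# this body is an illustrative placeholder.
body = json.dumps({"properties": {"name": "demo-pipeline-renamed"}}).encode()

req = urllib.request.Request(
    url,
    data=body,
    method="PATCH",
    headers={
        "Authorization": "Bearer $TOKEN",  # replace with a real token
        "Content-Type": "application/json",
    },
)

# Constructed only, not sent; urllib.request.urlopen(req) would send it.
print(req.get_method())
```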