Overview
Logging Service is a cloud-based service that lets you store, manage, and analyze logs generated by your applications and systems. It is a scalable and cost-effective solution for organizations that handle large volumes of logs.
Logging Service also offers features that help organizations improve their log management practices and troubleshoot issues more quickly.
Log data is key to early detection of security incidents; using the Logging Service can therefore improve your operational efficiency, enhance your security, and increase your visibility into your system's performance and errors. Log data is available around the clock, and the service is secure, high-performance, and scalable. Your data is encrypted before, during, and after transmission. The Fluent Bit agent gathers log data locally, while the Logging Service provides the basis for analyzing and visualizing your logs with increased security.
The managed Grafana in the Logging Service comes with a pre-configured datasource for the Telemetry API called IONOS Telemetry. You can use this datasource to query metrics from the IONOS Cloud Telemetry API. For more information, see Integration with IONOS Telemetry API.
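For orientation, the following is a minimal sketch of querying metrics outside Grafana. It assumes the Telemetry API exposes a Prometheus-compatible HTTP query endpoint and that requests are authorized with an API token; the endpoint URL, token, and metric name are illustrative placeholders, not confirmed values.

```python
import requests

# Placeholder values: the endpoint URL, token, and metric name are illustrative only.
TELEMETRY_URL = "https://telemetry.example.com/api/v1/query"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"

# Assumes a Prometheus-compatible instant-query endpoint; "up" is a generic example metric.
response = requests.get(
    TELEMETRY_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params={"query": "up"},
    timeout=10,
)
response.raise_for_status()
for result in response.json().get("data", {}).get("result", []):
    print(result["metric"], result["value"])
```

In the managed Grafana itself, the same kind of query is entered directly in a panel or in the Explore view using the IONOS Telemetry datasource.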
Logging Service also enables you to configure an unlimited retention period for your logs. For more information, see Modify the Log Retention Policy.
The architecture of the Logging Service includes the following three main components that can be used to aggregate logs from various sources, analyze the gathered logs, and create visualizations for monitoring and report generation.
Data Collection: Data is collected from various log sources, such as applications, servers, and network devices, and sent to the centralized Logging Service platform. Currently, the following log sources are supported: Kubernetes, Docker, Linux Systemd, HTTP (JSON REST API), and Generic; a sketch of sending logs over the HTTP source follows this list.
Logging Service Platform: The Logging Service platform stores the logs gathered from various sources in a centralized location for easy access. This data can be accessed for analysis and troubleshooting. The platform includes log search, routing, storage, analytics, and visualization features. For more information, see Log Collection.
Analytics and Visualization: Grafana, an analytics and visualization tool, allows you to analyze the log data to identify patterns and trends, visualize the log data, and generate reports. You can also use these reports to secure your log data against threats or to troubleshoot underlying issues.
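As an illustration of the HTTP (JSON REST API) log source, here is a minimal sketch of shipping a log record from a custom application. The ingestion URL, shared key, and payload shape are placeholders; consult your pipeline's configuration for the actual values and expected schema.

```python
import json
import requests

# Placeholders: the ingestion URL, shared key, and payload shape depend on your pipeline
# configuration and are not confirmed values.
PIPELINE_HTTP_URL = "https://logs.example.com/http-source"  # hypothetical endpoint
SHARED_KEY = "YOUR_SHARED_KEY"

# A single JSON log record; the expected schema depends on how the log stream is configured.
log_record = {
    "level": "info",
    "service": "checkout",
    "message": "order 1234 processed successfully",
}

response = requests.post(
    PIPELINE_HTTP_URL,
    headers={"Authorization": f"Bearer {SHARED_KEY}", "Content-Type": "application/json"},
    data=json.dumps(log_record),
    timeout=10,
)
response.raise_for_status()
print("log accepted with status", response.status_code)
```

Because HTTP ingestion is rate-limited per pipeline (see the limitations table below), batching several records into one request is usually preferable to posting each log line individually.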
The illustration shows the default components of the Logging Service platform. The following is a brief description of each component:
Systems (1, 2, and n): These are the various log sources with Fluent Bit installed to gather, parse, and redirect logs to the Logging Service platform.
Logging Service Platform:
It contains a built-in log aggregator, Fluentd, which is compatible with a wide range of sources and targets, making it straightforward to ship logs from source to destination. Fluentd aggregates, processes, and ships data to the configured target.
Fluentd feeds logs to Loki, which stores and aggregates them before forwarding them to the visualization tool Grafana. Loki also indexes and groups log streams based on their labels.
The logs are displayed in Grafana dashboards. You can generate reports, edit your dashboards to suit your needs, and visualize the data; the sketch below shows the kind of LogQL query that drives such a view.
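To give a feel for how logs are queried once they reach Loki, here is a minimal sketch using Loki's standard query_range HTTP endpoint with a LogQL expression. It assumes direct access to a Loki endpoint and token-based authentication, which may not apply in the managed setup; in practice you would typically run the same LogQL query from the Grafana Explore view. The URL, token, and label names are placeholders.

```python
import time
import requests

# Placeholders; in the managed service this LogQL query would usually be run from Grafana.
LOKI_URL = "https://loki.example.com/loki/api/v1/query_range"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"

# LogQL: select the last hour of logs for a labelled stream and filter for "error".
params = {
    "query": '{job="checkout"} |= "error"',    # label name "job" is an example
    "start": int((time.time() - 3600) * 1e9),  # Loki expects nanosecond timestamps
    "end": int(time.time() * 1e9),
    "limit": 100,
}

response = requests.get(
    LOKI_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params=params,
    timeout=10,
)
response.raise_for_status()
for stream in response.json()["data"]["result"]:
    for timestamp, line in stream["values"]:
        print(timestamp, line)
```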
The following are the key limitations of Logging Service:
| Aspect | Description | Limit |
| --- | --- | --- |
| Service Access | Means of creating and managing pipelines (see the sketch after this table). | REST API only |
| HTTP Rate Limit | Default rate limit for HTTP requests per pipeline during log ingestion. | 50 requests/second |
| TCP Bandwidth | Default TCP bandwidth limit per pipeline, approximately in terms of logs per second. | ~10,000 logs/second |
| Maximum Pipelines | The maximum number of pipelines allowed per contract. | 5 pipelines |
| Log Streams per Pipeline | The maximum number of log streams allowed per pipeline. | 10 log streams/pipeline |
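Since pipelines can only be created and managed through the REST API, the following is a minimal sketch of what such a call might look like. The endpoint URL, request body fields, and authentication scheme shown here are illustrative placeholders rather than the documented contract; refer to the Logging Service REST API reference for the actual endpoint and payload.

```python
import requests

# Placeholders: the real endpoint, payload schema, and auth scheme come from the
# Logging Service REST API reference.
LOGGING_API_URL = "https://logging-api.example.com/pipelines"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"

# Illustrative pipeline definition: one Kubernetes log stream; field names are assumptions.
pipeline = {
    "properties": {
        "name": "demo-pipeline",
        "logs": [
            {
                "source": "kubernetes",   # one of the supported log sources
                "tag": "k8s-cluster-1",
                "protocol": "http",
            }
        ],
    }
}

response = requests.post(
    LOGGING_API_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}", "Content-Type": "application/json"},
    json=pipeline,
    timeout=10,
)
response.raise_for_status()
print("pipeline created:", response.json())
```

Keep the per-contract limits from the table in mind when automating pipeline creation: at most 5 pipelines per contract and 10 log streams per pipeline.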