A centralized logging platform consists of two major components: Log Collection and Log Aggregation. In the context of the Logging Service, it's important to clarify the responsibilities of the platform provider and the user.
Log Collection: The responsibility for log collection and its configuration lies with the user. This involves setting up mechanisms to gather log data from various sources within the infrastructure and applications. These mechanisms can include agents, log shippers, or APIs that send log data to a central location for storage and analysis.
Log Aggregation: The Logging Service platform provider provides and manages the log aggregation component. This component involves the centralization of log data from multiple sources, making it accessible for analysis and visualization. The platform handles log storage, indexing, and search functionalities.
Logs must be collected and forwarded to the Logging Service platform for aggregation and analysis. This process is typically handled by log agents, which collect logs at the source and ship them to the central logging platform.
While various log agents are available, the supported log agent for the Logging Service platform is the FluentBit log agent. FluentBit is a lightweight and efficient log forwarder that collects logs from different sources and pushes them to the Logging Service platform for further processing and analysis.
FluentBit can be installed on Linux, macOS, and Windows. For more information, see FluentBit's official website.
Note that FluentBit installation and configuration depend on your log sources.
Ensure you follow the instructions provided by the Logging Service platform provider and refer to any additional documentation or guidelines they may offer for integrating FluentBit Log Agent into your logging infrastructure.
When configuring FluentBit for log shipping, certain pieces of information need to be properly configured to ensure the logs are shipped correctly and securely.
The Log Server Endpoint refers to the address of your logging pipeline, where the logs will be sent after they are collected. This endpoint can be obtained from the REST API response.
The Tag is a piece of information that must be configured in the log agent (FluentBit) to ensure synchronization between the agent and the log server. It helps identify and categorize the logs and can also be used for reporting purposes.
In addition to the TLS connection, FluentBit needs to be configured with a Key (SharedKey) for authentication purposes. This key ensures that only authorized logs are sent to the logging pipeline. The key can be obtained via our REST API.
Here is an example of a FluentBit configuration that needs an Endpoint, Tag, and Key:
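The following is a minimal sketch of such a configuration. The input path, tag, host, port, and key values are placeholders and must be replaced with the values returned by the REST API for your pipeline:

```ini
[INPUT]
    # Tail a local log file; adjust the path to your actual log source
    Name   tail
    Path   /var/log/myapp/*.log
    # The tag must match the tag configured for this log stream
    Tag    mytag

[OUTPUT]
    # Forward logs to the Logging Service over TCP with TLS
    Name        forward
    Match       *
    Host        <your-log-server-endpoint>
    Port        <port>
    tls         on
    # Key obtained from the REST API (ShareOnce policy)
    Shared_Key  <your-shared-key>
```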
Note: Any data masking or sanitization must happen on the client's side.
The IONOS Logging Service is a cloud-based service that allows you to store, manage, and analyze logs generated by your applications or systems. It is a scalable and cost-effective solution for organizations that need to manage and analyze large volumes of logs.
Logging Service is also an efficient log management solution. It has a number of features and benefits that can help organizations improve their log management practices and troubleshoot issues more quickly.
The architecture of the Logging Service includes three main components:
Data Collection: Data is collected from various sources such as applications, servers, and network devices. This data is then sent to a centralized logging platform.
Logging Platform: The logging platform stores the logs in a centralized location, which you can access for analysis and troubleshooting. The platform includes log search, routing, storage, analytics, and visualization features.
Analytics and Visualization: Analytics and visualization tools allow you to analyze and visualize the log data. This helps you identify patterns and trends and troubleshoot issues.
Logging Service offers the following key features:
Scalability: The service can handle large volumes of logs and scale up or down per the user's needs.
Availability: The service is available 24/7, and logs can be accessed from anywhere using a web-based interface.
Security: The service ensures the security of logs by providing encryption, access controls, and audit trails.
Customization: Customize the service per your needs, such as defining log retention periods, setting up alerts, and creating custom dashboards.
Logging Service offers the following benefits:
Reduced Costs: Logging Service eliminates the need to invest in hardware and software for log management. You only pay for the services you use, which can result in significant cost savings.
Increased Efficiency: Logging Service automates log management tasks such as log collection, storage, and analysis. This reduces the time and effort required by you to manage logs manually.
Improved Troubleshooting: Logging Service provides you with real-time access to log data, which helps you identify and troubleshoot issues quickly.
Compliance: Logging Service can help organizations meet compliance requirements by providing secure log storage and access controls.
The Logging Service is currently available in the following location: Berlin.
Based on the chosen plan, there are some limitations:
Rate and Bandwidth limitations:
Default HTTP rate limit: 50 requests per second.
Default TCP bandwidth: approximately 15,000 logs per second.
Maximum 10 pipelines.
Maximum 3 log streams per pipeline.
User must be the contract owner.
Kubernetes
Docker
Systemd
HTTP (JSON REST API)
Generic
The IONOS Logging Service is a versatile and accessible platform that allows you to conveniently store and manage logs from various sources. Whether it's logs generated within the IONOS infrastructure, your own bare metal system, or even from another cloud environment, this platform provides a seamless solution for log aggregation. With its flexibility, you can effortlessly push logs from anywhere, ensuring comprehensive log monitoring and analysis.
All types of communications to push logs (HTTP and TCP) are protected by the following two mechanisms:
TLS (Transport Layer Security)
KEY
HTTP: APIKEY Header
TCP: SharedKey
The Key adds an extra layer of security: you can revoke or regenerate an existing key at any time.
When using TCP/TLS, you must enable tls and provide a Shared_key in the FluentBit configuration. The key can be obtained via our REST API (ShareOnce policy).
Refer to FluentBit's official website for the full list of parameters.
If you are planning to use HTTP (JSON), you must provide the key in a request header alongside your account email address:
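As a sketch, a request might carry headers like the following. The APIKEY header name comes from the key mechanisms above; the endpoint, path, and the exact header name for the email address are placeholders, so check the REST API response for the actual values:

```shell
# Hypothetical request; replace all <...> placeholders with your own values
curl -X POST "https://<your-log-server-endpoint>/<path>" \
  -H "APIKEY: <your-key>" \
  -H "Content-Type: application/json" \
  -d '<your-log-payload>'
```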
This is an equivalent example of configuring FluentBit with HTTP outbound:
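A minimal sketch of an HTTP output, assuming placeholder host, port, and URI values (replace them with the values from the REST API response for your pipeline):

```ini
[OUTPUT]
    # Ship logs to the Logging Service over HTTP with TLS
    Name    http
    Match   *
    Host    <your-log-server-endpoint>
    Port    443
    URI     /<path>
    Format  json
    tls     on
    # The Logging Service authenticates HTTP requests via the APIKEY header
    Header  APIKEY <your-key>
```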
Refer to FluentBit's official website for the full list of parameters.
Depending on the origins of the logs, it may be necessary to adopt a different approach for the installation and configuration of FluentBit.
To provide convenient parsing and data labeling, we accept four different log sources, each of which requires a different setup and configuration.
You may need this method to collect and ship your Kubernetes application's logs. FluentBit offers a wide range of information on how to set up FluentBit on your Kubernetes cluster. However, you are welcome to try our Example Kubernetes Configuration.
If you have a set of applications on Docker, we recommend using our Example Docker configuration. However, you can find more information about Docker configuration on FluentBit's official website.
To set up FluentBit on a Linux system with systemd/journald, you need to install the appropriate package for your distribution. Instructions on doing so can be found on FluentBit's official website. You can find a sample configuration in our Example Section.
You can even send logs in a JSON format through our HTTP REST endpoint. Example:
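A hypothetical request might look like the following; the endpoint, key, and JSON body schema are placeholders and assumptions, so consult the REST API response for the actual values and fields:

```shell
# Hypothetical example; replace all <...> placeholders with your own values
curl -X POST "https://<your-log-server-endpoint>" \
  -H "APIKEY: <your-key>" \
  -H "Content-Type: application/json" \
  -d '{"tag": "mytag", "message": "application started"}'
```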
Note: Technically, you can send logs with FluentBit from any source as long as the communication is supported: TCP and HTTP (JSON). However, at the moment, we provide convenient parsing only for the above log sources.
A pipeline in the context of the IONOS Logging Service refers to an instance or configuration of the logging service that you create. Pipelines are created using the REST API provided by IONOS: send a request to the designated endpoint, for example https://logging.de-txl.ionos.com/pipelines (this is an example endpoint; the actual endpoint may vary).
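As a sketch, creating a pipeline with two log streams might look like the following. The request body schema (field names such as "source", "tag", and "protocol") and the authentication header are assumptions for illustration; refer to the official REST API reference for the exact fields:

```shell
# Hypothetical pipeline creation request; the body schema is illustrative only
curl -X POST "https://logging.de-txl.ionos.com/pipelines" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
        "properties": {
          "name": "my-pipeline",
          "logs": [
            { "source": "kubernetes", "tag": "k8s-app",    "protocol": "http" },
            { "source": "docker",     "tag": "docker-app", "protocol": "tcp" }
          ]
        }
      }'
```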
Within each pipeline instance, it is possible to define multiple log streams, where each stream functions as a separate log source. These log streams allow you to organize and manage different sources of logs within your logging system.
To differentiate the log sources and enable effective reporting, it is necessary to provide a unique tag for each log source within the pipeline instance. The tag serves as an identifier or label for the log source, allowing you to distinguish and track the logs from different sources easily.
At the conclusion of the pipeline setup, a unique endpoint (HTTP/TCP) is assigned to each pipeline, establishing a connection to an independent log server. This endpoint serves as the designated destination for logs generated by all the log sources within the pipeline. However, to ensure proper categorization and differentiation, each log source configured in the pipeline must use its designated tag. By adhering to this practice, the logs generated by each source can be accurately identified and traced, even when they are directed to the same endpoint.
Note: Based on your pricing model, a specific limit is tied to each pipeline, which limits the log rate you can send to the log server.