LogicMonitor Data Publisher Overview
Last updated - 04 September, 2025
LogicMonitor Data Publisher is an integrated service that extracts real-time DataSource metrics from the Collector and sends them to a receiver (a third-party destination), which receives and processes the metrics for further analytics. This lets you export collected data to external data stores or analytics tools.
LogicMonitor Data Publisher supports sending the collected data to data stores for the following purposes:
- Export metrics to create a data lake.
- Export data from LogicMonitor to analytics platforms such as Power BI and Tableau.
- Access data directly while bypassing API rate limits.
- Export metrics to populate a capacity data repository.
- Extract metrics using integration tools such as Kafka, Prometheus, Grafana, and S3.
Types of Data Publisher Receivers
LogicMonitor Data Publisher supports integration with Kafka and HTTPS receivers, which receive metrics as OTLP-formatted JSON strings.
- Kafka—The Collector publishes metrics to a Kafka topic (see the consumer sketch after this list). For more information, see LogicMonitor Data Publisher for Kafka Receiver.
- HTTPS—The Collector publishes metrics to an HTTPS endpoint. For more information, see LogicMonitor Data Publisher for HTTPS Receiver.
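The following is a minimal sketch of a consumer that reads the metrics the Collector publishes to a Kafka topic. It assumes the kafka-python client; the topic name, broker address, and consumer group are placeholders for values in your own environment.

```python
# Minimal sketch: consume OTLP-formatted JSON metrics from a Kafka topic.
# Topic, broker, and group names are placeholders for your own environment.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "lm-metrics",                          # hypothetical topic you created
    bootstrap_servers=["localhost:9092"],  # your Kafka broker(s)
    group_id="lm-data-publisher-demo",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for record in consumer:
    otlp_payload = record.value
    # Inspect or transform the OTLP-formatted payload here before forwarding
    # it to your data lake or analytics tool.
    print(otlp_payload)
```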
Considerations for Using LogicMonitor Data Publisher
Consider the following points when using LogicMonitor Data Publisher:
- The Collector publishes data to a receiver such as a Kafka topic or HTTPS endpoint.
- Customers must own and maintain the receiver infrastructure, such as the Kafka cluster or HTTPS endpoint.
- Customers must create the Kafka topic.
- Kafka and HTTPS configurations are defined in the agent.conf settings.
- Data is published to the receiver as an OTLP-formatted JSON string.
- Each published metric includes metadata such as the device, DataSource, and instance information.
- Customers must implement a receiver to consume the published data (see the sketch after this list).
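To illustrate the last point, the following is a minimal sketch of a receiver that accepts metrics POSTed by the Collector and hands them to downstream processing. It uses only the Python standard library; the port and handler name are placeholders, and in production the endpoint would be served over HTTPS (for example, behind a TLS-terminating proxy).

```python
# Minimal sketch: an HTTP endpoint that accepts metrics POSTed by the Collector.
# Port and class name are placeholders; serve over HTTPS in production.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class MetricsReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        payload = json.loads(body)  # OTLP-formatted JSON from the Collector
        # Forward the payload to your data store or analytics pipeline here.
        print(payload)
        self.send_response(200)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8443), MetricsReceiver).serve_forever()
```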