This project integrates PySpark with Prometheus for monitoring Spark Structured Streaming applications.
Note: this project focuses specifically on better metrics for Spark Structured Streaming. If you want other Spark metrics (executor memory, CPU, GC times, etc.) in Prometheus, refer to Spark's monitoring guide and its support for Prometheus via JMX (Java Management Extensions).
- Collects metrics from PySpark Streaming Queries
- Exposes metrics in Prometheus format
- Easy integration with existing PySpark applications
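To illustrate the second point, here is a rough sketch of what metrics look like in the Prometheus text exposition format, along with a minimal parser. The metric names below are illustrative assumptions, not necessarily the names this project emits.

```python
# A hypothetical sample of exposed metrics in Prometheus text exposition
# format; the metric names are illustrative assumptions only.
sample = (
    '# HELP spark_streaming_input_rows_per_second Input rate of the query\n'
    '# TYPE spark_streaming_input_rows_per_second gauge\n'
    'spark_streaming_input_rows_per_second{query="my_query"} 1250.0\n'
    'spark_streaming_processed_rows_per_second{query="my_query"} 1190.5\n'
)

def parse_exposition(text):
    """Parse Prometheus text format into a {series: value} dict,
    skipping comment lines (# HELP / # TYPE) and blank lines."""
    series = {}
    for line in text.splitlines():
        if not line or line.startswith("#"):
            continue
        name_and_labels, value = line.rsplit(" ", 1)
        series[name_and_labels] = float(value)
    return series

print(parse_exposition(sample))
```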
To install the required dependencies, run:
```shell
pip install -r requirements.txt
```
Import the necessary modules in your PySpark application:
```python
from pyspark_prometheus import with_prometheus_metrics
```
Initialize the Prometheus metrics:
```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local").appName("MySparkApp").getOrCreate()
spark = with_prometheus_metrics(spark, 'http://localhost:9091')
```
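Port 9091 is the default port of a Prometheus Pushgateway, so the URL above presumably points at one. For context, here is a minimal, standard-library-only sketch of how pushing metrics to a Pushgateway works: the gateway accepts a `PUT` (or `POST`) at `/metrics/job/<job_name>` with a body in the text exposition format. The function name and metric below are hypothetical, not part of this project's API.

```python
import urllib.request

def push_metrics(gateway_url, job, body):
    """Push metrics in Prometheus text exposition format to a Pushgateway.

    The Pushgateway accepts PUT/POST at /metrics/job/<job_name>;
    the body should end with a newline. Returns the HTTP status code.
    """
    req = urllib.request.Request(
        f"{gateway_url}/metrics/job/{job}",
        data=body.encode("utf-8"),
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example (requires a Pushgateway listening at localhost:9091):
# push_metrics("http://localhost:9091", "my_spark_app",
#              'my_metric{label="x"} 1.0\n')
```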
Start your PySpark job as usual. Metrics will be collected and exposed automatically.
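Under the hood, tools like this typically read each query's progress events, which PySpark exposes as dicts (e.g. via `query.lastProgress`) containing fields such as `inputRowsPerSecond` and `processedRowsPerSecond`. A rough, hypothetical sketch of converting such a progress dict into Prometheus text format (this is not this project's actual code, and the metric names are assumptions):

```python
def progress_to_exposition(progress):
    """Convert a StreamingQueryProgress-like dict (as returned by
    query.lastProgress in PySpark) into Prometheus text format.
    Metric names here are illustrative, not the project's actual names."""
    name = progress.get("name") or "unnamed"
    lines = []
    for key, metric in [
        ("inputRowsPerSecond", "spark_streaming_input_rows_per_second"),
        ("processedRowsPerSecond", "spark_streaming_processed_rows_per_second"),
    ]:
        if key in progress:
            lines.append(f'{metric}{{query="{name}"}} {progress[key]}')
    return "\n".join(lines) + "\n"

# A progress dict of the shape PySpark produces, with made-up values:
example = {"name": "my_query",
           "inputRowsPerSecond": 1250.0,
           "processedRowsPerSecond": 1190.5}
print(progress_to_exposition(example))
```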
Contributions are welcome! Please submit a pull request or open an issue to discuss your ideas.
This project is licensed under the MIT License. See the LICENSE file for details.
For any questions or support, please open an issue in the repository.