
Spark Integration Metrics

The OpenLineage integration with Spark not only utilizes the Java client's metrics but also introduces its own set of metrics specific to Spark operations. Below is a list of these metrics.
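The metric types used below (Counter, Timer) follow the Micrometer conventions of the Java client's metrics facility. The sketch below illustrates, in generic Micrometer terms, how a counter such as openlineage.spark.event.sql.start could be registered and incremented when a SparkListenerSQLExecutionStart event arrives; the class, method names, and wiring here are illustrative assumptions, not the integration's actual internals.

```java
import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

// Illustrative sketch only: shows how a Counter metric such as
// "openlineage.spark.event.sql.start" can be registered and incremented
// with Micrometer. The real integration wires its own listener and registry.
public class SqlStartMetricExample {
  private final Counter sqlStartCounter;

  public SqlStartMetricExample(MeterRegistry registry) {
    this.sqlStartCounter = registry.counter("openlineage.spark.event.sql.start");
  }

  // Hypothetical hook: called whenever a SparkListenerSQLExecutionStart event is received.
  public void onSqlExecutionStart() {
    sqlStartCounter.increment();
  }

  public static void main(String[] args) {
    MeterRegistry registry = new SimpleMeterRegistry();
    SqlStartMetricExample metrics = new SqlStartMetricExample(registry);
    metrics.onSqlExecutionStart();
    // Prints 1.0: one SQL execution start event has been counted.
    System.out.println(registry.counter("openlineage.spark.event.sql.start").count());
  }
}
```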

Metrics Overview

The following table lists the metrics added by the Spark integration, along with their definitions and types:

| Metric | Definition | Type |
|--------|------------|------|
| openlineage.spark.event.sql.start | Number of SparkListenerSQLExecutionStart events received | Counter |
| openlineage.spark.event.sql.end | Number of SparkListenerSQLExecutionEnd events received | Counter |
| openlineage.spark.event.job.start | Number of SparkListenerJobStart events received | Counter |
| openlineage.spark.event.job.end | Number of SparkListenerJobEnd events received | Counter |
| openlineage.spark.event.app.start | Number of SparkListenerApplicationStart events received | Counter |
| openlineage.spark.event.app.end | Number of SparkListenerApplicationEnd events received | Counter |
| openlineage.spark.event.app.start.memoryusage | Percentage of used memory at the start of the application | Counter |
| openlineage.spark.event.app.end.memoryusage | Percentage of used memory at the end of the application | Counter |
| openlineage.spark.unknownFacet.time | Time spent building the UnknownEntryRunFacet | Timer |
| openlineage.spark.dataset.input.execution.time | Time spent constructing input datasets for execution | Timer |
| openlineage.spark.facets.job.execution.time | Time spent building job-specific facets | Timer |
| openlineage.spark.facets.run.execution.time | Time spent constructing run-specific facets | Timer |
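
Timer metrics such as openlineage.spark.facets.run.execution.time record both how many times a piece of work ran and how long it took in total. The snippet below shows the general Micrometer pattern for such a timer as a minimal sketch; the timed body is a placeholder, not the integration's actual facet-building code.

```java
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Timer;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;
import java.util.concurrent.TimeUnit;

public class RunFacetTimerExample {
  public static void main(String[] args) {
    MeterRegistry registry = new SimpleMeterRegistry();
    Timer runFacetTimer = registry.timer("openlineage.spark.facets.run.execution.time");

    // Placeholder for the work being measured (e.g. building run-specific facets).
    runFacetTimer.record(() -> {
      try {
        Thread.sleep(5);
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
      }
    });

    // A Timer exposes both the invocation count and the accumulated duration.
    System.out.printf("count=%d, total=%.1f ms%n",
        runFacetTimer.count(),
        runFacetTimer.totalTime(TimeUnit.MILLISECONDS));
  }
}
```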