Version: 1.25.0

OpenLineage Integrations

Capability Matrix

caution

This matrix is not yet complete.

The matrix below shows the relationship between each facet and the various mechanisms OpenLineage uses to gather metadata. Not all mechanisms collect the data needed to fill in every facet, and some facets are specific to a single integration. A sample event showing how several of these facets look in practice follows the matrix.
✔️: The mechanism implements this facet.
✖️: The mechanism does not implement this facet.
An empty cell means it is not yet documented whether the mechanism implements this facet.

| Mechanism | Integration | Metadata Gathered | InputDatasetFacet | OutputDatasetFacet | SqlJobFacet | SchemaDatasetFacet | DataSourceDatasetFacet | DataQualityMetricsInputDatasetFacet | DataQualityAssertionsDatasetFacet | SourceCodeJobFacet | ExternalQueryRunFacet | DocumentationDatasetFacet | SourceCodeLocationJobFacet | DocumentationJobFacet | ParentRunFacet |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SnowflakeOperator* | Airflow Extractor | Lineage, Job duration | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✖️ | ✖️ |  |  |  |  |  |  |
| BigQueryOperator** | Airflow Extractor | Lineage, Schema details, Job duration | ✔️ | ✔️ | ✔️ |  |  |  |  |  |  |  |  |  |  |
| PostgresOperator* | Airflow Extractor | Lineage, Job duration | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |  |  |  |  |  |  |  |  |
| SqlCheckOperators | Airflow Extractor | Lineage, Data quality assertions | ✔️ | ✖️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |  |  |  |  |  |  |
| dbt | dbt Project Files | Lineage, Row count, Byte count | ✔️ |  |  |  |  |  |  |  |  |  |  |  |  |
| Great Expectations | Action | Data quality assertions | ✔️ | ✔️ | ✔️ |  |  |  |  |  |  |  |  |  |  |
| Spark | SparkListener | Schema, Row count, Column lineage | ✔️ |  |  |  |  |  |  |  |  |  |  |  |  |
| Snowflake*** | Access History | Lineage |  |  |  |  |  |  |  |  |  |  |  |  |  |

* Uses the Rust SQL parser
** Uses the BigQuery API
*** Uses Snowflake query logs
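
To make these facet names concrete, here is a hedged sketch of the kind of run event an integration might emit, written as a plain JSON payload posted over HTTP to an OpenLineage-compatible backend. The namespace, job, dataset, and endpoint values are invented for illustration, and the sketch omits the `_producer` and `_schemaURL` bookkeeping fields that the spec attaches to each facet.

```python
# Hypothetical example: a COMPLETE run event carrying a few of the facets named
# in the matrix above, posted to a local OpenLineage-compatible backend.
from datetime import datetime, timezone
from uuid import uuid4

import requests

event = {
    "eventType": "COMPLETE",
    "eventTime": datetime.now(timezone.utc).isoformat(),
    "producer": "https://example.com/my-pipeline",  # hypothetical producer URI
    "run": {
        "runId": str(uuid4()),
        "facets": {
            # ParentRunFacet: links this task run to the pipeline run that spawned it
            "parent": {
                "run": {"runId": str(uuid4())},
                "job": {"namespace": "my-namespace", "name": "daily_dag"},
            }
        },
    },
    "job": {
        "namespace": "my-namespace",
        "name": "daily_dag.load_orders",
        "facets": {
            # SqlJobFacet: the SQL the task executed
            "sql": {"query": "INSERT INTO orders SELECT * FROM staging_orders"},
        },
    },
    "inputs": [
        {
            "namespace": "postgres://db.example.com:5432",
            "name": "public.staging_orders",
            "facets": {
                # SchemaDatasetFacet: column names and types
                "schema": {
                    "fields": [
                        {"name": "id", "type": "INTEGER"},
                        {"name": "amount", "type": "DECIMAL"},
                    ]
                },
                # DataSourceDatasetFacet: where the dataset lives
                "dataSource": {
                    "name": "postgres://db.example.com:5432",
                    "uri": "postgres://db.example.com:5432",
                },
            },
        }
    ],
    "outputs": [
        {"namespace": "postgres://db.example.com:5432", "name": "public.orders"}
    ],
}

# Send the event; adjust the URL for your backend (a local Marquez instance is assumed here).
requests.post("http://localhost:5000/api/v1/lineage", json=event)
```

The `sql`, `schema`, `dataSource`, and `parent` keys correspond to SqlJobFacet, SchemaDatasetFacet, DataSourceDatasetFacet, and ParentRunFacet in the matrix.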

Compatibility matrix

This matrix shows the data sources known to work with each integration, along with the minimum version of the target platform or framework that each integration requires.

| Platform | Version | Data Sources |
|---|---|---|
| Apache Airflow | 1.10+, 2.0+ | PostgreSQL, MySQL, Snowflake, Amazon Athena, Amazon Redshift, Amazon SageMaker, Amazon S3 Copy and Transform, Google BigQuery, Google Cloud Storage, Great Expectations, SFTP, FTP |
| Apache Spark | 2.4+ | JDBC, HDFS, Google Cloud Storage, Google BigQuery, Amazon S3, Azure Blob Storage, Azure Data Lake Gen2, Azure Synapse |
| dbt | 0.20+ | Snowflake, Google BigQuery |

Integration strategies

info

This section could use some more detail! You're welcome to contribute using the Edit link at the bottom.

Integrating with pipelines

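As a rough sketch of the pipeline-side strategy, an integration hooks into the framework's task lifecycle and emits a START event when a task begins and a COMPLETE (or FAIL) event when it ends, sharing one run ID across the pair. The decorator below is hypothetical and standalone rather than part of any existing integration (real integrations use framework hooks such as Airflow's plugin/listener machinery or Spark's SparkListener), and it assumes the same local endpoint as the earlier example.

```python
# Hypothetical sketch: wrap a task function so that paired START and
# COMPLETE/FAIL OpenLineage events are emitted around its execution.
import functools
from datetime import datetime, timezone
from uuid import uuid4

import requests

OPENLINEAGE_URL = "http://localhost:5000/api/v1/lineage"  # assumed local backend
NAMESPACE = "my-namespace"                                # hypothetical namespace


def emit(event_type: str, run_id: str, job_name: str) -> None:
    """Post a minimal OpenLineage run event; facets are omitted for brevity."""
    requests.post(OPENLINEAGE_URL, json={
        "eventType": event_type,
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "producer": "https://example.com/my-pipeline",
        "run": {"runId": run_id},
        "job": {"namespace": NAMESPACE, "name": job_name},
        "inputs": [],
        "outputs": [],
    })


def lineage_tracked(job_name: str):
    """Decorator that brackets a task with START and COMPLETE/FAIL events."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            run_id = str(uuid4())  # one runId shared by both events
            emit("START", run_id, job_name)
            try:
                result = func(*args, **kwargs)
            except Exception:
                emit("FAIL", run_id, job_name)
                raise
            emit("COMPLETE", run_id, job_name)
            return result
        return wrapper
    return decorator


@lineage_tracked("daily_dag.load_orders")
def load_orders():
    ...  # the task body goes here
```

Sharing the runId between the two events is what lets the backend stitch a task's start and end into a single run.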

Integrating with data sources
