Add BigQuery check support #960 @denimalpaca: Adds logic and support for proper dynamic class inheritance for BigQuery-style operators. (BigQuery's extractor needed additional logic to support the forthcoming BigQueryColumnCheckOperator and BigQueryTableCheckOperator.)
Add RUNNING EventType in spec and Python client #972 @mzareba382: Introduces a RUNNING event state in the OpenLineage spec to indicate a running task and adds a RUNNING event type in the Python API (a minimal usage sketch follows this list).
Use databases & schemas in SQL Extractors #974 @JDarDagran: Allows the Airflow integration to differentiate between databases and schemas. (There was no notion of databases or schemas when querying and parsing results from information_schema tables.)
Implement Event forwarding feature via HTTP protocol #995 @howardyoo: Adds HttpLineageStream to forward a given OpenLineage event to any HTTP endpoint.
Introduce SymlinksDatasetFacet to spec #936 @pawel-big-lebowski: Creates a new facet, the SymlinksDatasetFacet, to support storing alternative dataset names (an illustrative sketch of the facet's shape follows this list).
Add Azure Cosmos Handler to Spark integration #983 @hmoazam: Defines a new interface, the RelationHandler, to support Spark data sources that do not have TableCatalog, Identifier, or TableProperties set, as is the case with the Azure Cosmos DB Spark connector.
Support OL Datasets in manual lineage inputs/outputs #1015 @conorbev: Allows Airflow users to create OpenLineage Dataset classes directly in DAGs with no conversion necessary (see the sketch after this list). (Manual lineage definition previously required users to create an airflow.lineage.entities.Table, which was then converted to an OpenLineage Dataset.)
Create ownership facets #996 @julienledem: Adds an ownership facet to both Dataset and Job in the OpenLineage spec to capture ownership of jobs and datasets (a sketch of the facet's shape follows this list).
Use RUNNING EventType in Flink integration for currently running jobs #985 @mzareba382: Makes use of the new RUNNING event type in the Flink integration, changing the event type sent for running Flink jobs from OTHER to RUNNING.
Convert task objects to JSON-encodable objects when creating custom Airflow version facets #1018 @fm100: Implements a to_json_encodable function in the Airflow integration to make task objects JSON-encodable.
Add support for custom SQL queries in v3 Great Expectations API #1025 @collado-mike: Fixes support for custom SQL statements in the Great Expectations provider. (Support for the Great Expectations custom SQL datasource had not been applied to the V3 checkpoints API.)
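
A few illustrative sketches for the entries flagged above follow. For the new RUNNING state added in #972, here is a minimal sketch of emitting a RUNNING event with the OpenLineage Python client; the endpoint URL, namespace, and job name are placeholder assumptions, not values from the PR.

```python
# Minimal sketch (not from the PR itself): emitting a RUNNING event with the
# OpenLineage Python client. The URL, namespace, and job name are placeholders.
from datetime import datetime, timezone
from uuid import uuid4

from openlineage.client import OpenLineageClient
from openlineage.client.run import Job, Run, RunEvent, RunState

client = OpenLineageClient(url="http://localhost:5000")  # e.g. a local Marquez API

event = RunEvent(
    eventType=RunState.RUNNING,  # the new state introduced by #972
    eventTime=datetime.now(timezone.utc).isoformat(),
    run=Run(runId=str(uuid4())),
    job=Job(namespace="example-namespace", name="example-job"),
    producer="https://github.com/OpenLineage/OpenLineage/tree/main/client/python",
)
client.emit(event)
```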
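
For the SymlinksDatasetFacet from #936, the sketch below shows, as a plain Python dict, the rough shape a symlinks facet takes on a dataset. The Hive metastore namespace, table name, and type are invented for illustration; the spec remains the authoritative reference for the schema.

```python
# Illustrative only: the approximate shape of a symlinks facet on a dataset,
# written as a Python dict. The Hive metastore namespace and table name are
# invented; consult the spec for the authoritative schema.
symlinks_facet = {
    "symlinks": {
        "identifiers": [
            {
                "namespace": "hive://metastore.example.com:9083",
                "name": "default.orders",
                "type": "TABLE",
            }
        ]
    }
}
```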
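
For #1015, here is a hedged sketch of passing OpenLineage Dataset objects straight to an Airflow task's inlets and outlets; the dag_id, connection namespaces, table names, and bash command are all made up for the example.

```python
# Sketch under assumptions: an Airflow 2.x DAG that passes OpenLineage Dataset
# objects directly to a task's inlets/outlets. The dag_id, namespaces, table
# names, and bash command are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from openlineage.client.run import Dataset

source = Dataset(namespace="postgres://db.example.com:5432", name="public.orders")
target = Dataset(namespace="postgres://db.example.com:5432", name="public.orders_daily")

with DAG(
    dag_id="manual_lineage_example",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
) as dag:
    aggregate = BashOperator(
        task_id="aggregate_orders",
        bash_command="echo 'aggregate orders'",
        inlets=[source],   # no airflow.lineage.entities.Table conversion needed
        outlets=[target],
    )
```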
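
For the ownership facets from #996, the dict below sketches the rough shape described by the entry. The owner name and type values are illustrative assumptions; the spec defines the exact schema for both the dataset and job variants.

```python
# Illustrative only: the approximate shape of an ownership facet, written as a
# Python dict. The owner name and type are invented; the spec defines the
# exact schema for both the dataset and job variants.
ownership_facet = {
    "ownership": {
        "owners": [
            {"name": "user:jane.doe@example.com", "type": "MAINTAINER"},
        ]
    }
}
```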