* **Flink: experimental version of the Flink native lineage listener** `#3099` @pawel-big-lebowski
  New Flink listener to extract lineage through native Flink interfaces. Supports Flink SQL. Requires Flink 2.0.
* **dbt: add support for consuming dbt structured logs for `test` and `build` commands** `#3362` @MassyB
  The dbt integration's structured-logs option now handles the `test` and `build` commands as well.
* **Spark: allow attaching custom facets to `RDDExecutionContext` events** `#3379` @ssanthanam185
  Events emitted from `RDDExecutionContext` now include custom facets that are loaded as part of `InternalHandlerFactory`.
* **spec: add `DatasetTypeDatasetFacet`** `#3390` @ssanthanam185
  Adds `DatasetTypeDatasetFacet` to the OpenLineage spec.
* **Python: allow adding `additionalProperties` to Python facets** `#3391` @JDarDagran
  Adds a `with_additional_properties` method that creates a modified instance of a facet with additional properties.
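To illustrate the idea behind the new method, here is a minimal self-contained sketch of the copy-with-extra-properties pattern. `ExampleFacet` is a hypothetical stand-in, not the library's actual facet class or implementation:

```python
from dataclasses import dataclass, field, replace

@dataclass(frozen=True)
class ExampleFacet:
    # hypothetical stand-in for an OpenLineage facet class
    producer: str = "https://example.com/producer"
    additional: dict = field(default_factory=dict)

    def with_additional_properties(self, **kwargs):
        # return a modified copy; the original facet instance is untouched
        return replace(self, additional={**self.additional, **kwargs})

facet = ExampleFacet()
enriched = facet.with_additional_properties(team="data-eng")
print(enriched.additional)  # {'team': 'data-eng'}
```

The original instance stays unchanged, so facets can be safely shared and enriched per event.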
* **Spark: `SerializedFromObject` events are no longer filtered for non-Delta plans** `#3403` @ssanthanam185
  These events should not be filtered outside the Databricks/Delta ecosystem.
* **Spark: fixed ClassLoader handling for `OpenLineageExtensionProvider`** `#3368` @ddebowczyk92
  Fixes a `ClassNotFoundException` caused by class loader conflicts when the openlineage-spark integration is used alongside a Spark connector that implements the spark-extension-interfaces.
* **SQL: add minimal support for Snowflake `LATERAL`** `#3368` @cisenbe
  The SQL parser no longer errors on Snowflake's `LATERAL` keyword.
* **dbt: handle errors in `parse_assertions`** `#3311` @dsaxton-1password
  The dbt integration no longer fails when inspecting tests on seeds.
* **Spark: fix infinite loop in RDD flattening and optimize performance** `#3379` @ssanthanam185
  The Spark integration now correctly handles complex jobs with cycles and nested RDD trees.
* **Python: `FileTransport` now correctly attaches the `.json` file extension** `#3404` @kacpermuda
  Previously, the `.json` file extension was not properly added when `append=False`.
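For context, a file transport is typically configured like this in `openlineage.yml`; the key names (`log_file_path`, `append`) are assumed from the Python client's file transport configuration:

```yaml
transport:
  type: file
  # with append: false, each event is written to its own file,
  # and the `.json` extension is now appended correctly
  log_file_path: /tmp/openlineage/events
  append: false
```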