1.14.0 - 2024-05-09
Added
- Common/dbt: add DREMIO to supported dbt profile types #2674 @surisimran
 Adds support for dbt-dremio, resolving #2668.
- Flink: support Protobuf format for sources and sinks #2482 @pawel-big-lebowski
 Adds schema extraction from Protobuf classes, including support for nested object types, array type, map type, oneOf and any. A sketch of the extracted schema shape follows this list.
- Java: add facet conversion test #2663 @julienledem
 Adds a simple test that shows how to deserialize a facet in the server model.
- Spark: job type facet to distinguish RDD jobs from Spark SQL jobs #2652 @pawel-big-lebowski
 Sets the jobType property of JobTypeJobFacet to either SQL_JOB or RDD_JOB.
- Spark: add Glue symlink if reading from Glue catalog table #2646 @mobuchowski
 The dataset symlink now points to the Glue catalog table name if the Glue catalog table is used.
- Spark: add spark_jobDetails facet #2662 @dolfinus
 Adds a SparkJobDetailsFacet, capturing information about Spark application jobs, e.g. jobId, jobDescription, jobGroup, jobCallSite. This allows an OpenLineage RunEvent to be matched with a specific Spark job in the Spark UI. A combined sketch of this and the two preceding Spark facets follows this list.
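The three Spark entries above all surface as additional facets on emitted events. Below is a combined, minimal sketch (a Python dict mirroring a RunEvent fragment) of where each facet lands. The facet keys and field names come from the entries above and the OpenLineage facet spec; every concrete value, including the dataset names and the Glue namespace, is a made-up placeholder.

```python
# Illustrative RunEvent fragment from the Spark integration; all values are
# placeholders, only the facet keys and field names are meaningful here.
run_event_fragment = {
    "job": {
        "namespace": "spark_ns",
        "name": "my_app.collect",
        "facets": {
            # jobType now distinguishes Spark SQL jobs from RDD jobs
            "jobType": {
                "processingType": "BATCH",
                "integration": "SPARK",
                "jobType": "SQL_JOB",  # or "RDD_JOB"
            },
        },
    },
    "run": {
        "runId": "0190e5a4-0000-0000-0000-000000000000",
        "facets": {
            # spark_jobDetails ties the event to a job visible in the Spark UI
            "spark_jobDetails": {
                "jobId": 42,
                "jobDescription": "run my_app query",
                "jobGroup": "my_group",
                "jobCallSite": "collect at MyApp.scala:42",
            },
        },
    },
    "inputs": [
        {
            "namespace": "s3://my-bucket",
            "name": "warehouse/orders",
            "facets": {
                # symlink to the Glue catalog table when reading through Glue
                "symlinks": {
                    "identifiers": [
                        {
                            "namespace": "glue:placeholder-catalog",  # made up
                            "name": "my_database.orders",
                            "type": "TABLE",
                        },
                    ],
                },
            },
        },
    ],
}
```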
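For the Flink Protobuf entry above, here is a minimal sketch of the nested schema shape that extraction from a Protobuf class could yield, written as a Python dict mirroring a schema facet payload. The message and field names, and the exact type strings, are illustrative assumptions rather than output copied from the integration.

```python
# Illustrative only: roughly the shape of a schema facet extracted from a
# Protobuf class with nested, repeated and map fields; all names are made up.
protobuf_schema_facet = {
    "fields": [
        {"name": "order_id", "type": "string"},
        {
            "name": "customer",  # nested object type
            "type": "struct",    # type label is an assumption
            "fields": [
                {"name": "name", "type": "string"},
                {"name": "country", "type": "string"},
            ],
        },
        {"name": "items", "type": "array"},      # repeated field
        {"name": "attributes", "type": "map"},   # map field
    ],
}
```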
Removed
- Airflow: drop old ParentRunFacet key #2660 @dolfinus
 Changes the integration to use the parent key for the ParentRunFacet, dropping the outdated parentRun.
- Spark: drop SparkVersionFacet #2659 @dolfinus
 Drops the SparkVersion facet, deprecated since 1.2.0 and planned for removal since 1.4.0.
- Python: allow relative paths in URI formats for Python facets #2679 @JDarDagran
 Removes a URI validator that checked if scheme and netloc were present, allowing relative paths in URI formats for Python facets.
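As an illustration of the relaxed validation, the snippet below constructs a Python client facet whose URI-typed field holds a relative path. DataSourceDatasetFacet is used here only as a convenient example of a facet with a URI field (the entry does not list which facets were affected), and the path is made up.

```python
from openlineage.client.facet import DataSourceDatasetFacet

# Previously, a URI validator in the Python client rejected values without a
# scheme and netloc; a relative path like this is now accepted as-is.
facet = DataSourceDatasetFacet(
    name="local_files",           # illustrative
    uri="data/input/orders.csv",  # relative path, no scheme or netloc
)
```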
Changed
- GreatExpectations: rename ParentRunFacet key #2661 @dolfinus
 The OpenLineage spec defines the ParentRunFacet with the property name parent, but the Great Expectations integration created lineage events with parentRun. This renames the ParentRunFacet key from parentRun to parent. For backwards compatibility, the old key name is also kept.
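A minimal sketch of the run facets emitted after this change, written as a Python dict with made-up identifiers: the spec-defined parent key carries the facet, and the old parentRun key is duplicated for backwards compatibility.

```python
# Illustrative run.facets payload after the rename; all identifiers are made up.
parent_facet = {
    "run": {"runId": "b7a1d3e0-0000-0000-0000-000000000000"},
    "job": {"namespace": "great_expectations", "name": "my_checkpoint"},
}
run_facets = {
    "parent": parent_facet,     # key defined by the OpenLineage spec
    "parentRun": parent_facet,  # legacy key kept for backwards compatibility
}
```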
Fixed
- dbt: support a less ambiguous logic to generate job names #2658 @blacklight
 Includes the profile and models in the dbt job name to make it less ambiguous.
- Spark: update to use org.apache.commons.lang3 instead of org.apache.commons.lang #2676 @harels
 Updates Apache Commons Lang to the latest version. We were mixing two versions, and the old one was not present in many places.