
Commit 2ebf6cf

Updating Databricks Init Script doc links
1 parent c8516a8 commit 2ebf6cf

File tree: 1 file changed (+3 -2 lines)

deploy-base.md

Lines changed: 3 additions & 2 deletions
@@ -156,6 +156,7 @@ Follow the instructions below and refer to the [OpenLineage Databricks Install I
 > If you do not have line feed endings, your cluster will fail to start due to an init script error.
 
 3. Upload the init script and jar to dbfs using the [Databricks CLI](https://docs.microsoft.com/en-us/azure/databricks/dev-tools/cli/)
+   * Alternatively, use the [databricks workspace import --format SOURCE](https://github.com/databricks/cli/blob/main/docs/commands.md#databricks-workspace-import---import-a-workspace-object) command to upload the init script as a workspace file.
 
 ```text
 dbfs mkdirs dbfs:/databricks/openlineage
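
If you take the workspace-file route mentioned in the added line above, the command could look roughly like the sketch below; the target path and the `--file` flag are assumptions and may differ between Databricks CLI versions:

```text
# Illustrative only: upload the init script as a workspace file
# (verify the exact flags against your Databricks CLI version)
databricks workspace import /Shared/openlineage/open-lineage-init-script.sh \
  --file ./open-lineage-init-script.sh --format SOURCE
```
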
@@ -181,7 +182,7 @@ Follow the instructions below and refer to the [OpenLineage Databricks Install I
 
 After configuring the secret storage, the API key for OpenLineage can be configured in the Spark config, as in the following example:
 `spark.openlineage.url.param.code {{secrets/secret_scope/Ol-Output-Api-Key}}`
-1. Add a reference to the uploaded init script `dbfs:/databricks/openlineage/open-lineage-init-script.sh` on the [Init script section](https://docs.microsoft.com/en-us/azure/databricks/clusters/init-scripts#configure-a-cluster-scoped-init-script-using-the-ui) of the Advanced Options.
+1. Add a reference to the uploaded init script `dbfs:/databricks/openlineage/open-lineage-init-script.sh` on the [Init script section](https://learn.microsoft.com/en-us/azure/databricks/init-scripts/cluster-scoped#configure-a-cluster-scoped-init-script-using-the-ui) of the Advanced Options.
 
 5. At this point, you can run a Databricks notebook on an "all-purpose cluster" in your configured workspace and observe lineage in Microsoft Purview once the Databricks notebook has finished running all cells.
 
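
For reference, the same cluster-scoped init script reference can also be supplied when creating the cluster through the Clusters API rather than the UI; a sketch of the `init_scripts` fragment, assuming the DBFS path used above:

```text
"init_scripts": [
  { "dbfs": { "destination": "dbfs:/databricks/openlineage/open-lineage-init-script.sh" } }
]
```
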
@@ -204,6 +205,6 @@ To support Databricks Jobs, you must add the service principal to your Databrick
 
 ### <a id="global-init"/>Global Init Scripts
 
-You can also configure the OpenLineage listener to run globally, so that any cluster which is created automatically runs the listener. To do this, you can utilize a [global init script](https://docs.microsoft.com/en-us/azure/databricks/clusters/init-scripts#global-init-scripts).
+You can also configure the OpenLineage listener to run globally, so that any cluster which is created automatically runs the listener. To do this, you can utilize a [global init script](https://learn.microsoft.com/en-us/azure/databricks/init-scripts/global).
 
 **Note**: Global initialization cannot currently use values from Azure Databricks KeyVault integration mentioned above. If using global initialization scripts, this key would need to be retrieved in the notebooks themselves, or hardcoded into the global init script.
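
A minimal sketch of the in-notebook retrieval the note describes, reusing the secret scope and key names from the earlier Spark config example (a Python notebook cell; `dbutils.secrets.get` is the standard Databricks secrets utility):

```python
# Sketch: fetch the OpenLineage API key at runtime, since a global init script
# cannot read the Key Vault-backed secret scope directly.
# Scope and key names ("secret_scope", "Ol-Output-Api-Key") follow the example above.
api_key = dbutils.secrets.get(scope="secret_scope", key="Ol-Output-Api-Key")
```
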
