Run .ipynb on Databricks without the import UI

Is there a way to run (or convert) .ipynb files on a Databricks cluster without using Databricks' import UI? Basically, I want to be able to develop in Jupyter but also run the file on Databricks, where it's pulled through Git.



Solution 1:[1]

It's possible to import Jupyter notebooks into a Databricks workspace as Databricks notebooks and then execute them. Besides the import UI, you can use:

- the Workspace REST API (`POST /api/2.0/workspace/import` with `format=JUPYTER`), or
- the Databricks CLI (`databricks workspace import` with the `JUPYTER` format).

P.S. Unfortunately, you can't open an .ipynb file just by committing it into a Repo; it will be treated as plain JSON. So you need to import it, which converts it into a Databricks notebook.
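As a sketch of the API route, the snippet below builds the request body for the Workspace import endpoint (`POST /api/2.0/workspace/import`), which takes the notebook content base64-encoded and `format=JUPYTER` so Databricks converts it on import. The inline notebook JSON and the target path are made-up examples; a real script would read your .ipynb from disk and send the payload with an authenticated HTTP client.

```python
import base64
import json

# Minimal .ipynb content used as a stand-in; in practice you would read
# the file from disk, e.g. open("my-notebook.ipynb", "rb").read()
notebook_json = json.dumps({
    "cells": [{
        "cell_type": "code",
        "source": ["print('hello')"],
        "metadata": {},
        "outputs": [],
        "execution_count": None,
    }],
    "metadata": {},
    "nbformat": 4,
    "nbformat_minor": 5,
})

# Request body for POST /api/2.0/workspace/import.
# format=JUPYTER tells Databricks to convert the .ipynb into a
# Databricks notebook rather than storing it as a raw file.
payload = {
    "path": "/Users/someone@example.com/my-notebook",  # target path (example)
    "format": "JUPYTER",
    "language": "PYTHON",
    "overwrite": True,
    "content": base64.b64encode(notebook_json.encode()).decode(),
}

print(json.dumps(payload)[:60], "...")
```

The CLI equivalent is roughly `databricks workspace import --format JUPYTER --language PYTHON <source> <target>`, which wraps this same endpoint.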

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Alex Ott