Create a notebook inside another notebook in Databricks dynamically using Python

I am trying to create a notebook inside another notebook. The created notebook should contain both Python and SQL code (using the `%sql` and `%python` magic commands). I need to run the created notebook from the parent notebook once it is created. Can anyone suggest a good way to do this?

I found dbutils.notebook.run(), which runs an already existing notebook, but I am looking for a way to create a notebook first and run it later. Any suggestion is appreciated!



Solution 1:[1]

You can use the import command of the Databricks Workspace REST API.

Something like this (put the notebook source into the content value):

import requests
import os
import json
import base64

ctx = json.loads(dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson())
host_name = ctx['extraContext']['api_url']
host_token = ctx['extraContext']['api_token']
notebook_path = ctx['extraContext']['notebook_path']
new_path = os.path.join(os.path.dirname(notebook_path), 'New name')

content = "some code"

data = {
  "content": base64.b64encode(content.encode("utf-8")).decode('ascii'),
  "path": new_path,
  "language": "PYTHON",
  "overwrite": True,
  "format": "SOURCE"
}

response = requests.post(
    # Note: the f prefix is required here, otherwise the URL is sent literally
    f'{host_name}/api/2.0/workspace/import',
    headers={'Authorization': f'Bearer {host_token}'},
    json=data
).json()
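Since the question asks for a notebook mixing Python and SQL cells, it helps to know what the SOURCE format expects: Databricks stores a Python notebook as a plain `.py` file that starts with a `# Databricks notebook source` header, separates cells with `# COMMAND ----------`, and prefixes non-Python cells with `# MAGIC`. A minimal sketch of building such a content string (the SQL statement is just a placeholder):

```python
import base64

# A mixed Python/SQL notebook body in Databricks SOURCE format.
# The first line marks the file as a notebook; "# COMMAND ----------"
# separates cells; "# MAGIC %sql" turns the second cell into a SQL cell.
content = """# Databricks notebook source
print("hello from the generated notebook")

# COMMAND ----------

# MAGIC %sql
# MAGIC SELECT 1 AS example_col
"""

# Base64-encode it exactly as the import API expects in the "content" field
encoded = base64.b64encode(content.encode("utf-8")).decode("ascii")
```

After the import call succeeds, the new notebook can be executed from the parent with something like `dbutils.notebook.run(new_path, 600)` (path and timeout in seconds), as mentioned in the question.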

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
