configure spark with delta: undefined function in Python

Recently I wanted to use Delta Lake, following its documentation here:

https://docs.delta.io/latest/quick-start.html#set-up-apache-spark-with-delta-lake&language-python

I installed Delta Lake and built it, then installed PySpark + Spark 3.2.1 (which should match the delta-1.1.0 version). But when I tried their example in IntelliJ, as in the screenshot below:


IntelliJ cannot find the function the example uses, "configure_spark_with_delta_pip".

I don't understand why it isn't defined, since I used the same pyspark import as the example.

Do you have any idea what the problem is and how I can solve it? By the way, does anyone have a good first tutorial that works correctly? (To be clear, I'm a beginner with Spark and Python; I've never used them before.)

Thanks a lot and best regards.


The delta import in the example refers to:



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
