Write DataFrame to MySQL table using PySpark

I am attempting to insert records into a MySQL table. The table has two columns, id and name.

I am doing the following in a PySpark shell:

name = 'tester_1'
id = '103'  
import pandas as pd
l = [id,name]

df = pd.DataFrame([l])

df.write.format('jdbc').options(
      url='jdbc:mysql://localhost/database_name',
      driver='com.mysql.jdbc.Driver',
      dbtable='DestinationTableName',
      user='your_user_name',
      password='your_password').mode('append').save()

I am getting the attribute error below:

AttributeError: 'DataFrame' object has no attribute 'write'

What am I doing wrong? What is the correct method to insert records into a MySQL table from PySpark?



Solution 1:[1]

Just to add to @mrsrinivas's answer.

Make sure the JAR of the MySQL connector is available to your Spark session. This code helps:

spark = SparkSession\
    .builder\
    .config("spark.jars", "/Users/coder/Downloads/mysql-connector-java-8.0.22.jar")\
    .master("local[*]")\
    .appName("pivot and unpivot")\
    .getOrCreate()

Otherwise Spark cannot locate the JDBC driver class and the write will fail with an error.
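Note that the AttributeError in the question has a different cause: df was built with pd.DataFrame, and a pandas DataFrame has no .write attribute; .write belongs to Spark DataFrames. A minimal sketch of the fix, assuming a local Spark install (the JAR path, URL, and credentials are placeholders to adjust):

```python
import pandas as pd

# Root cause: .write is part of the Spark DataFrame API; a pandas
# DataFrame has no such attribute, hence the AttributeError.
pdf = pd.DataFrame([["103", "tester_1"]], columns=["id", "name"])
print(hasattr(pdf, "write"))   # prints: False

# Sketch of the fix: build a SparkSession with the MySQL connector JAR,
# convert the pandas frame to a Spark DataFrame, then use its .write API.
# (Not called here -- it needs a running MySQL server.)
def write_to_mysql(pdf):
    from pyspark.sql import SparkSession
    spark = (SparkSession.builder
             .config("spark.jars", "/path/to/mysql-connector-java-8.0.22.jar")
             .master("local[*]")
             .appName("write-to-mysql")
             .getOrCreate())
    sdf = spark.createDataFrame(pdf)   # a Spark DataFrame *does* have .write
    (sdf.write.format("jdbc")
        .options(url="jdbc:mysql://localhost/database_name",
                 driver="com.mysql.jdbc.Driver",
                 dbtable="DestinationTableName",
                 user="your_user_name",
                 password="your_password")
        .mode("append")
        .save())
```

Alternatively, build the Spark DataFrame directly with spark.createDataFrame([("103", "tester_1")], ["id", "name"]) and skip pandas entirely.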

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: vegetarianCoder