
I am attempting to insert records into a MySQL table. The table has id and name as columns.

I am doing the following in a PySpark shell.

name = 'tester_1'
id = '103'  
import pandas as pd
l = [id,name]

df = pd.DataFrame([l])

df.write.format('jdbc').options(
      url='jdbc:mysql://localhost/database_name',
      driver='com.mysql.jdbc.Driver',
      dbtable='DestinationTableName',
      user='your_user_name',
      password='your_password').mode('append').save()

I am getting the AttributeError below:

AttributeError: 'DataFrame' object has no attribute 'write'

What am I doing wrong? What is the correct method to insert records into a MySQL table from PySpark?

2 Answers


Use a Spark DataFrame instead of a pandas one, since .write is available only on Spark DataFrames.

So the final code could be:

data = [('103', 'tester_1')]

df = sc.parallelize(data).toDF(['id', 'name'])

df.write.format('jdbc').options(
      url='jdbc:mysql://localhost/database_name',
      driver='com.mysql.jdbc.Driver',
      dbtable='DestinationTableName',
      user='your_user_name',
      password='your_password').mode('append').save()

6 Comments

I am getting this error: java.lang.RuntimeException: org.apache.spark.sql.execution.datasources.jdbc.DefaultSource does not allow create table as select. Is there any alternative to this?
Does that table exist in the data source? Try playing with the mode as well.
@Karn_way: does the table exist in the target, or are you creating one?
Yes, the table does exist. I am using a CDH image with MySQL. I think this has been fixed in Spark 2.0, whereas my image is on 1.6.

Just to add to @mrsrinivas' answer:

Make sure the JAR for the MySQL connector is available in your Spark session. This code helps:

spark = SparkSession\
    .builder\
    .config("spark.jars", "/Users/coder/Downloads/mysql-connector-java-8.0.22.jar")\
    .master("local[*]")\
    .appName("pivot and unpivot")\
    .getOrCreate()

otherwise it will throw an error because the JDBC driver class cannot be found.
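
If downloading the JAR by hand is inconvenient, a sketch of an alternative: spark.jars.packages lets Spark fetch the connector from Maven Central when the session starts (the coordinates below are one example version; match it to your MySQL server).

```python
from pyspark.sql import SparkSession

# spark.jars.packages resolves the artifact from Maven Central at
# session startup, so no local JAR path is needed.
spark = SparkSession \
    .builder \
    .config("spark.jars.packages", "mysql:mysql-connector-java:8.0.22") \
    .master("local[*]") \
    .appName("pivot and unpivot") \
    .getOrCreate()
```

This requires network access from the driver at startup, so the local-JAR approach above is still preferable on air-gapped machines.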

