
Sorry if this sounds vague, but can someone explain the steps for writing an existing DataFrame "df" into a MySQL table, say "product_mysql", and the other way around (reading that table back into a DataFrame)?

1 Answer


Please see this Databricks article: Connecting to SQL Databases using JDBC.

import org.apache.spark.sql.SaveMode

// jdbcUrl and connectionProperties are set up as described in the linked article.
val df = spark.table("...")
println(df.rdd.partitions.length)
// Given the partition count printed above, reduce it with coalesce() or increase it with
// repartition() to control the number of parallel JDBC connections used for the write.
df.repartition(10).write.mode(SaveMode.Append).jdbc(jdbcUrl, "product_mysql", connectionProperties)
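For the other direction (loading the MySQL table back into a DataFrame), spark.read.jdbc works the same way. A minimal sketch, assuming placeholder host, database name, and credentials that you would replace with your own:

import java.util.Properties

// Hypothetical connection details; substitute your own host, database, and credentials.
val jdbcUrl = "jdbc:mysql://localhost:3306/mydb"
val connectionProperties = new Properties()
connectionProperties.put("user", "myuser")
connectionProperties.put("password", "mypassword")

// Read the MySQL table back into a DataFrame.
val productDf = spark.read.jdbc(jdbcUrl, "product_mysql", connectionProperties)
productDf.show(5)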

1 Comment

Thanks for your help, Ram!

