
I am using Jupyter Notebook with a Scala kernel. Below is my code to import a MySQL table into a DataFrame:

val sql = """select * from customer"""
val df_customer = spark.read
  .format("jdbc")
  .option("url", "jdbc:mysql://localhost:3306/ccfd")
  .option("driver", "com.mysql.jdbc.Driver")   // driver class for Connector/J 5.1.x
  .option("dbtable", s"( $sql ) t")            // wrap the query as an aliased subquery
  .option("user", "root")
  .option("password", "xxxxxxx")
  .load()

Below is the error:

Name: java.lang.ClassNotFoundException
Message: com.mysql.jdbc.Driver
StackTrace:   at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:79)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:79)
  at scala.Option.foreach(Option.scala:257)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:79)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:34)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:340)
  at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)

Can anyone share a working code snippet? I am using Spark 2, and a session named spark is already available when I start the kernel in a new notebook.

Thank you in advance.

  • Have you added the MySQL JDBC jar to the Spark lib directory? Commented Mar 14, 2019 at 3:26
  • Thank you. I added the latest jar, mysql-connector-java-5.1.47.jar, to JAVA_HOME (/usr/java/jdk1.8.0_121), and the error message changed to: Name: java.sql.SQLException Message: No suitable driver Commented Mar 14, 2019 at 23:11
  • Please place it under the SPARK_HOME/lib directory (a sketch of this setup follows below). Commented Mar 15, 2019 at 2:38
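
Putting the comments together: the ClassNotFoundException means the MySQL Connector/J jar is not on the classpath of the JVM that runs the notebook kernel, so copying it into JAVA_HOME has no effect. A common fix is to place the jar under $SPARK_HOME/jars (older Spark layouts use $SPARK_HOME/lib), or pass it via --jars / spark.jars when the kernel launches Spark, and then restart the kernel. Below is a minimal sketch of a notebook cell you could run afterwards to confirm the driver is visible and repeat the read; the jar location and connection details are placeholders taken from the question.

// If this line throws ClassNotFoundException, the connector jar is still not
// on the kernel's classpath and the jdbc read below will fail the same way.
Class.forName("com.mysql.jdbc.Driver")   // driver class shipped with Connector/J 5.1.x

// Same read as in the question; once the jar sits in $SPARK_HOME/jars it is
// visible to both the driver and the executors of the pre-created `spark` session.
val sql = """select * from customer"""
val df_customer = spark.read
  .format("jdbc")
  .option("url", "jdbc:mysql://localhost:3306/ccfd")
  .option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", s"( $sql ) t")
  .option("user", "root")
  .option("password", "xxxxxxx")
  .load()

df_customer.printSchema()   // quick sanity check that the table was read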
