Posted to users@zeppelin.apache.org by Lian Jiang <ji...@gmail.com> on 2018/10/24 03:40:25 UTC

How to make livy2.spark find jar

Hi,

I am trying to use the Oracle JDBC driver to read an Oracle database table. I have
added the property below in custom zeppelin-env:

SPARK_SUBMIT_OPTIONS="--jars /my/path/to/ojdbc8.jar"

But

val df = spark.read.format("jdbc")
  .option("url", "jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=10.9.44.99)(PORT=1521))(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=myservice.mydns.com)))")
  .option("user", "myuser")
  .option("password", "mypassword")
  .option("driver", "oracle.jdbc.driver.OracleDriver")
  .option("dbtable", "myuser.mytable")
  .load()

throws:

java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
  at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:79)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:79)
  at scala.Option.foreach(Option.scala:257)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:79)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:34)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:340)
  at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)

How can I make the livy2.spark interpreter find ojdbc8.jar? Thanks.

Re: How to make livy2.spark find jar

Posted by Jeff Zhang <zj...@gmail.com>.
Specify livy.spark.jars in Livy's interpreter setting, and make sure your
jars are on HDFS: Livy only supports jars on HDFS, which is different
from standard Spark.
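
The steps above might look like the following sketch. The HDFS directory and
property value are illustrative assumptions, not from this thread; adjust them
to your cluster layout:

```shell
# Put the driver jar on HDFS so the Livy server can distribute it
# (a local path, as in SPARK_SUBMIT_OPTIONS --jars, will not work here).
hdfs dfs -mkdir -p /user/zeppelin/jars
hdfs dfs -put /my/path/to/ojdbc8.jar /user/zeppelin/jars/

# Then, in the Zeppelin interpreter settings for livy2, add the property:
#   livy.spark.jars = hdfs:///user/zeppelin/jars/ojdbc8.jar
# and restart the interpreter so new Livy sessions pick up the jar.
```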


Lian Jiang <ji...@gmail.com> wrote on Thu, Oct 25, 2018 at 12:40 PM:

> [original message quoted above]

Re: How to make livy2.spark find jar

Posted by Ruslan Dautkhanov <da...@gmail.com>.
Try adding ZEPPELIN_INTP_CLASSPATH_OVERRIDES, for example,

export ZEPPELIN_INTP_CLASSPATH_OVERRIDES=/etc/hive/conf:/var/lib/sqoop/ojdbc7.jar
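
Adapted to the jar from the original question, that might look like the sketch
below (the jar path is an assumption carried over from the question, and the
/etc/hive/conf entry is only needed if you want Hive config on the classpath):

```shell
# In zeppelin-env.sh: prepend extra entries to the classpath of the
# interpreter processes that Zeppelin launches. Entries are colon-separated,
# like any Java classpath.
export ZEPPELIN_INTP_CLASSPATH_OVERRIDES=/etc/hive/conf:/my/path/to/ojdbc8.jar
```

Restart the interpreter after editing zeppelin-env.sh so the new classpath
takes effect.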


-- 
Ruslan Dautkhanov


On Tue, Oct 23, 2018 at 9:40 PM Lian Jiang <ji...@gmail.com> wrote:

> [original message quoted above]