Posted to dev@spark.apache.org by SNEHASISH DUTTA <in...@gmail.com> on 2020/02/18 10:40:17 UTC

DB2 connectivity issue with SSL

Hi,
I am trying to connect to DB2 from Spark with the following code:

spark.sparkContext.addFile("xyz.jks")
spark.sparkContext.addFile("xyz.pfx")

 val url =
s"""jdbc:db2://host:port/schema:securityMechanism=18;sslConnection=true;user=user;sslTrustStoreLocation=${SparkFiles.get("xyz.jks")};sslKeyStoreType=PKCS12;sslKeyStorePassword=<<password>>;sslKeyStoreLocation=${SparkFiles.get("xyz.pfx")};"""




spark.read.format("jdbc").option("url",
url).option("dbtable","schemaname.tablename").load()
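Note that the SparkFiles.get(...) calls in the URL are evaluated on the driver when the string is built, so the absolute paths embedded in the URL are the driver's local copies. A quick sketch (illustration only) to see exactly which paths the executors will later be told to open:

import org.apache.spark.SparkFiles

// Both calls run on the driver; the JDBC driver on the executors will later
// try to open these exact absolute paths.
println(SparkFiles.get("xyz.jks"))
println(SparkFiles.get("xyz.pfx"))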

The DataFrame gets created, but when I query it afterwards I get the
following error:




20/02/18 10:27:09 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times;
aborting job
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage
0.0 (TID 3, oser406450.cn.wal-mart.com, executor 2):
com.ibm.db2.jcc.am.DisconnectNonTransientConnectionException:
[jcc][t4][2043][11550][4.24.92] Exception java.io.FileNotFoundException:
Error opening socket to server host/ip on port xxxx with message:
<<SparkFiles.getRootDirectory location>>/xyz.jks (No such file or
directory). ERRORCODE=-4499, SQLSTATE=08001


I have also tried adding the *.jks and *.pfx files using --files, but the
result is the same.
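For reference, --files is what sets the spark.files configuration under the hood; a minimal sketch of that attempt expressed in code, assuming the stores live at placeholder local paths:

import org.apache.spark.sql.SparkSession

// Sketch only: spark.files is the config that --files populates;
// the listed files are copied out to every executor.
val spark = SparkSession.builder()
  .appName("db2-ssl-test")
  .config("spark.files", "/local/path/xyz.jks,/local/path/xyz.pfx")
  .getOrCreate()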

Also, contrary to what the message *<<SparkFiles.getRootDirectory
location>>/xyz.jks (No such file or directory)* says, I checked and the file
actually is present in those locations.
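Since the SparkFiles directory is per-node, one way to narrow this down is to probe the exact path from the error on the executors themselves rather than on the driver; a minimal diagnostic sketch (the partition count is arbitrary):

import java.io.File
import java.net.InetAddress
import org.apache.spark.SparkFiles

// Path as resolved on the driver (the same string that ends up in the JDBC URL)
val driverPath = SparkFiles.get("xyz.jks")

// Check whether that exact absolute path exists on each executor host
spark.sparkContext
  .parallelize(1 to 8, 8)
  .map(_ => (InetAddress.getLocalHost.getHostName, new File(driverPath).exists))
  .distinct()
  .collect()
  .foreach(println)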

Is there any other way to connect to DB2 from Spark while using SSL
certificates?


Regards,
Snehasish