Posted to user@hive.apache.org by Viral Bajaria <vi...@gmail.com> on 2010/12/10 03:11:54 UTC

unable to load jtds driver for sqlserver

Hello,

I just came across GenericUDFDBOutput and was able to successfully write
data to a MySQL database.

I then tried modifying the connection string to point to a SQL Server instance
using the jTDS library, but for some reason it does not seem to work with jTDS.

Has anyone come across the same issue before?

I ran the following steps before invoking my dboutput UDF:


add jar
/home/hadoop/svn/elsharpy/library/mysql-connector-java-5.1.13-bin.jar;
add jar /usr/lib/hive/lib/hive_contrib.jar;
create temporary function dboutput as
'org.apache.hadoop.hive.contrib.genericudf.example.GenericUDFDBOutput';
(ran the MySQL query... it was all fine)
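
For reference, the MySQL call was along these lines (host, database,
credentials, and table/column names below are placeholders, not the real ones):

-- minimal sketch of the working MySQL dboutput call; all identifiers are placeholders
-- arguments: JDBC connection string, user, password, parameterized SQL, then the bound values
SELECT dboutput('jdbc:mysql://<host>/<database>', '<user>', '<password>',
                'INSERT INTO target_table (k, v) VALUES (?, ?)',
                key, value)
FROM source_table;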

add jar /usr/lib/hive/lib/jtds-1.2.5.jar;
(then ran the SQL Server query; a sketch of the call is below, followed by the
stack trace from one of my reducers)
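
The SQL Server attempt was identical apart from the connection string, roughly:

-- sketch of the failing jTDS call; identifiers are again placeholders
SELECT dboutput('jdbc:jtds:sqlserver://<IP>/<Database>', '<user>', '<password>',
                'INSERT INTO target_table (k, v) VALUES (?, ?)',
                key, value)
FROM source_table;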
2010-12-10 01:52:19,071 ERROR
org.apache.hadoop.hive.contrib.genericudf.example.GenericUDFDBOutput: Driver
loading or connection issue

java.sql.SQLException: No suitable driver found for
jdbc:jtds:sqlserver://<IP>/<Database>
	at java.sql.DriverManager.getConnection(DriverManager.java:602)
	at java.sql.DriverManager.getConnection(DriverManager.java:185)
	at org.apache.hadoop.hive.contrib.genericudf.example.GenericUDFDBOutput.evaluate(GenericUDFDBOutput.java:120)
	at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.evaluate(ExprNodeGenericFuncEvaluator.java:82)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:73)
	at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:386)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:598)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:746)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.closeOp(GroupByOperator.java:788)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:462)
	at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:258)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:473)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:411)
	at org.apache.hadoop.mapred.Child.main(Child.java:170)

Thanks,
Viral