Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2015/09/15 23:44:46 UTC

[jira] [Resolved] (SPARK-5194) ADD JAR doesn't update classpath until reconnect

     [ https://issues.apache.org/jira/browse/SPARK-5194?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust resolved SPARK-5194.
-------------------------------------
    Resolution: Cannot Reproduce

Closing as cannot reproduce.  Please reopen if you can reproduce this on Spark 1.5.

> ADD JAR doesn't update classpath until reconnect
> ------------------------------------------------
>
>                 Key: SPARK-5194
>                 URL: https://issues.apache.org/jira/browse/SPARK-5194
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: Oleg Danilov
>
> Steps to reproduce:
> beeline>  !connect jdbc:hive2://vmhost-vm0:10000                                   
> 0: jdbc:hive2://vmhost-vm0:10000> add jar ./target/nexr-hive-udf-0.2-SNAPSHOT.jar
> 0: jdbc:hive2://vmhost-vm0:10000> CREATE TEMPORARY FUNCTION nvl AS 'com.nexr.platform.hive.udf.GenericUDFNVL';
> 0: jdbc:hive2://vmhost-vm0:10000> select nvl(imsi,'test') from ps_cei_index_1_week limit 1;
> Error: java.lang.ClassNotFoundException: com.nexr.platform.hive.udf.GenericUDFNVL (state=,code=0)
> 0: jdbc:hive2://vmhost-vm0:10000> !reconnect
> Reconnecting to "jdbc:hive2://vmhost-vm0:10000"...
> Closing: org.apache.hive.jdbc.HiveConnection@3f18dc75: {1}
> Connected to: Spark SQL (version 1.2.0)
> Driver: null (version null)
> Transaction isolation: TRANSACTION_REPEATABLE_READ
> 0: jdbc:hive2://vmhost-vm0:10000> select nvl(imsi,'test') from ps_cei_index_1_week limit 1;
> +------+
> | _c0  |
> +------+
> | -1   |
> +------+
> 1 row selected (1.605 seconds)
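
A workaround, until ADD JAR takes effect in the current session, is to put the UDF jar on the classpath when the Thrift server is started, so no mid-session classpath change is needed. This is a sketch only: the jar path is the one from the report above, and the exact script location depends on your Spark installation.

```shell
# Start the Spark SQL Thrift server with the UDF jar already on the
# classpath via spark-submit's --jars option (path is illustrative;
# adjust to where your jar actually lives).
./sbin/start-thriftserver.sh \
  --jars ./target/nexr-hive-udf-0.2-SNAPSHOT.jar

# In beeline, the function can then be registered and used right away,
# without the !reconnect step shown in the transcript:
#   CREATE TEMPORARY FUNCTION nvl AS 'com.nexr.platform.hive.udf.GenericUDFNVL';
#   SELECT nvl(imsi, 'test') FROM ps_cei_index_1_week LIMIT 1;
```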


