Posted to issues@spark.apache.org by "George Kyriacou (JIRA)" <ji...@apache.org> on 2014/11/21 11:00:47 UTC

[jira] [Resolved] (SPARK-4499) Problems building and running 1.2 with Hive support

     [ https://issues.apache.org/jira/browse/SPARK-4499?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

George Kyriacou resolved SPARK-4499.
------------------------------------
    Resolution: Not a Problem

Thank you for the response and the explanation. 

Moving forward, I will compile with the '-Phive-thriftserver' option and use an updated version of the Simba driver.
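The fix above amounts to rebuilding Spark with both Hive profiles enabled. A minimal sketch of the Maven invocation, assuming a Spark 1.2 source checkout and the Hadoop 2.4.1 environment from the report (profile names per the Spark build documentation):

```shell
# Build Spark with Hive support AND the JDBC/Thrift server module.
# Both profiles are needed: -Phive alone does not build the Thrift server.
mvn -Phive -Phive-thriftserver \
    -Phadoop-2.4 -Dhadoop.version=2.4.1 \
    -DskipTests clean package

# Then start the Spark SQL Thrift server (HiveServer2-compatible):
./sbin/start-thriftserver.sh
```

Because the Spark Thrift server speaks the HiveServer2 protocol, a Hive-compatible ODBC/JDBC driver (or an updated Spark ODBC driver) is expected to connect on its port.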

> Problems building and running 1.2 with Hive support
> ---------------------------------------------------
>
>                 Key: SPARK-4499
>                 URL: https://issues.apache.org/jira/browse/SPARK-4499
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>         Environment: Hadoop 2.4.1, Hive 0.12, Ubuntu Server 12.04 LTS x64
>            Reporter: George Kyriacou
>              Labels: build, hive, hive-thriftserver, maven
>
> Building using Maven with only the '-Phive' option causes an error during start-up of the Thrift Server: it complains that the project was not built with both the '-Phive' and '-Phive-thriftserver' options and exits abnormally.
> Compiling with both '-Phive' and '-Phive-thriftserver' allows the Thrift Server to start correctly; however, it now appears to be running the same kind of Thrift server that Hive itself would run. Connecting with the Simba Spark ODBC driver to the port the Spark Thrift Server is listening on produces an error that I am attempting to connect to an incorrect server type. Switching to a Hive ODBC driver from Hortonworks and pointing it at the same port connects successfully.
> This did not happen with previous Spark 1.2 builds.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
