Posted to user@spark.apache.org by Hafiz Mujadid <ha...@gmail.com> on 2015/10/11 19:42:37 UTC

Hive with Apache Spark

Hi

How can we read data from an external Hive server? The Hive server is running and I want to read the data remotely using Spark. Is there any example?


thanks



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Hive-with-apache-spark-tp25020.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



RE: Hive with Apache Spark

Posted by "Cheng, Hao" <ha...@intel.com>.
One option is to read the data via JDBC; however, this is probably the worst option, as you will likely need some hacky work to enable parallel reading in Spark SQL.
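A rough, untested sketch of what that could look like in Scala against HiveServer2 (the host, port, database, table, and partition column below are placeholders; the four partitioning options need a numeric column, and you may still hit compatibility issues between the SQL that Spark's JDBC source generates and what the Hive JDBC driver accepts):

  // Sketch only: read a Hive table through the HiveServer2 JDBC interface.
  // "hive-host", port 10000, "my_table", and "id" are placeholders.
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.SQLContext

  val sc = new SparkContext(new SparkConf().setAppName("hive-over-jdbc"))
  val sqlContext = new SQLContext(sc)

  val df = sqlContext.read.format("jdbc").options(Map(
    "url"     -> "jdbc:hive2://hive-host:10000/default",
    "driver"  -> "org.apache.hive.jdbc.HiveDriver",
    "dbtable" -> "my_table",
    // Optional: these four options let Spark split the read into
    // parallel range queries over a numeric column.
    "partitionColumn" -> "id",
    "lowerBound"      -> "1",
    "upperBound"      -> "1000000",
    "numPartitions"   -> "8"
  )).load()

  df.show()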
Another option is to copy the hive-site.xml of your Hive Server into $SPARK_HOME/conf; Spark SQL will then see everything the Hive Server sees (the same metastore), and you can load the Hive tables as needed.
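For example, a minimal sketch assuming hive-site.xml is already in $SPARK_HOME/conf and your Spark build includes Hive support (the database and table name are placeholders):

  // Sketch only: query a Hive table through HiveContext once Spark SQL
  // points at the same metastore as your Hive Server.
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.hive.HiveContext

  val sc = new SparkContext(new SparkConf().setAppName("hive-tables"))
  val hiveContext = new HiveContext(sc)

  // "my_db.my_table" is a placeholder for one of your Hive tables.
  val df = hiveContext.sql("SELECT * FROM my_db.my_table LIMIT 10")
  df.show()

With this approach Spark talks to the Hive metastore and reads the table data files directly, so it parallelizes naturally and does not go through HiveServer2 at all.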


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org