Posted to user@spark.apache.org by 喜之郎 <25...@qq.com> on 2016/06/22 11:38:51 UTC

spark-1.6.1-bin-without-hadoop cannot use spark-sql

Hi all,
I downloaded spark-1.6.1-bin-without-hadoop.tgz from the website,
and I configured "SPARK_DIST_CLASSPATH" in spark-env.sh.
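For reference, I used the usual setting for the Hadoop-free build (per the Spark "Hadoop free" docs; my line is roughly this):

# spark-env.sh: let the "without-hadoop" Spark build pick up the local Hadoop jars
export SPARK_DIST_CLASSPATH=$(hadoop classpath)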
Now spark-shell runs fine, but spark-sql does not.
My Hadoop version is 2.7.2.
Here is the error output:


bin/spark-sql 
java.lang.ClassNotFoundException: org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:278)
	at org.apache.spark.util.Utils$.classForName(Utils.scala:174)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Failed to load main class org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.
You need to build Spark with -Phive and -Phive-thriftserver.



Do I need to configure something else in spark-env.sh or spark-defaults.conf?
Suggestions are appreciated, thanks.

Re: spark-1.6.1-bin-without-hadoop cannot use spark-sql

Posted by Ted Yu <yu...@gmail.com>.
I wonder whether the tarball was built with:

-Phive -Phive-thriftserver

Maybe rebuild it yourself with the above profiles?
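For Spark 1.6.1 against Hadoop 2.7.2, something along these lines should work (a sketch based on the 1.6 building docs; adjust the profiles and distribution name to your setup):

# build from the Spark 1.6.1 source tree
build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.2 -Phive -Phive-thriftserver -DskipTests clean package

# or produce a binary distribution tarball like the one you downloaded
# ("custom-2.7.2" is just an example name)
./make-distribution.sh --name custom-2.7.2 --tgz -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.2 -Phive -Phive-thriftserver

# quick check, in the resulting distribution, that the CLI driver made it into the assembly
jar tf lib/spark-assembly-*.jar | grep SparkSQLCLIDriver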

FYI
