Posted to issues@spark.apache.org by "Tom Panning (JIRA)" <ji...@apache.org> on 2015/01/09 15:55:34 UTC

[jira] [Created] (SPARK-5176) Thrift server fails with confusing error message when deploy-mode is cluster

Tom Panning created SPARK-5176:
----------------------------------

             Summary: Thrift server fails with confusing error message when deploy-mode is cluster
                 Key: SPARK-5176
                 URL: https://issues.apache.org/jira/browse/SPARK-5176
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.2.0, 1.1.0
            Reporter: Tom Panning


With Spark 1.2.0, when I try to run
{noformat}
$SPARK_HOME/sbin/start-thriftserver.sh --deploy-mode cluster --master spark://xd-spark.xdata.data-tactics-corp.com:7077
{noformat}
the log output is
{noformat}
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command: /usr/java/latest/bin/java -cp ::/home/tpanning/Projects/spark/spark-1.2.0-bin-hadoop2.4/sbin/../conf:/home/tpanning/Projects/spark/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:/home/tpanning/Projects/spark/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar:/home/tpanning/Projects/spark/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar:/home/tpanning/Projects/spark/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar -XX:MaxPermSize=128m -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --deploy-mode cluster --master spark://xd-spark.xdata.data-tactics-corp.com:7077 spark-internal
========================================

Jar url 'spark-internal' is not in valid format.
Must be a jar file path in URL format (e.g. hdfs://host:port/XX.jar, file:///XX.jar)

Usage: DriverClient [options] launch <active-master> <jar-url> <main-class> [driver options]
Usage: DriverClient kill <active-master> <driver-id>

Options:
   -c CORES, --cores CORES        Number of cores to request (default: 1)
   -m MEMORY, --memory MEMORY     Megabytes of memory to request (default: 512)
   -s, --supervise                Whether to restart the driver on failure
   -v, --verbose                  Print more debugging output
     
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
{noformat}
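As far as I can tell, the rejected argument {{spark-internal}} is the placeholder that spark-submit passes along when the application class ships inside the Spark assembly rather than in a user jar, and the standalone-cluster client only accepts real jar URLs. The rejection can apparently be reproduced directly (a sketch only; the {{org.apache.spark.deploy.Client}} class name is my assumption from the 1.2 standalone deploy code, and the argument order is taken from the usage text above):
{noformat}
# Sketch: invoking the standalone-cluster client directly with the same
# arguments that SparkSubmit hands it should trigger the same validation error.
$SPARK_HOME/bin/spark-class org.apache.spark.deploy.Client launch \
  spark://xd-spark.xdata.data-tactics-corp.com:7077 \
  spark-internal \
  org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
{noformat}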

I do not get this error if deploy-mode is set to client. The --deploy-mode option is described in the --help output, so I expected cluster mode to work. I checked, and this behavior is present in Spark 1.1.0 as well.
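For reference, the same command with client deploy mode (the default) starts the server without this error; the master URL is the one from above:
{noformat}
# Workaround: run the Thrift server in client deploy mode.
$SPARK_HOME/sbin/start-thriftserver.sh --deploy-mode client \
  --master spark://xd-spark.xdata.data-tactics-corp.com:7077
{noformat}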


