Posted to issues@spark.apache.org by "inred (JIRA)" <ji...@apache.org> on 2016/11/17 06:42:58 UTC

[jira] [Created] (SPARK-18483) Spark on YARN always connects to the YARN ResourceManager at 0.0.0.0:8032

inred created SPARK-18483:
-----------------------------

             Summary: Spark on YARN always connects to the YARN ResourceManager at 0.0.0.0:8032
                 Key: SPARK-18483
                 URL: https://issues.apache.org/jira/browse/SPARK-18483
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.0.1
         Environment: Java 8, SBT 0.13, Scala 2.11.8, spark-2.0.1-bin-hadoop2.6
            Reporter: inred


I have installed the YARN ResourceManager at 192.168.13.159:8032, set the YARN_CONF_DIR environment variable, and configured yarn-site.xml as shown below, but Spark always connects to 0.0.0.0:8032 instead of 192.168.13.159:8032.


Environment variables:
E:\app>set yarn
YARN_CONF_DIR=D:\Documents\download\hadoop

E:\app>set had
HADOOP_HOME=D:\Documents\download\hadoop

E:\app>cat D:\Documents\download\hadoop\yarn-site.xml
<property>
<name>yarn.resourcemanager.address</name>
<value>192.168.13.159:8032</value>
</property>
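For context, Hadoop's client-side Configuration loads yarn-site.xml from the JVM classpath, not from YARN_CONF_DIR directly; the spark-submit launcher scripts add that directory to the classpath, but a plain java invocation from an IDE does not. A minimal diagnostic sketch (the helper name is illustrative) to check whether the file is actually visible to the JVM:

```scala
// Hadoop's Configuration resolves yarn-site.xml as a classpath resource.
// If it is not visible, YARN falls back to the built-in default RM address
// 0.0.0.0:8032, which matches the symptom in this report.
def findOnClasspath(name: String): Option[java.net.URL] =
  Option(getClass.getClassLoader.getResource(name))

// Example: warn when yarn-site.xml cannot be found on the classpath.
findOnClasspath("yarn-site.xml") match {
  case Some(url) => println(s"yarn-site.xml loaded from $url")
  case None      => println("yarn-site.xml NOT on classpath; RM defaults to 0.0.0.0:8032")
}
```

If this prints the "NOT on classpath" line inside the IDE run configuration, adding D:\Documents\download\hadoop as a classpath entry (or running through spark-submit) should make the configured address take effect.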

"C:\Program Files\Java\jdk1.8.0_92\bin\java" -Didea.launcher.port=7532 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 2016.2.5\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_92\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\resources.jar;C:\Program 
Files\Java\jdk1.8.0_92\jre\lib\rt.jar;E:\app\adam-ano\target\scala-2.11\classes;C:\Users\feiwu\.ivy2\cache\org.scala-lang\scala-library\jars\scala-library-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\xz-1.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jta-1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jpam-1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\guice-3.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\ivy-2.4.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\lz4-1.3.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\oro-2.0.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\ST4-4.0.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\avro-1.7.7.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\core-1.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\gson-2.2.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\mail-1.4.7.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\mx4j-3.0.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\snappy-0.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\antlr-2.7.7.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\opencsv-2.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\py4j-0.10.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\xmlenc-0.52.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\base64-2.3.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\guava-14.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\janino-2.7.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jets3t-0.9.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jetty-6.1.26.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jline-2.12.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jsr305-1.3.9.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\log4j-1.2.17.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\minlog-
1.3.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\pyrolite-4.9.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\stream-2.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jdo-api-3.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\json-20090211.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\objenesis-2.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\paranamer-2.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scalap-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\avro-ipc-1.7.7.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-io-2.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\httpcore-4.4.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\JavaEWAH-0.3.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.inject-1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jaxb-api-2.2.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\libfb303-0.9.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\stax-api-1.0-2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\stax-api-1.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\aopalliance-1.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-cli-1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-net-2.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\derby-10.12.1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\joda-time-2.9.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jodd-core-3.5.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\libthrift-0.9.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\super-csv-2.2.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\zookeeper-3.4.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\activation-1.1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\chill-java-0.8.0.jar;D:\Documents\download\spark-2.0.1-bin-hado
op2.6\jars\chill_2.11-0.8.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-dbcp-1.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-lang-2.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\httpclient-4.5.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javolution-5.5.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\slf4j-api-1.7.16.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spire_2.11-0.7.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\xercesImpl-2.9.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\antlr-runtime-3.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\guice-servlet-3.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-auth-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-hdfs-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hk2-api-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\htrace-core-3.0.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-xc-1.9.13.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jetty-util-6.1.26.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jtransforms-2.4.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\kryo-shaded-3.0.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\metrics-jvm-3.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\netty-3.8.0.Final.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\pmml-model-1.2.15.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\api-util-1.0.0-M20.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\breeze_2.11-0.11.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-codec-1.10.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-pool-1.5.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\compress-lzf-1.0.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-core-2.6.5.jar;D:\Documents\downlo
ad\spark-2.0.1-bin-hadoop2.6\jars\leveldbjni-all-1.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\metrics-core-3.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\metrics-json-3.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\pmml-schema-1.2.15.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\bcprov-jdk15on-1.51.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-lang3-3.3.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-math3-3.4.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-client-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-common-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hk2-utils-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\java-xmlbuilder-1.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javassist-3.18.1-GA.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-guava-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jul-to-slf4j-1.7.16.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\protobuf-java-2.5.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\snappy-java-1.1.2.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\antlr4-runtime-4.5.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\bonecp-0.8.0.RELEASE.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-digester-1.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\curator-client-2.6.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-jaxrs-1.9.13.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-client-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-common-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-server-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-column-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-common-1.7.0.jar;D:\Document
s\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-hadoop-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\RoaringBitmap-0.5.11.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-library-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-reflect-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-xml_2.11-1.0.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\slf4j-log4j12-1.7.16.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-sql_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\stringtemplate-3.2.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-logging-1.1.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\curator-recipes-2.6.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-api-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-cli-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hk2-locator-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.ws.rs-api-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jcl-over-slf4j-1.7.16.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-jackson-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-compiler-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-core_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-hive_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-repl_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-tags_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-yarn_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\xbean-asm5-shaded-4.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\api-asn1-api-1.0.0-M20.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-compiler-2.7.6.jar;D:\Documents\download\spark-2.0.1-bin-hado
op2.6\jars\commons-compress-1.4.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-httpclient-3.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-exec-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-jdbc-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-databind-2.6.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.inject-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\json4s-ast_2.11-3.2.11.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\metrics-graphite-3.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\netty-all-4.0.29.Final.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-encoding-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-mllib_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\apacheds-i18n-2.0.0-M15.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\arpack_combined_all-0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-beanutils-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\curator-framework-2.6.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\datanucleus-core-3.2.10.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\datanucleus-rdbms-3.2.9.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-core-asl-1.9.13.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.servlet-api-3.1.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\json4s-core_2.11-3.2.11.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-generator-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-graphx_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-sketch_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-unsafe_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spire-macros_2.11-0.7.4.jar;D:\Documents\download\spark-
2.0.1-bin-hadoop2.6\jars\univocity-parsers-2.1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-annotations-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-client-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-common-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.annotation-api-1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-media-jaxb-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\avro-mapred-1.7.7-hadoop2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\breeze-macros_2.11-0.11.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-collections-3.2.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-configuration-1.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\datanucleus-api-jdo-3.2.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-beeline-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-annotations-2.6.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-mapper-asl-1.9.13.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-catalyst_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-launcher_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\apache-log4j-extras-1.2.17.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\eigenbase-properties-1.1.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\json4s-jackson_2.11-3.2.11.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-streaming_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\validation-api-1.1.0.Final.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-metastore-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\osgi-resource-locator-1.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-hadoop-bundle-1.6.0.jar;D:\Documents\download\spark-2.0.
1-bin-hadoop2.6\jars\commons-beanutils-core-1.8.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\mesos-0.21.1-shaded-protobuf.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-mllib-local_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\calcite-core-1.2.0-incubating.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-module-paranamer-2.6.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\calcite-linq4j-1.2.0-incubating.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-server-common-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-module-scala_2.11-2.6.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-container-servlet-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-format-2.3.0-incubating.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-network-common_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\aopalliance-repackaged-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\calcite-avatica-1.2.0-incubating.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-network-shuffle_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\apacheds-kerberos-codec-2.0.0-M15.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-mapreduce-client-app-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-mapreduce-client-core-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-server-web-proxy-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-hive-thriftserver_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-parser-combinators_2.11-1.0.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-mapreduce-client-common-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-container-servlet-core-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\had
oop-mapreduce-client-shuffle-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-mapreduce-client-jobclient-2.6.4.jar;D:\Documents\download\adam-distribution-spark2_2.11-0.20.0\repo\adam_2.11-0.20.0.jar;C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 2016.2.5\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain com.ano.adam.AnnoSp --deploy-mode client
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/Documents/download/spark-2.0.1-bin-hadoop2.6/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/Documents/download/adam-distribution-spark2_2.11-0.20.0/repo/adam_2.11-0.20.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2016-11-17 13:48:08 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-11-17 14:14:03 ERROR SparkContext:91 - Error initializing SparkContext.
java.net.ConnectException: Call From BJL443R90HT444/192.168.22.232 to 0.0.0.0:8032 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.reflect.GeneratedConstructorAccessor7.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
	at org.apache.hadoop.ipc.Client.call(Client.java:1473)
	at org.apache.hadoop.ipc.Client.call(Client.java:1400)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
	at com.sun.proxy.$Proxy9.getClusterMetrics(Unknown Source)
	at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getClusterMetrics(ApplicationClientProtocolPBClientImpl.java:202)
	at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy10.getClusterMetrics(Unknown Source)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getYarnClusterMetrics(YarnClientImpl.java:461)
	at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:154)
	at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:154)
	at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
	at org.apache.spark.deploy.yarn.Client.logInfo(Client.scala:60)
	at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:153)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2275)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
	at com.ano.adam.AnnoSp$.main(AnnoSp.scala:38)
	at com.ano.adam.AnnoSp.main(AnnoSp.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.net.ConnectException: Connection refused: no further information
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:608)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:706)
	at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:369)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522)
	at org.apache.hadoop.ipc.Client.call(Client.java:1439)
	... 31 more
2016-11-17 14:14:03 WARN  YarnSchedulerBackend$YarnSchedulerEndpoint:66 - Attempted to request executors before the AM has registered!
2016-11-17 14:14:03 WARN  MetricsSystem:66 - Stopping a MetricsSystem that is not running
Exception in thread "main" java.net.ConnectException: Call From BJL443R90HT444/192.168.22.232 to 0.0.0.0:8032 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.reflect.GeneratedConstructorAccessor7.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
	at org.apache.hadoop.ipc.Client.call(Client.java:1473)
	at org.apache.hadoop.ipc.Client.call(Client.java:1400)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
	at com.sun.proxy.$Proxy9.getClusterMetrics(Unknown Source)
	at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getClusterMetrics(ApplicationClientProtocolPBClientImpl.java:202)
	at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy10.getClusterMetrics(Unknown Source)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getYarnClusterMetrics(YarnClientImpl.java:461)
	at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:154)
	at org.apache.spark.deploy.yarn.Client$$anonfun$submitApplication$1.apply(Client.scala:154)
	at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
	at org.apache.spark.deploy.yarn.Client.logInfo(Client.scala:60)
	at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:153)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2275)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
	at com.ano.adam.AnnoSp$.main(AnnoSp.scala:38)
	at com.ano.adam.AnnoSp.main(AnnoSp.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.net.ConnectException: Connection refused: no further information
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:608)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:706)
	at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:369)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522)
	at org.apache.hadoop.ipc.Client.call(Client.java:1439)
	... 31 more

Process finished with exit code 1
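A possible workaround sketch, assuming the classpath is the issue: Spark copies any `spark.hadoop.*` property into the Hadoop Configuration, so the ResourceManager address can be passed directly (e.g. in spark-defaults.conf, or via `.config(...)` on the SparkSession builder) without relying on yarn-site.xml being found:

```
spark.master                                  yarn
spark.hadoop.yarn.resourcemanager.address     192.168.13.159:8032
```

This only bypasses the symptom for this one property; other yarn-site.xml settings would still be missing unless the conf directory is put on the classpath.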

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)