Posted to dev@kylin.apache.org by XiongQiuQiu <zh...@gmail.com> on 2017/04/12 07:04:41 UTC

Error when building a cube with Spark

I am using Kylin 2.0, Spark 1.6, and Hadoop 2.6.1. Building a Kylin cube with
the Spark engine (beta) fails with an error; the full error log is below.

How can I fix it?
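
For reference, here is the spark-submit command Kylin executed (the same
command appears verbatim at the top of the log below; I have only added line
continuations for readability):

export HADOOP_CONF_DIR=/opt/kylin/hadoop-conf && /opt/spark/bin/spark-submit \
  --class org.apache.kylin.common.util.SparkEntry \
  --conf spark.executor.instances=1 \
  --conf spark.yarn.jar=hdfs://n-vsan-12:9000/kylin/spark/spark-assembly-1.6.3-hadoop2.6.0.jar \
  --conf spark.yarn.queue=default \
  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current \
  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history \
  --conf spark.driver.extraJavaOptions=-Dhdp.version=current \
  --conf spark.master=yarn \
  --conf spark.executor.extraJavaOptions=-Dhdp.version=current \
  --conf spark.executor.memory=1G \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=hdfs:///kylin/spark-history \
  --conf spark.executor.cores=2 \
  --conf spark.submit.deployMode=cluster \
  --files /opt/hbase/conf/hbase-site.xml \
  --jars /opt/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar,/opt/hbase/lib/htrace-core-3.1.0-incubating.jar,/opt/hbase/lib/hbase-client-1.3.0.jar,/opt/hbase/lib/hbase-common-1.3.0.jar,/opt/hbase/lib/hbase-protocol-1.3.0.jar,/opt/hbase/lib/metrics-core-2.2.0.jar,/opt/hbase/lib/guava-12.0.1.jar,/opt/hive/lib/mysql-connector-java-5.1.41-bin.jar \
  /opt/kylin/lib/kylin-job-2.0.0-SNAPSHOT.jar \
  -className org.apache.kylin.engine.spark.SparkCubingByLayer \
  -hiveTable kylin_intermediate_kylin_sales_cube_353bd5ac_1ebe_458f_a833_b2654572eee6 \
  -output hdfs:///kylin/kylin_metadata/kylin-4acc95c6-1eab-450b-b9f8-ffd0942a949d/kylin_sales_cube/cuboid/ \
  -segmentId 353bd5ac-1ebe-458f-a833-b2654572eee6 \
  -confPath /opt/kylin/conf \
  -cubename kylin_sales_cube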


OS command error exit with 1 -- export
HADOOP_CONF_DIR=/opt/kylin/hadoop-conf && /opt/spark/bin/spark-submit
--class org.apache.kylin.common.util.SparkEntry  --conf
spark.executor.instances=1  --conf
spark.yarn.jar=hdfs://n-vsan-12:9000/kylin/spark/spark-assembly-1.6.3-hadoop2.6.0.jar 
--conf spark.yarn.queue=default  --conf
spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf
spark.driver.extraJavaOptions=-Dhdp.version=current  --conf
spark.master=yarn  --conf
spark.executor.extraJavaOptions=-Dhdp.version=current  --conf
spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf
spark.executor.cores=2  --conf spark.submit.deployMode=cluster --files
/opt/hbase/conf/hbase-site.xml --jars
/opt/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar,/opt/hbase/lib/htrace-core-3.1.0-incubating.jar,/opt/hbase/lib/hbase-client-1.3.0.jar,/opt/hbase/lib/hbase-common-1.3.0.jar,/opt/hbase/lib/hbase-protocol-1.3.0.jar,/opt/hbase/lib/metrics-core-2.2.0.jar,/opt/hbase/lib/guava-12.0.1.jar,/opt/hive/lib/mysql-connector-java-5.1.41-bin.jar
/opt/kylin/lib/kylin-job-2.0.0-SNAPSHOT.jar -className
org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable
kylin_intermediate_kylin_sales_cube_353bd5ac_1ebe_458f_a833_b2654572eee6
-output
hdfs:///kylin/kylin_metadata/kylin-4acc95c6-1eab-450b-b9f8-ffd0942a949d/kylin_sales_cube/cuboid/
-segmentId 353bd5ac-1ebe-458f-a833-b2654572eee6 -confPath /opt/kylin/conf
-cubename kylin_sales_cube
SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer
-hiveTable
kylin_intermediate_kylin_sales_cube_353bd5ac_1ebe_458f_a833_b2654572eee6
-output
hdfs:///kylin/kylin_metadata/kylin-4acc95c6-1eab-450b-b9f8-ffd0942a949d/kylin_sales_cube/cuboid/
-segmentId 353bd5ac-1ebe-458f-a833-b2654572eee6 -confPath /opt/kylin/conf
-cubename kylin_sales_cube
Abstract Application args:-hiveTable
kylin_intermediate_kylin_sales_cube_353bd5ac_1ebe_458f_a833_b2654572eee6
-output
hdfs:///kylin/kylin_metadata/kylin-4acc95c6-1eab-450b-b9f8-ffd0942a949d/kylin_sales_cube/cuboid/
-segmentId 353bd5ac-1ebe-458f-a833-b2654572eee6 -confPath /opt/kylin/conf
-cubename kylin_sales_cube
17/04/12 14:57:59 INFO spark.SparkContext: Running Spark version 1.6.3
17/04/12 14:58:00 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
17/04/12 14:58:00 WARN spark.SparkConf: 
SPARK_CLASSPATH was detected (set to
'/opt/hive/lib/mysql-connector-java-5.1.41-bin.jar').
This is deprecated in Spark 1.0+.

Please instead use:
 - ./spark-submit with --driver-class-path to augment the driver classpath
 - spark.executor.extraClassPath to augment the executor classpath
        
17/04/12 14:58:00 WARN spark.SparkConf: Setting
'spark.executor.extraClassPath' to
'/opt/hive/lib/mysql-connector-java-5.1.41-bin.jar' as a work-around.
17/04/12 14:58:00 WARN spark.SparkConf: Setting
'spark.driver.extraClassPath' to
'/opt/hive/lib/mysql-connector-java-5.1.41-bin.jar' as a work-around.
17/04/12 14:58:00 INFO spark.SecurityManager: Changing view acls to: hadoop
17/04/12 14:58:00 INFO spark.SecurityManager: Changing modify acls to:
hadoop
17/04/12 14:58:00 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(hadoop); users with modify permissions: Set(hadoop)
17/04/12 14:58:00 INFO util.Utils: Successfully started service
'sparkDriver' on port 41419.
17/04/12 14:58:00 INFO slf4j.Slf4jLogger: Slf4jLogger started
17/04/12 14:58:00 INFO Remoting: Starting remoting
17/04/12 14:58:00 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkDriverActorSystem@10.20.45.12:35951]
17/04/12 14:58:00 INFO util.Utils: Successfully started service
'sparkDriverActorSystem' on port 35951.
17/04/12 14:58:00 INFO spark.SparkEnv: Registering MapOutputTracker
17/04/12 14:58:00 INFO spark.SparkEnv: Registering BlockManagerMaster
17/04/12 14:58:00 INFO storage.DiskBlockManager: Created local directory at
/opt/spark-dir/blockmgr-db025fa5-69d3-4cdc-b403-15d2b016b955
17/04/12 14:58:00 INFO storage.MemoryStore: MemoryStore started with
capacity 511.1 MB
17/04/12 14:58:01 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/04/12 14:58:01 INFO server.Server: jetty-8.y.z-SNAPSHOT
17/04/12 14:58:01 INFO server.AbstractConnector: Started
SelectChannelConnector@0.0.0.0:4040
17/04/12 14:58:01 INFO util.Utils: Successfully started service 'SparkUI' on
port 4040.
17/04/12 14:58:01 INFO ui.SparkUI: Started SparkUI at
http://10.20.45.12:4040
17/04/12 14:58:01 INFO spark.HttpFileServer: HTTP File server directory is
/opt/spark-dir/spark-e3dfa20d-20c2-45ef-8320-8cf52a1f7ccc/httpd-8bcd23f3-e6ee-4fc7-badc-9b1d197da46f
17/04/12 14:58:01 INFO spark.HttpServer: Starting HTTP Server
17/04/12 14:58:01 INFO server.Server: jetty-8.y.z-SNAPSHOT
17/04/12 14:58:01 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:46304
17/04/12 14:58:01 INFO util.Utils: Successfully started service 'HTTP file
server' on port 46304.
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar at
http://10.20.45.12:46304/jars/htrace-core-3.0.4.jar with timestamp
1491980281297
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/hbase/lib/htrace-core-3.1.0-incubating.jar at
http://10.20.45.12:46304/jars/htrace-core-3.1.0-incubating.jar with
timestamp 1491980281301
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/hbase/lib/hbase-client-1.3.0.jar at
http://10.20.45.12:46304/jars/hbase-client-1.3.0.jar with timestamp
1491980281304
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/hbase/lib/hbase-common-1.3.0.jar at
http://10.20.45.12:46304/jars/hbase-common-1.3.0.jar with timestamp
1491980281305
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/hbase/lib/hbase-protocol-1.3.0.jar at
http://10.20.45.12:46304/jars/hbase-protocol-1.3.0.jar with timestamp
1491980281315
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/hbase/lib/metrics-core-2.2.0.jar at
http://10.20.45.12:46304/jars/metrics-core-2.2.0.jar with timestamp
1491980281316
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/hbase/lib/guava-12.0.1.jar at
http://10.20.45.12:46304/jars/guava-12.0.1.jar with timestamp 1491980281319
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/hive/lib/mysql-connector-java-5.1.41-bin.jar at
http://10.20.45.12:46304/jars/mysql-connector-java-5.1.41-bin.jar with
timestamp 1491980281321
17/04/12 14:58:01 INFO spark.SparkContext: Added JAR
file:/opt/kylin/lib/kylin-job-2.0.0-SNAPSHOT.jar at
http://10.20.45.12:46304/jars/kylin-job-2.0.0-SNAPSHOT.jar with timestamp
1491980281329
17/04/12 14:58:01 INFO client.RMProxy: Connecting to ResourceManager at
n-vsan-12/10.20.45.12:8032
17/04/12 14:58:01 INFO yarn.Client: Requesting a new application from
cluster with 5 NodeManagers
17/04/12 14:58:01 INFO yarn.Client: Verifying our application has not
requested more than the maximum memory capability of the cluster (8192 MB
per container)
17/04/12 14:58:01 INFO yarn.Client: Will allocate AM container, with 896 MB
memory including 384 MB overhead
17/04/12 14:58:01 INFO yarn.Client: Setting up container launch context for
our AM
17/04/12 14:58:01 INFO yarn.Client: Setting up the launch environment for
our AM container
17/04/12 14:58:01 INFO yarn.Client: Preparing resources for our AM container
17/04/12 14:58:02 INFO yarn.Client: Source and destination file systems are
the same. Not copying
hdfs://n-vsan-12:9000/kylin/spark/spark-assembly-1.6.3-hadoop2.6.0.jar
17/04/12 14:58:02 INFO yarn.Client: Uploading resource
file:/opt/hbase/conf/hbase-site.xml ->
hdfs://n-vsan-12:9000/user/hadoop/.sparkStaging/application_1491978153469_0004/hbase-site.xml
17/04/12 14:58:02 INFO yarn.Client: Uploading resource
file:/opt/spark-dir/spark-e3dfa20d-20c2-45ef-8320-8cf52a1f7ccc/__spark_conf__2883249325037185025.zip
->
hdfs://n-vsan-12:9000/user/hadoop/.sparkStaging/application_1491978153469_0004/__spark_conf__2883249325037185025.zip
17/04/12 14:58:02 INFO spark.SecurityManager: Changing view acls to: hadoop
17/04/12 14:58:02 INFO spark.SecurityManager: Changing modify acls to:
hadoop
17/04/12 14:58:02 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(hadoop); users with modify permissions: Set(hadoop)
17/04/12 14:58:02 INFO yarn.Client: Submitting application 4 to
ResourceManager
17/04/12 14:58:02 INFO impl.YarnClientImpl: Submitted application
application_1491978153469_0004
17/04/12 14:58:03 INFO yarn.Client: Application report for
application_1491978153469_0004 (state: ACCEPTED)
17/04/12 14:58:03 INFO yarn.Client: 
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1491980282515
	 final status: UNDEFINED
	 tracking URL: http://n-vsan-12:8088/proxy/application_1491978153469_0004/
	 user: hadoop
17/04/12 14:58:04 INFO yarn.Client: Application report for
application_1491978153469_0004 (state: ACCEPTED)
17/04/12 14:58:05 INFO yarn.Client: Application report for
application_1491978153469_0004 (state: ACCEPTED)
17/04/12 14:58:06 INFO yarn.Client: Application report for
application_1491978153469_0004 (state: ACCEPTED)
17/04/12 14:58:07 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
ApplicationMaster registered as NettyRpcEndpointRef(null)
17/04/12 14:58:07 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter.
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS
-> n-vsan-12, PROXY_URI_BASES ->
http://n-vsan-12:8088/proxy/application_1491978153469_0004),
/proxy/application_1491978153469_0004
17/04/12 14:58:07 INFO ui.JettyUtils: Adding filter:
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/04/12 14:58:07 INFO yarn.Client: Application report for
application_1491978153469_0004 (state: ACCEPTED)
17/04/12 14:58:08 INFO yarn.Client: Application report for
application_1491978153469_0004 (state: RUNNING)
17/04/12 14:58:08 INFO yarn.Client: 
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: 10.20.45.15
	 ApplicationMaster RPC port: 0
	 queue: default
	 start time: 1491980282515
	 final status: UNDEFINED
	 tracking URL: http://n-vsan-12:8088/proxy/application_1491978153469_0004/
	 user: hadoop
17/04/12 14:58:08 INFO cluster.YarnClientSchedulerBackend: Application
application_1491978153469_0004 has started running.
17/04/12 14:58:08 INFO util.Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 32917.
17/04/12 14:58:08 INFO netty.NettyBlockTransferService: Server created on
32917
17/04/12 14:58:08 INFO storage.BlockManagerMaster: Trying to register
BlockManager
17/04/12 14:58:08 INFO storage.BlockManagerMasterEndpoint: Registering block
manager 10.20.45.12:32917 with 511.1 MB RAM, BlockManagerId(driver,
10.20.45.12, 32917)
17/04/12 14:58:08 INFO storage.BlockManagerMaster: Registered BlockManager
17/04/12 14:58:09 INFO scheduler.EventLoggingListener: Logging events to
hdfs:///kylin/spark-history/application_1491978153469_0004
17/04/12 14:58:11 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
ApplicationMaster registered as NettyRpcEndpointRef(null)
17/04/12 14:58:11 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter.
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS
-> n-vsan-12, PROXY_URI_BASES ->
http://n-vsan-12:8088/proxy/application_1491978153469_0004),
/proxy/application_1491978153469_0004
17/04/12 14:58:11 INFO ui.JettyUtils: Adding filter:
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/04/12 14:58:14 ERROR cluster.YarnClientSchedulerBackend: Yarn application
has already exited with state FINISHED!
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/metrics/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/api,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/static,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/threadDump,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/environment/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/environment,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/rdd,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/pool/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/pool,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/job/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/job,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/json,null}
17/04/12 14:58:14 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs,null}
17/04/12 14:58:14 INFO ui.SparkUI: Stopped Spark web UI at
http://10.20.45.12:4040
17/04/12 14:58:14 INFO cluster.YarnClientSchedulerBackend: Shutting down all
executors
17/04/12 14:58:14 INFO cluster.YarnClientSchedulerBackend: Asking each
executor to shut down
17/04/12 14:58:14 INFO cluster.YarnClientSchedulerBackend: Stopped
17/04/12 14:58:14 INFO spark.MapOutputTrackerMasterEndpoint:
MapOutputTrackerMasterEndpoint stopped!
17/04/12 14:58:14 INFO storage.MemoryStore: MemoryStore cleared
17/04/12 14:58:14 INFO storage.BlockManager: BlockManager stopped
17/04/12 14:58:14 INFO storage.BlockManagerMaster: BlockManagerMaster
stopped
17/04/12 14:58:14 INFO
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
OutputCommitCoordinator stopped!
17/04/12 14:58:14 INFO spark.SparkContext: Successfully stopped SparkContext
17/04/12 14:58:14 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Shutting down remote daemon.
17/04/12 14:58:14 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remote daemon shut down; proceeding with flushing remote transports.
17/04/12 14:58:14 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remoting shut down.
17/04/12 14:58:31 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend
is ready for scheduling beginning after waiting
maxRegisteredResourcesWaitingTime: 30000(ms)
17/04/12 14:58:31 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.NullPointerException
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
	at
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
	at
org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:149)
	at
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/04/12 14:58:31 INFO spark.SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.RuntimeException: error execute
org.apache.kylin.engine.spark.SparkCubingByLayer
	at
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
	at
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
	at
org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:149)
	at
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
	... 10 more
17/04/12 14:58:31 INFO util.ShutdownHookManager: Shutdown hook called
17/04/12 14:58:31 INFO util.ShutdownHookManager: Deleting directory
/opt/spark-dir/spark-e3dfa20d-20c2-45ef-8320-8cf52a1f7ccc/httpd-8bcd23f3-e6ee-4fc7-badc-9b1d197da46f
17/04/12 14:58:31 INFO util.ShutdownHookManager: Deleting directory
/opt/spark-dir/spark-e3dfa20d-20c2-45ef-8320-8cf52a1f7ccc
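
Two things I notice in the log, in case they help:

1. Spark warns that SPARK_CLASSPATH (set to the MySQL connector jar) is
deprecated and applies a work-around itself. A sketch of the replacement the
warning suggests, assuming the variable comes from spark-env.sh (the jar
already reaches the executors via --jars):

    # drop the deprecated environment variable (wherever it is exported)
    unset SPARK_CLASSPATH
    # and add to the spark-submit invocation shown above:
    #   --driver-class-path /opt/hive/lib/mysql-connector-java-5.1.41-bin.jar

2. The YARN application exits with state FINISHED at 14:58:14, before the
SchedulerBackend reports ready at 14:58:31, so the NullPointerException in
the SparkContext constructor looks like a secondary symptom. The primary
failure is presumably in the YARN container logs, which (assuming log
aggregation is enabled) can be fetched with:

    yarn logs -applicationId application_1491978153469_0004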
