Posted to user@spark.apache.org by ๏̯͡๏ <ÐΞ€ρ@Ҝ>, de...@gmail.com on 2015/03/24 12:47:09 UTC

Unable to run Hive queries on Spark

Hello,

I am attempting to read an Avro file and simply count the number of
records, but when I run the program I see the exception below. I am using
Spark 1.3 and the Databricks spark-avro library to read Avro
(https://github.com/databricks/spark-avro).


Command

#On Apollo-CLI
export SPARK_HOME=/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4
export SPARK_JAR=$SPARK_HOME/lib/spark-assembly-1.3.0-hadoop2.4.0.jar
export SPARK_CLASSPATH=/apache/hadoop/share/hadoop/common/hadoop-common-2.4.1-EBAY-2.jar:/apache/hadoop/lib/hadoop-lzo-0.6.0.jar:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/yarn/lib/guava-11.0.2.jar
export HADOOP_CONF_DIR=/apache/hadoop/conf

./bin/spark-submit -v --master yarn-cluster \
  --jars /home/dvasthimal/spark1.3/spark-avro_2.10-1.0.0.jar,/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar,/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar,/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar \
  --num-executors 3 --driver-memory 8g --executor-memory 2g --executor-cores 1 \
  --queue hdmi-express --class com.ebay.ep.poc.spark.reporting.SparkApp \
  /home/dvasthimal/spark1.3/spark_reporting-1.0-SNAPSHOT.jar \
  startDate=2015-02-16 endDate=2015-02-16 \
  input=/user/dvasthimal/epdatasets/successdetail1/part-r-00000.avro \
  subcommand=successevents2 output=/user/dvasthimal/epdatasets/successdetail2

Code:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import collection.mutable.HashMap
import com.databricks.spark.avro._

    val hc = new org.apache.spark.sql.hive.HiveContext(sc)
    // spark-avro 1.0.0: load the Avro file as a DataFrame
    val successDetail_S1 = hc.avroFile(input)
    successDetail_S1.registerTempTable("success_events.sojsuccessevents1")
    val countS1 = hc.sql("select count(*) from success_events.sojsuccessevents1")
    // collect() returns Array[Row]; print the rows, not the array reference
    println(countS1.collect().mkString(","))
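
For reference, the same count boiled down to a standalone object (a minimal
sketch of how SparkApp wires this up: the explicit SparkContext, the
hard-coded input path, and the unqualified temp-table name are my
simplifications, not the original code):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import com.databricks.spark.avro._

object AvroCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("AvroCount"))
    val hc = new HiveContext(sc)
    // spark-avro 1.0.0 provides avroFile(...) on the context via the avro package import
    val events = hc.avroFile("/user/dvasthimal/epdatasets/successdetail1/part-r-00000.avro")
    // a temp table lives only in the in-memory catalog, so no Hive database qualifier is needed
    events.registerTempTable("sojsuccessevents1")
    val countDF = hc.sql("select count(*) from sojsuccessevents1")
    countDF.collect().foreach(println)
    sc.stop()
  }
}

Note that even with a temp table, HiveContext still initializes a local
metastore on first use, which is what the DataNucleus/ObjectStore lines at
the end of the log below are about.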


Exception

15/03/24 04:37:15 INFO metastore.HiveMetaStore: No user is added in admin
role, since config is empty
15/03/24 04:37:15 INFO session.SessionState: No Tez session required at
this point. hive.execution.engine=mr.
15/03/24 04:37:15 INFO session.SessionState: No Tez session required at
this point. hive.execution.engine=mr.
15/03/24 04:37:15 INFO parse.ParseDriver: Parsing command: select count(*)
from success_events.sojsuccessevents1
15/03/24 04:37:17 INFO parse.ParseDriver: Parse Completed
Exception in thread "Driver"
Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "Driver"

LogType: stdout
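
Given that the OutOfMemoryError is thrown in the driver right after the Hive
query parse completes, I am wondering if this is PermGen exhaustion in the
driver JVM rather than heap (just a guess on my part; 8g of driver heap
ought to be plenty for a count). Would adding
--conf spark.driver.extraJavaOptions=-XX:MaxPermSize=512m to the
spark-submit above be the right way to rule that out in yarn-cluster mode?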



Full Log
=======

15/03/24 04:37:21 INFO yarn.Client: Application report for
application_1426715280024_68434 (state: RUNNING)
15/03/24 04:37:22 INFO yarn.Client: Application report for
application_1426715280024_68434 (state: RUNNING)
15/03/24 04:37:23 INFO yarn.Client: Application report for
application_1426715280024_68434 (state: RUNNING)
15/03/24 04:37:24 INFO yarn.Client: Application report for
application_1426715280024_68434 (state: FAILED)
15/03/24 04:37:24 INFO yarn.Client:
 client token: N/A
 diagnostics: Application application_1426715280024_68434 failed 2 times due to AM Container for appattempt_1426715280024_68434_000002 exited with exitCode: 0 due to: .Failing this attempt.. Failing the application.
 ApplicationMaster host: N/A
 ApplicationMaster RPC port: -1
 queue: hdmi-express
 start time: 1427196892993
 final status: FAILED
 tracking URL: RM-HOST-NAME:50030/cluster/app/application_1426715280024_68434
 user: dvasthimal
Exception in thread "main" org.apache.spark.SparkException: Application finished with failed status
at org.apache.spark.deploy.yarn.Client.run(Client.scala:622)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:647)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
-sh-4.1$ /apache/hadoop/bin/yarn logs -applicationId application_1426715280024_68434
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /apache/hadoop/logs/SecurityAuth-dvasthimal.audit (Permission denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.apache.log4j.Logger.getLogger(Logger.java:104)
at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:262)
at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:108)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.commons.logging.impl.LogFactoryImpl.createLogFromClass(LogFactoryImpl.java:1025)
at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:844)
at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:169)
at org.apache.hadoop.yarn.client.cli.LogsCLI.main(LogsCLI.java:196)
log4j:ERROR Either File or DatePattern options are not set for appender [DRFAS].


Container: container_1426715280024_68434_01_000003 on CONTAINER-HOST-NAME_60288
===========================================================================================
LogType: stderr
LogLength: 6752
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/hadoop/1/scratch/local/usercache/dvasthimal/filecache/13/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/24 04:35:20 INFO executor.CoarseGrainedExecutorBackend: Registered
signal handlers for [TERM, HUP, INT]
15/03/24 04:35:22 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:35:22 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:35:22 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:35:22 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:35:22 INFO Remoting: Starting remoting
15/03/24 04:35:23 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://driverPropsFetcher@CONTAINER-HOST-NAME:35270]
15/03/24 04:35:23 INFO util.Utils: Successfully started service
'driverPropsFetcher' on port 35270.
15/03/24 04:35:23 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Shutting down remote daemon.
15/03/24 04:35:23 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remote daemon shut down; proceeding with flushing remote transports.
15/03/24 04:35:23 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:35:23 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:35:23 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:35:23 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:35:23 INFO Remoting: Starting remoting
15/03/24 04:35:23 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remoting shut down.
15/03/24 04:35:23 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkExecutor@CONTAINER-HOST-NAME:54112]
15/03/24 04:35:23 INFO util.Utils: Successfully started service
'sparkExecutor' on port 54112.
15/03/24 04:35:24 INFO util.AkkaUtils: Connecting to MapOutputTracker:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/MapOutputTracker
15/03/24 04:35:24 INFO util.AkkaUtils: Connecting to BlockManagerMaster:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/BlockManagerMaster
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-e26411e4-4823-4d92-bb24-2a718447dae8
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/2/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-525b7109-2017-4490-af78-497d7592b05d
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/3/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-cda9b253-6568-4be4-8821-e5a3e473d167
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/4/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-9efd5d6c-325a-4407-93f8-9351d37efc38
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/5/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-927870f8-f3f7-48ed-9d45-4808bbbbf43a
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/6/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-5fe365a5-33b8-42bd-bb04-82c3523a817b
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/7/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-207e5572-ca34-4806-8c2f-4d50b8174cb0
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/8/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-2bf45c33-e83c-4938-962e-0b9990b5f780
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/9/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-1a401137-af27-4245-bb95-40a04464e861
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/10/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-47d2da0b-b2ee-4c1e-8c8b-c6095024b92b
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/11/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-4afc363d-e7ba-425f-b9d4-ae967ade973e
15/03/24 04:35:24 INFO storage.DiskBlockManager: Created local directory at
/hadoop/12/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-1fb81248-8a0a-472f-9f3d-84c65a4f2de2
15/03/24 04:35:24 INFO storage.MemoryStore: MemoryStore started with
capacity 1060.3 MB
15/03/24 04:35:24 INFO util.AkkaUtils: Connecting to
OutputCommitCoordinator: akka.tcp://sparkDriver@SOME-HOST-NAME
:48318/user/OutputCommitCoordinator
15/03/24 04:35:24 INFO executor.CoarseGrainedExecutorBackend: Connecting to
driver: akka.tcp://sparkDriver@SOME-HOST-NAME
:48318/user/CoarseGrainedScheduler
15/03/24 04:35:24 INFO executor.CoarseGrainedExecutorBackend: Successfully
registered with driver
15/03/24 04:35:24 INFO executor.Executor: Starting executor ID 1 on host
CONTAINER-HOST-NAME
15/03/24 04:35:25 INFO netty.NettyBlockTransferService: Server created on
59851
15/03/24 04:35:25 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/03/24 04:35:25 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/24 04:35:25 INFO util.AkkaUtils: Connecting to HeartbeatReceiver:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/HeartbeatReceiver
15/03/24 04:36:00 ERROR executor.CoarseGrainedExecutorBackend: Driver
Disassociated [akka.tcp://sparkExecutor@CONTAINER-HOST-NAME:54112] ->
[akka.tcp://sparkDriver@SOME-HOST-NAME:48318] disassociated! Shutting down.
15/03/24 04:36:00 WARN remote.ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkDriver@SOME-HOST-NAME:48318] has failed,
address is now gated for [5000] ms. Reason is: [Disassociated].

LogType: stdout
LogLength: 0
Log Contents:



Container: container_1426715280024_68434_02_000008 on SOME-HOST-NAME_35688
===========================================================================================
LogType: stderr
LogLength: 6544
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/hadoop/6/scratch/local/usercache/dvasthimal/filecache/13/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/24 04:36:40 INFO executor.CoarseGrainedExecutorBackend: Registered
signal handlers for [TERM, HUP, INT]
15/03/24 04:36:41 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:36:41 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:36:41 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:36:42 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:36:42 INFO Remoting: Starting remoting
15/03/24 04:36:42 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://driverPropsFetcher@SOME-HOST-NAME:51402]
15/03/24 04:36:42 INFO util.Utils: Successfully started service
'driverPropsFetcher' on port 51402.
15/03/24 04:36:42 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Shutting down remote daemon.
15/03/24 04:36:42 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:36:42 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:36:42 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:36:42 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remote daemon shut down; proceeding with flushing remote transports.
15/03/24 04:36:42 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remoting shut down.
15/03/24 04:36:42 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:36:42 INFO Remoting: Starting remoting
15/03/24 04:36:42 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkExecutor@SOME-HOST-NAME:34891]
15/03/24 04:36:42 INFO util.Utils: Successfully started service
'sparkExecutor' on port 34891.
15/03/24 04:36:42 INFO util.AkkaUtils: Connecting to MapOutputTracker:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/MapOutputTracker
15/03/24 04:36:42 INFO util.AkkaUtils: Connecting to BlockManagerMaster:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/BlockManagerMaster
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/2/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-c4c6a0d4-1ef6-4432-88df-532883bf86ad
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/3/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-7f091d4c-4e33-4686-bb74-6bddfb042d1e
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/4/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-b198020a-c78c-47c8-833e-810a35468df3
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/5/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-78bcbd57-8439-48ee-a91a-d4f69345148c
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/6/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-4909b68c-32f4-4e11-b9a2-ed9964f95536
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/7/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-384d50c4-b5bb-4700-8368-b741dc512fa9
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/8/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-a0e8b330-589d-496d-a1d6-6844678c0871
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/9/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-6b1bd239-8c1a-4104-9d51-de789bcaa95b
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/10/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-31b41976-74d5-4648-ae8c-e4370a598f4e
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/11/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-821f9555-e390-45b9-b3c5-a2bf68f9b30e
15/03/24 04:36:42 INFO storage.DiskBlockManager: Created local directory at
/hadoop/12/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-8172b9ec-2bd2-414a-b484-6d5eb9406399
15/03/24 04:36:42 INFO storage.MemoryStore: MemoryStore started with
capacity 1060.3 MB
15/03/24 04:36:43 INFO util.AkkaUtils: Connecting to
OutputCommitCoordinator: akka.tcp://sparkDriver@SOME-HOST-NAME
:59962/user/OutputCommitCoordinator
15/03/24 04:36:43 INFO executor.CoarseGrainedExecutorBackend: Connecting to
driver: akka.tcp://sparkDriver@SOME-HOST-NAME
:59962/user/CoarseGrainedScheduler
15/03/24 04:36:43 INFO executor.CoarseGrainedExecutorBackend: Successfully
registered with driver
15/03/24 04:36:43 INFO executor.Executor: Starting executor ID 3 on host
SOME-HOST-NAME
15/03/24 04:36:43 INFO netty.NettyBlockTransferService: Server created on
56109
15/03/24 04:36:43 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/03/24 04:36:43 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/24 04:36:43 INFO util.AkkaUtils: Connecting to HeartbeatReceiver:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/HeartbeatReceiver
15/03/24 04:37:23 ERROR executor.CoarseGrainedExecutorBackend: Driver
Disassociated [akka.tcp://sparkExecutor@SOME-HOST-NAME:34891] ->
[akka.tcp://sparkDriver@SOME-HOST-NAME:59962] disassociated! Shutting down.
15/03/24 04:37:23 WARN remote.ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkDriver@SOME-HOST-NAME:59962] has failed,
address is now gated for [5000] ms. Reason is: [Disassociated].

LogType: stdout
LogLength: 0
Log Contents:



Container: container_1426715280024_68434_02_000005 on SOME-HOST-NAME_33114
===================================================================================================
LogType: stderr
LogLength: 6784
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/hadoop/4/scratch/local/usercache/dvasthimal/filecache/19/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/24 04:36:43 INFO executor.CoarseGrainedExecutorBackend: Registered
signal handlers for [TERM, HUP, INT]
15/03/24 04:36:44 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:36:44 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:36:44 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:36:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:36:44 INFO Remoting: Starting remoting
15/03/24 04:36:45 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://driverPropsFetcher@SOME-HOST-NAME:33922]
15/03/24 04:36:45 INFO util.Utils: Successfully started service
'driverPropsFetcher' on port 33922.
15/03/24 04:36:45 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Shutting down remote daemon.
15/03/24 04:36:45 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remote daemon shut down; proceeding with flushing remote transports.
15/03/24 04:36:45 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:36:45 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:36:45 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:36:45 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:36:45 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remoting shut down.
15/03/24 04:36:45 INFO Remoting: Starting remoting
15/03/24 04:36:45 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkExecutor@SOME-HOST-NAME:39571]
15/03/24 04:36:45 INFO util.Utils: Successfully started service
'sparkExecutor' on port 39571.
15/03/24 04:36:45 INFO util.AkkaUtils: Connecting to MapOutputTracker:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/MapOutputTracker
15/03/24 04:36:45 INFO util.AkkaUtils: Connecting to BlockManagerMaster:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/BlockManagerMaster
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-b505cf2d-0546-4dfe-8fb5-83493f38a684
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/2/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-57d6e2b7-a333-4827-9be3-d2abfa9b5005
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/3/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-579bc30c-595d-47b9-8326-d9204c692d73
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/4/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-3cb81fa6-8e36-4164-bd7c-d85b879891ff
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/5/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-997e120c-7713-4f8f-a78a-7a90772536c7
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/6/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-7b2873b4-b7fd-477c-86be-3b66ce26e36f
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/7/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-6a07a09e-2178-4c57-a96d-eac271330466
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/8/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-466f74cc-595b-405f-bfaf-76c072d3fa75
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/9/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-698b20cf-eb3a-434f-b02d-58791618aa7b
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/10/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-26d6fa34-794c-4c9f-bc38-79d9eb1523b0
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/11/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-e0bc93ff-70f2-4830-835e-9471d4263b00
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/12/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-b6b5ff23-317e-4c3a-a5a4-b136f1d9a5b8
15/03/24 04:36:45 INFO storage.MemoryStore: MemoryStore started with
capacity 1060.3 MB
15/03/24 04:36:46 INFO util.AkkaUtils: Connecting to
OutputCommitCoordinator: akka.tcp://sparkDriver@SOME-HOST-NAME
:59962/user/OutputCommitCoordinator
15/03/24 04:36:46 INFO executor.CoarseGrainedExecutorBackend: Connecting to
driver: akka.tcp://sparkDriver@SOME-HOST-NAME
:59962/user/CoarseGrainedScheduler
15/03/24 04:36:46 INFO executor.CoarseGrainedExecutorBackend: Successfully
registered with driver
15/03/24 04:36:46 INFO executor.Executor: Starting executor ID 2 on host
SOME-HOST-NAME
15/03/24 04:36:46 INFO netty.NettyBlockTransferService: Server created on
40478
15/03/24 04:36:46 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/03/24 04:36:46 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/24 04:36:46 INFO util.AkkaUtils: Connecting to HeartbeatReceiver:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/HeartbeatReceiver
15/03/24 04:37:23 ERROR executor.CoarseGrainedExecutorBackend: Driver
Disassociated [akka.tcp://sparkExecutor@SOME-HOST-NAME:39571] ->
[akka.tcp://sparkDriver@SOME-HOST-NAME:59962] disassociated! Shutting down.
15/03/24 04:37:23 WARN remote.ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkDriver@SOME-HOST-NAME:59962] has failed,
address is now gated for [5000] ms. Reason is: [Disassociated].

LogType: stdout
LogLength: 0
Log Contents:



Container: container_1426715280024_68434_02_000004 on SOME-HOST-NAME_34510
===================================================================================================
LogType: stderr
LogLength: 6785
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/hadoop/11/scratch/local/usercache/dvasthimal/filecache/13/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/24 04:36:41 INFO executor.CoarseGrainedExecutorBackend: Registered
signal handlers for [TERM, HUP, INT]
15/03/24 04:36:42 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:36:42 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:36:42 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:36:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:36:44 INFO Remoting: Starting remoting
15/03/24 04:36:44 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://driverPropsFetcher@SOME-HOST-NAME:47070]
15/03/24 04:36:44 INFO util.Utils: Successfully started service
'driverPropsFetcher' on port 47070.
15/03/24 04:36:45 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:36:45 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:36:45 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:36:45 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Shutting down remote daemon.
15/03/24 04:36:45 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remote daemon shut down; proceeding with flushing remote transports.
15/03/24 04:36:45 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:36:45 INFO Remoting: Starting remoting
15/03/24 04:36:45 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remoting shut down.
15/03/24 04:36:45 INFO util.Utils: Successfully started service
'sparkExecutor' on port 60898.
15/03/24 04:36:45 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkExecutor@SOME-HOST-NAME:60898]
15/03/24 04:36:45 INFO util.AkkaUtils: Connecting to MapOutputTracker:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/MapOutputTracker
15/03/24 04:36:45 INFO util.AkkaUtils: Connecting to BlockManagerMaster:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/BlockManagerMaster
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-e9292d93-1f34-44fa-a4f3-de1060ce748c
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/2/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-85969064-c151-4a47-adc0-920ceb452370
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/3/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-ebfaa9ad-17db-44d4-9ce5-bb03a5ba14e1
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/4/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-da4fe676-b7f7-4b91-8853-2167691fd40a
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/5/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-4ee40d66-6b0b-484e-aca2-af1e6713052c
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/6/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-594e63c9-6813-45d3-871c-f5095da1aed9
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/7/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-2db1879f-481c-48ec-aaba-1b0df4351cf6
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/8/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-f12a2e6e-416e-447c-9e32-764efbc4344f
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/9/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-8d148837-256e-4196-8160-c921cf8ebc4f
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/10/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-bcb9d5c4-2de8-4c63-af43-1dcb5b79c5ba
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/11/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-3aa57718-99f4-4e53-9299-e784780d1be3
15/03/24 04:36:45 INFO storage.DiskBlockManager: Created local directory at
/hadoop/12/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-dd3f8181-6a5a-4fbf-aba6-3a0756a8c840
15/03/24 04:36:45 INFO storage.MemoryStore: MemoryStore started with
capacity 1060.3 MB
15/03/24 04:36:46 INFO util.AkkaUtils: Connecting to
OutputCommitCoordinator: akka.tcp://sparkDriver@SOME-HOST-NAME
:59962/user/OutputCommitCoordinator
15/03/24 04:36:46 INFO executor.CoarseGrainedExecutorBackend: Connecting to
driver: akka.tcp://sparkDriver@SOME-HOST-NAME
:59962/user/CoarseGrainedScheduler
15/03/24 04:36:46 INFO executor.CoarseGrainedExecutorBackend: Successfully
registered with driver
15/03/24 04:36:46 INFO executor.Executor: Starting executor ID 1 on host
SOME-HOST-NAME
15/03/24 04:36:47 INFO netty.NettyBlockTransferService: Server created on
52674
15/03/24 04:36:47 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/03/24 04:36:47 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/24 04:36:47 INFO util.AkkaUtils: Connecting to HeartbeatReceiver:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/HeartbeatReceiver
15/03/24 04:37:23 ERROR executor.CoarseGrainedExecutorBackend: Driver
Disassociated [akka.tcp://sparkExecutor@SOME-HOST-NAME:60898] ->
[akka.tcp://sparkDriver@SOME-HOST-NAME:59962] disassociated! Shutting down.
15/03/24 04:37:23 WARN remote.ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkDriver@SOME-HOST-NAME:59962] has failed,
address is now gated for [5000] ms. Reason is: [Disassociated].

LogType: stdout
LogLength: 0
Log Contents:



Container: container_1426715280024_68434_02_000002 on SOME-HOST-NAME_53429
===================================================================================================
LogType: stderr
LogLength: 29235
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/hadoop/11/scratch/local/usercache/dvasthimal/filecache/13/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/24 04:36:24 INFO yarn.ApplicationMaster: Registered signal handlers
for [TERM, HUP, INT]
15/03/24 04:36:25 INFO yarn.ApplicationMaster: ApplicationAttemptId:
appattempt_1426715280024_68434_000002
15/03/24 04:36:26 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
15/03/24 04:36:26 WARN hdfs.BlockReaderLocal: The short-circuit local reads
feature cannot be used because libhadoop cannot be loaded.
15/03/24 04:36:26 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:36:26 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:36:26 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:36:26 INFO yarn.ApplicationMaster: Starting the user
application in a separate Thread
15/03/24 04:36:26 INFO yarn.ApplicationMaster: Waiting for spark context
initialization
15/03/24 04:36:26 INFO yarn.ApplicationMaster: Waiting for spark context
initialization ...
15/03/24 04:36:26 INFO spark.SparkContext: Running Spark version 1.3.0
15/03/24 04:36:26 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:36:26 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:36:26 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:36:27 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:36:27 INFO Remoting: Starting remoting
15/03/24 04:36:27 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkDriver@SOME-HOST-NAME:59962]
15/03/24 04:36:27 INFO util.Utils: Successfully started service
'sparkDriver' on port 59962.
15/03/24 04:36:27 INFO spark.SparkEnv: Registering MapOutputTracker
15/03/24 04:36:27 INFO spark.SparkEnv: Registering BlockManagerMaster
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-f0f33b1c-b4af-4436-87b3-744510f8e487
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/2/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-ae03be01-c037-4ac3-bdd3-94b461947da0
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/3/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-d618a72c-7b01-4dbc-bb2d-8bf76c96dee8
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/4/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-0cb9d815-1cb0-4293-b6fc-cabc921b694a
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/5/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-3ccc3e4b-e92a-40bd-9b75-de34db2849a6
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/6/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-ce771408-47a4-42d3-8356-148ac8d1a5e8
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/7/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-ef71328f-c9e9-4a23-a076-b11e42a1b3df
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/8/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-4665e7e8-0e74-4b42-8c2e-302a7e691c3f
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/9/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-90d60981-c48a-4767-b60d-ee2401316e5f
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/11/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-544d5887-8149-4e7d-9d45-dfb16fd6f9a3
15/03/24 04:36:27 INFO storage.DiskBlockManager: Created local directory at
/hadoop/12/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-4fe015a6-3795-4abd-9c6e-9086a73d6916
15/03/24 04:36:27 INFO storage.MemoryStore: MemoryStore started with
capacity 3.8 GB
15/03/24 04:36:28 INFO spark.HttpFileServer: HTTP File server directory is
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/httpd-27a7058f-29c5-4565-bfff-2a7ff5167666
15/03/24 04:36:28 INFO spark.HttpServer: Starting HTTP Server
15/03/24 04:36:28 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/24 04:36:28 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:46374
15/03/24 04:36:28 INFO util.Utils: Successfully started service 'HTTP file
server' on port 46374.
15/03/24 04:36:28 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/03/24 04:36:28 INFO ui.JettyUtils: Adding filter:
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/03/24 04:36:28 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/24 04:36:28 INFO server.AbstractConnector: Started
SelectChannelConnector@0.0.0.0:53028
15/03/24 04:36:28 INFO util.Utils: Successfully started service 'SparkUI'
on port 53028.
15/03/24 04:36:28 INFO ui.SparkUI: Started SparkUI at
http://SOME-HOST-NAME:53028
15/03/24 04:36:28 INFO cluster.YarnClusterScheduler: Created
YarnClusterScheduler
15/03/24 04:36:29 INFO netty.NettyBlockTransferService: Server created on
39754
15/03/24 04:36:29 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/03/24 04:36:29 INFO storage.BlockManagerMasterActor: Registering block
manager SOME-HOST-NAME:39754 with 3.8 GB RAM, BlockManagerId(<driver>,
SOME-HOST-NAME, 39754)
15/03/24 04:36:29 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/24 04:36:29 INFO yarn.ApplicationMaster: Listen to driver:
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/YarnScheduler
15/03/24 04:36:29 INFO cluster.YarnClusterSchedulerBackend:
ApplicationMaster registered as
Actor[akka://sparkDriver/user/YarnAM#-1547399133]
15/03/24 04:36:29 INFO yarn.YarnRMClient: Registering the ApplicationMaster
15/03/24 04:36:29 INFO yarn.YarnAllocator: Will request 3 executor
containers, each with 1 cores and 2432 MB memory including 384 MB overhead
15/03/24 04:36:29 INFO yarn.YarnAllocator: Container request (host: Any,
capability: <memory:2432, vCores:1>)
15/03/24 04:36:29 INFO yarn.YarnAllocator: Container request (host: Any,
capability: <memory:2432, vCores:1>)
15/03/24 04:36:29 INFO yarn.YarnAllocator: Container request (host: Any,
capability: <memory:2432, vCores:1>)
15/03/24 04:36:29 INFO yarn.ApplicationMaster: Started progress reporter
thread - sleep time : 5000
15/03/24 04:36:29 INFO impl.AMRMClientImpl: Received new token for :
SOME-HOST-NAME:34510
15/03/24 04:36:29 INFO yarn.YarnAllocator: Launching container
container_1426715280024_68434_02_000004 for on host SOME-HOST-NAME
15/03/24 04:36:29 INFO yarn.YarnAllocator: Launching ExecutorRunnable.
driverUrl: akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/CoarseGrainedScheduler,
 executorHostname: SOME-HOST-NAME
15/03/24 04:36:29 INFO yarn.ExecutorRunnable: Starting Executor Container
15/03/24 04:36:29 INFO yarn.YarnAllocator: Received 1 containers from YARN,
launching executors on 1 of them.
15/03/24 04:36:29 INFO impl.ContainerManagementProtocolProxy:
yarn.client.max-nodemanagers-proxies : 500
15/03/24 04:36:29 INFO yarn.ExecutorRunnable: Setting up
ContainerLaunchContext
15/03/24 04:36:29 INFO yarn.ExecutorRunnable: Preparing Local resources
15/03/24 04:36:29 INFO yarn.ExecutorRunnable: Prepared Local resources
Map(__app__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port:
8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar"
} size: 144485 timestamp: 1427196892676 type: FILE visibility: PRIVATE,
__spark__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port: 8020
file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar"
} size: 159319006 timestamp: 1427196892503 type: FILE visibility: PRIVATE,
datanucleus-core-3.2.10.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar"
} size: 1890075 timestamp: 1427196892807 type: FILE visibility: PRIVATE,
datanucleus-api-jdo-3.2.6.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar"
} size: 339666 timestamp: 1427196892748 type: FILE visibility: PRIVATE,
spark-avro_2.10-1.0.0.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME"
port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar"
} size: 73413 timestamp: 1427196892708 type: FILE visibility: PRIVATE,
datanucleus-rdbms-3.2.9.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar"
} size: 1809447 timestamp: 1427196892892 type: FILE visibility: PRIVATE)
15/03/24 04:36:29 INFO yarn.ExecutorRunnable: Setting up executor with
environment: Map(CLASSPATH ->
{{PWD}}<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/*<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,
SPARK_LOG_URL_STDERR ->
http://SOME-HOST-NAME:50060/node/containerlogs/container_1426715280024_68434_02_000004/dvasthimal/stderr?start=0,
SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1426715280024_68434,
SPARK_YARN_CACHE_FILES_FILE_SIZES ->
159319006,144485,73413,339666,1890075,1809447, SPARK_USER -> dvasthimal,
SPARK_YARN_CACHE_FILES_VISIBILITIES ->
PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE, SPARK_YARN_MODE -> true,
SPARK_YARN_CACHE_FILES_TIME_STAMPS ->
1427196892503,1427196892676,1427196892708,1427196892748,1427196892807,1427196892892,
SPARK_LOG_URL_STDOUT ->
http://SOME-HOST-NAME:50060/node/containerlogs/container_1426715280024_68434_02_000004/dvasthimal/stdout?start=0,
SPARK_YARN_CACHE_FILES ->
hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar#__spark__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar#__app__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar#spark-avro_2.10-1.0.0.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar#datanucleus-api-jdo-3.2.6.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar#datanucleus-core-3.2.10.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar#datanucleus-rdbms-3.2.9.jar)
15/03/24 04:36:29 INFO yarn.ExecutorRunnable: Setting up executor with
commands: List({{JAVA_HOME}}/bin/java, -server,
-XX:OnOutOfMemoryError='kill %p', -Xms2048m, -Xmx2048m,
-Djava.io.tmpdir={{PWD}}/tmp, '-Dspark.ui.port=0',
'-Dspark.driver.port=59962', -Dspark.yarn.app.container.log.dir=<LOG_DIR>,
org.apache.spark.executor.CoarseGrainedExecutorBackend, --driver-url,
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/CoarseGrainedScheduler,
--executor-id, 1, --hostname, SOME-HOST-NAME, --cores, 1, --app-id,
application_1426715280024_68434, --user-class-path, file:$PWD/__app__.jar,
--user-class-path, file:$PWD/spark-avro_2.10-1.0.0.jar, --user-class-path,
file:$PWD/datanucleus-api-jdo-3.2.6.jar, --user-class-path,
file:$PWD/datanucleus-core-3.2.10.jar, --user-class-path,
file:$PWD/datanucleus-rdbms-3.2.9.jar, 1>, <LOG_DIR>/stdout, 2>,
<LOG_DIR>/stderr)
15/03/24 04:36:29 INFO impl.ContainerManagementProtocolProxy: Opening proxy
: SOME-HOST-NAME:34510
15/03/24 04:36:34 INFO impl.AMRMClientImpl: Received new token for :
SOME-HOST-NAME:33114
15/03/24 04:36:34 INFO impl.AMRMClientImpl: Received new token for :
SOME-HOST-NAME:35688
15/03/24 04:36:34 INFO yarn.YarnAllocator: Launching container
container_1426715280024_68434_02_000005 for on host SOME-HOST-NAME
15/03/24 04:36:34 INFO yarn.YarnAllocator: Launching ExecutorRunnable.
driverUrl: akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/CoarseGrainedScheduler,
 executorHostname: SOME-HOST-NAME
15/03/24 04:36:34 INFO yarn.YarnAllocator: Launching container
container_1426715280024_68434_02_000008 for on host SOME-HOST-NAME
15/03/24 04:36:34 INFO yarn.YarnAllocator: Launching ExecutorRunnable.
driverUrl: akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/CoarseGrainedScheduler,
 executorHostname: SOME-HOST-NAME
15/03/24 04:36:34 INFO yarn.YarnAllocator: Received 2 containers from YARN,
launching executors on 2 of them.
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Starting Executor Container
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Starting Executor Container
15/03/24 04:36:34 INFO impl.ContainerManagementProtocolProxy:
yarn.client.max-nodemanagers-proxies : 500
15/03/24 04:36:34 INFO impl.ContainerManagementProtocolProxy:
yarn.client.max-nodemanagers-proxies : 500
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Setting up
ContainerLaunchContext
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Setting up
ContainerLaunchContext
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Preparing Local resources
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Preparing Local resources
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Prepared Local resources
Map(__app__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port:
8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar"
} size: 144485 timestamp: 1427196892676 type: FILE visibility: PRIVATE,
__spark__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port: 8020
file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar"
} size: 159319006 timestamp: 1427196892503 type: FILE visibility: PRIVATE,
datanucleus-core-3.2.10.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar"
} size: 1890075 timestamp: 1427196892807 type: FILE visibility: PRIVATE,
datanucleus-api-jdo-3.2.6.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar"
} size: 339666 timestamp: 1427196892748 type: FILE visibility: PRIVATE,
spark-avro_2.10-1.0.0.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME"
port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar"
} size: 73413 timestamp: 1427196892708 type: FILE visibility: PRIVATE,
datanucleus-rdbms-3.2.9.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar"
} size: 1809447 timestamp: 1427196892892 type: FILE visibility: PRIVATE)
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Prepared Local resources
Map(__app__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port:
8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar"
} size: 144485 timestamp: 1427196892676 type: FILE visibility: PRIVATE,
__spark__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port: 8020
file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar"
} size: 159319006 timestamp: 1427196892503 type: FILE visibility: PRIVATE,
datanucleus-core-3.2.10.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar"
} size: 1890075 timestamp: 1427196892807 type: FILE visibility: PRIVATE,
datanucleus-api-jdo-3.2.6.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar"
} size: 339666 timestamp: 1427196892748 type: FILE visibility: PRIVATE,
spark-avro_2.10-1.0.0.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME"
port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar"
} size: 73413 timestamp: 1427196892708 type: FILE visibility: PRIVATE,
datanucleus-rdbms-3.2.9.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar"
} size: 1809447 timestamp: 1427196892892 type: FILE visibility: PRIVATE)
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Setting up executor with
environment: Map(CLASSPATH ->
{{PWD}}<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/*<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,
SPARK_LOG_URL_STDERR ->
http://SOME-HOST-NAME:50060/node/containerlogs/container_1426715280024_68434_02_000008/dvasthimal/stderr?start=0,
SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1426715280024_68434,
SPARK_YARN_CACHE_FILES_FILE_SIZES ->
159319006,144485,73413,339666,1890075,1809447, SPARK_USER -> dvasthimal,
SPARK_YARN_CACHE_FILES_VISIBILITIES ->
PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE, SPARK_YARN_MODE -> true,
SPARK_YARN_CACHE_FILES_TIME_STAMPS ->
1427196892503,1427196892676,1427196892708,1427196892748,1427196892807,1427196892892,
SPARK_LOG_URL_STDOUT ->
http://SOME-HOST-NAME:50060/node/containerlogs/container_1426715280024_68434_02_000008/dvasthimal/stdout?start=0,
SPARK_YARN_CACHE_FILES ->
hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar#__spark__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar#__app__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar#spark-avro_2.10-1.0.0.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar#datanucleus-api-jdo-3.2.6.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar#datanucleus-core-3.2.10.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar#datanucleus-rdbms-3.2.9.jar)
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Setting up executor with
commands: List({{JAVA_HOME}}/bin/java, -server,
-XX:OnOutOfMemoryError='kill %p', -Xms2048m, -Xmx2048m,
-Djava.io.tmpdir={{PWD}}/tmp, '-Dspark.ui.port=0',
'-Dspark.driver.port=59962', -Dspark.yarn.app.container.log.dir=<LOG_DIR>,
org.apache.spark.executor.CoarseGrainedExecutorBackend, --driver-url,
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/CoarseGrainedScheduler,
--executor-id, 3, --hostname, SOME-HOST-NAME, --cores, 1, --app-id,
application_1426715280024_68434, --user-class-path, file:$PWD/__app__.jar,
--user-class-path, file:$PWD/spark-avro_2.10-1.0.0.jar, --user-class-path,
file:$PWD/datanucleus-api-jdo-3.2.6.jar, --user-class-path,
file:$PWD/datanucleus-core-3.2.10.jar, --user-class-path,
file:$PWD/datanucleus-rdbms-3.2.9.jar, 1>, <LOG_DIR>/stdout, 2>,
<LOG_DIR>/stderr)
15/03/24 04:36:34 INFO impl.ContainerManagementProtocolProxy: Opening proxy
: SOME-HOST-NAME:35688
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Setting up executor with
environment: Map(CLASSPATH ->
{{PWD}}<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/*<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,
SPARK_LOG_URL_STDERR ->
http://SOME-HOST-NAME:50060/node/containerlogs/container_1426715280024_68434_02_000005/dvasthimal/stderr?start=0,
SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1426715280024_68434,
SPARK_YARN_CACHE_FILES_FILE_SIZES ->
159319006,144485,73413,339666,1890075,1809447, SPARK_USER -> dvasthimal,
SPARK_YARN_CACHE_FILES_VISIBILITIES ->
PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE, SPARK_YARN_MODE -> true,
SPARK_YARN_CACHE_FILES_TIME_STAMPS ->
1427196892503,1427196892676,1427196892708,1427196892748,1427196892807,1427196892892,
SPARK_LOG_URL_STDOUT ->
http://SOME-HOST-NAME:50060/node/containerlogs/container_1426715280024_68434_02_000005/dvasthimal/stdout?start=0,
SPARK_YARN_CACHE_FILES ->
hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar#__spark__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar#__app__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar#spark-avro_2.10-1.0.0.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar#datanucleus-api-jdo-3.2.6.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar#datanucleus-core-3.2.10.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar#datanucleus-rdbms-3.2.9.jar)
15/03/24 04:36:34 INFO yarn.ExecutorRunnable: Setting up executor with
commands: List({{JAVA_HOME}}/bin/java, -server,
-XX:OnOutOfMemoryError='kill %p', -Xms2048m, -Xmx2048m,
-Djava.io.tmpdir={{PWD}}/tmp, '-Dspark.ui.port=0',
'-Dspark.driver.port=59962', -Dspark.yarn.app.container.log.dir=<LOG_DIR>,
org.apache.spark.executor.CoarseGrainedExecutorBackend, --driver-url,
akka.tcp://sparkDriver@SOME-HOST-NAME:59962/user/CoarseGrainedScheduler,
--executor-id, 2, --hostname, SOME-HOST-NAME, --cores, 1, --app-id,
application_1426715280024_68434, --user-class-path, file:$PWD/__app__.jar,
--user-class-path, file:$PWD/spark-avro_2.10-1.0.0.jar, --user-class-path,
file:$PWD/datanucleus-api-jdo-3.2.6.jar, --user-class-path,
file:$PWD/datanucleus-core-3.2.10.jar, --user-class-path,
file:$PWD/datanucleus-rdbms-3.2.9.jar, 1>, <LOG_DIR>/stdout, 2>,
<LOG_DIR>/stderr)
15/03/24 04:36:34 INFO impl.ContainerManagementProtocolProxy: Opening proxy
: SOME-HOST-NAME:33114
15/03/24 04:36:39 INFO impl.AMRMClientImpl: Received new token for :
SOME-HOST-NAME:54087
15/03/24 04:36:39 INFO yarn.YarnAllocator: Received 1 containers from YARN,
launching executors on 0 of them.
15/03/24 04:36:43 INFO cluster.YarnClusterSchedulerBackend: Registered
executor: Actor[akka.tcp://sparkExecutor@SOME-HOST-NAME:34891/user/Executor#868744931]
with ID 3
15/03/24 04:36:43 INFO storage.BlockManagerMasterActor: Registering block
manager SOME-HOST-NAME:56109 with 1060.3 MB RAM, BlockManagerId(3,
SOME-HOST-NAME, 56109)
15/03/24 04:36:46 INFO cluster.YarnClusterSchedulerBackend: Registered
executor: Actor[akka.tcp://sparkExecutor@SOME-HOST-NAME:39571/user/Executor#-859976043]
with ID 2
15/03/24 04:36:46 INFO storage.BlockManagerMasterActor: Registering block
manager SOME-HOST-NAME:40478 with 1060.3 MB RAM, BlockManagerId(2,
SOME-HOST-NAME, 40478)
15/03/24 04:36:46 INFO cluster.YarnClusterSchedulerBackend: Registered
executor: Actor[akka.tcp://sparkExecutor@SOME-HOST-NAME:60898/user/Executor#1920659669]
with ID 1
15/03/24 04:36:46 INFO cluster.YarnClusterSchedulerBackend:
SchedulerBackend is ready for scheduling beginning after reached
minRegisteredResourcesRatio: 0.8
15/03/24 04:36:46 INFO cluster.YarnClusterScheduler:
YarnClusterScheduler.postStartHook done
15/03/24 04:36:47 INFO storage.BlockManagerMasterActor: Registering block
manager SOME-HOST-NAME:52674 with 1060.3 MB RAM, BlockManagerId(1,
SOME-HOST-NAME, 52674)
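
(A side note on the MemoryStore figures above: the 1060.3 MB per executor and
the 3.8 GB on the driver are not the raw -Xmx values. In Spark 1.x the storage
region is carved out of the JVM's reported max heap. A minimal sketch of the
arithmetic, assuming the 1.x defaults spark.storage.memoryFraction=0.6 and
spark.storage.safetyFraction=0.9:

    // Sketch of Spark 1.x MemoryStore sizing, assuming default fractions.
    val maxHeapMb = Runtime.getRuntime.maxMemory / (1024 * 1024) // ~1963 MB for -Xmx2048m
    val storageMb = (maxHeapMb * 0.6 * 0.9).toLong               // ~1060 MB, matching the log

The same arithmetic on the 8g driver heap gives roughly the 3.8 GB above, so
the storage region itself looks healthy; the OOM later in this log is likely
coming from somewhere else.)
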
15/03/24 04:36:49 INFO metastore.HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/03/24 04:36:49 INFO metastore.ObjectStore: ObjectStore, initialize called
15/03/24 04:36:50 INFO DataNucleus.Persistence: Property
datanucleus.cache.level2 unknown - will be ignored
15/03/24 04:36:50 INFO DataNucleus.Persistence: Property
hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/03/24 04:37:01 INFO metastore.ObjectStore: Setting MetaStore object pin
classes with
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/03/24 04:37:02 INFO metastore.MetaStoreDirectSql: MySQL check failed,
assuming we are not on mysql: Lexical error at line 1, column 5.
Encountered: "@" (64), after : "".
15/03/24 04:37:04 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
"embedded-only" so does not have its own datastore table.
15/03/24 04:37:04 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
15/03/24 04:37:11 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
"embedded-only" so does not have its own datastore table.
15/03/24 04:37:11 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
15/03/24 04:37:13 INFO metastore.ObjectStore: Initialized ObjectStore
15/03/24 04:37:13 WARN metastore.ObjectStore: Version information not found
in metastore. hive.metastore.schema.verification is not enabled so
recording the schema version 0.13.1aa
15/03/24 04:37:14 INFO metastore.HiveMetaStore: Added admin role in
metastore
15/03/24 04:37:14 INFO metastore.HiveMetaStore: Added public role in
metastore
15/03/24 04:37:15 INFO metastore.HiveMetaStore: No user is added in admin
role, since config is empty
15/03/24 04:37:15 INFO session.SessionState: No Tez session required at
this point. hive.execution.engine=mr.
15/03/24 04:37:15 INFO session.SessionState: No Tez session required at
this point. hive.execution.engine=mr.
15/03/24 04:37:15 INFO parse.ParseDriver: Parsing command: select count(*)
from success_events.sojsuccessevents1
15/03/24 04:37:17 INFO parse.ParseDriver: Parse Completed
Exception in thread "Driver"
Exception: java.lang.OutOfMemoryError thrown from the
UncaughtExceptionHandler in thread "Driver"

LogType: stdout
LogLength: 0
Log Contents:



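Both AM attempts die the same way (the attempt-1 container log appears further
below): the "Driver" thread throws java.lang.OutOfMemoryError right after
HiveQL parsing completes, and the executor-side "Driver Disassociated" errors
further down are just fallout from the dead driver. Since yarn-cluster mode
runs the driver inside the AM, one thing to try on resubmit is extra PermGen
headroom: HiveContext pulls in a lot of Hive/DataNucleus classes, which can
exhaust PermGen on Java 7 even with an 8g heap. A hedged sketch of the
resubmit (the 512m value is a guess to experiment with, not a known-good
setting):

./bin/spark-submit -v --master yarn-cluster \
  --driver-memory 8g \
  --driver-java-options "-XX:MaxPermSize=512m" \
  ...remaining --jars/--class/application arguments unchanged...
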
Container: container_1426715280024_68434_01_000005 on
phxaishdc9dn1783.stratus.phx.ebay.com_52167
===================================================================================================
LogType: stderr
LogLength: 6576
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/hadoop/4/scratch/local/usercache/dvasthimal/filecache/13/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/24 04:35:18 INFO executor.CoarseGrainedExecutorBackend: Registered
signal handlers for [TERM, HUP, INT]
15/03/24 04:35:19 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:35:19 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:35:19 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:35:19 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:35:19 INFO Remoting: Starting remoting
15/03/24 04:35:20 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://driverPropsFetcher@phxaishdc9dn1783.stratus.phx.ebay.com:42126]
15/03/24 04:35:20 INFO util.Utils: Successfully started service
'driverPropsFetcher' on port 42126.
15/03/24 04:35:20 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Shutting down remote daemon.
15/03/24 04:35:20 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remote daemon shut down; proceeding with flushing remote transports.
15/03/24 04:35:20 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:35:20 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:35:20 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:35:20 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:35:20 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remoting shut down.
15/03/24 04:35:20 INFO Remoting: Starting remoting
15/03/24 04:35:20 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkExecutor@phxaishdc9dn1783.stratus.phx.ebay.com:38557]
15/03/24 04:35:20 INFO util.Utils: Successfully started service
'sparkExecutor' on port 38557.
15/03/24 04:35:20 INFO util.AkkaUtils: Connecting to MapOutputTracker:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/MapOutputTracker
15/03/24 04:35:20 INFO util.AkkaUtils: Connecting to BlockManagerMaster:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/BlockManagerMaster
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-8693060d-7870-42b6-97c4-e6fdaaf41772
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/2/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-8e979b79-8e14-482a-b80c-2db091c0aeed
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/4/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-2d3c23bd-19fb-4f4f-9cf9-b7a4752a54d6
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/5/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-593c8c0e-79b1-494f-931a-43fa8161d63b
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/6/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-8ea791db-b76f-4e83-8813-cf31c8dfac05
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/7/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-2119a2ba-cd92-47c5-a739-57cd385c3749
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/8/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-0b96d8a0-2e4e-4d2e-b6a6-60a1ffabd164
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/9/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-773c54de-c473-4925-bced-40f346eec009
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/10/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-97564c24-194d-4c95-bb3c-fc3becf5ae2c
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/11/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-1b7777fe-f3bb-4183-ac17-cb2d558d19eb
15/03/24 04:35:20 INFO storage.DiskBlockManager: Created local directory at
/hadoop/12/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-11a6407c-780b-4f6d-899a-a3aef441d7de
15/03/24 04:35:20 INFO storage.MemoryStore: MemoryStore started with
capacity 1060.3 MB
15/03/24 04:35:21 INFO util.AkkaUtils: Connecting to
OutputCommitCoordinator: akka.tcp://sparkDriver@SOME-HOST-NAME
:48318/user/OutputCommitCoordinator
15/03/24 04:35:21 INFO executor.CoarseGrainedExecutorBackend: Connecting to
driver: akka.tcp://sparkDriver@SOME-HOST-NAME
:48318/user/CoarseGrainedScheduler
15/03/24 04:35:21 INFO executor.CoarseGrainedExecutorBackend: Successfully
registered with driver
15/03/24 04:35:21 INFO executor.Executor: Starting executor ID 3 on host
phxaishdc9dn1783.stratus.phx.ebay.com
15/03/24 04:35:21 INFO netty.NettyBlockTransferService: Server created on
47980
15/03/24 04:35:21 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/03/24 04:35:21 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/24 04:35:21 INFO util.AkkaUtils: Connecting to HeartbeatReceiver:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/HeartbeatReceiver
15/03/24 04:36:00 ERROR executor.CoarseGrainedExecutorBackend: Driver
Disassociated [akka.tcp://
sparkExecutor@phxaishdc9dn1783.stratus.phx.ebay.com:38557] ->
[akka.tcp://sparkDriver@SOME-HOST-NAME:48318] disassociated! Shutting down.
15/03/24 04:36:00 WARN remote.ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkDriver@SOME-HOST-NAME:48318] has failed,
address is now gated for [5000] ms. Reason is: [Disassociated].

LogType: stdout
LogLength: 0
Log Contents:



Container: container_1426715280024_68434_01_000002 on SOME-HOST-NAME_37147
===================================================================================================
LogType: stderr
LogLength: 28847
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/hadoop/11/scratch/local/usercache/dvasthimal/filecache/13/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/24 04:35:05 INFO yarn.ApplicationMaster: Registered signal handlers
for [TERM, HUP, INT]
15/03/24 04:35:06 INFO yarn.ApplicationMaster: ApplicationAttemptId:
appattempt_1426715280024_68434_000001
15/03/24 04:35:07 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
15/03/24 04:35:07 WARN hdfs.BlockReaderLocal: The short-circuit local reads
feature cannot be used because libhadoop cannot be loaded.
15/03/24 04:35:07 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:35:07 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:35:07 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:35:07 INFO yarn.ApplicationMaster: Starting the user
application in a separate Thread
15/03/24 04:35:07 INFO yarn.ApplicationMaster: Waiting for spark context
initialization
15/03/24 04:35:07 INFO yarn.ApplicationMaster: Waiting for spark context
initialization ...
15/03/24 04:35:07 INFO spark.SparkContext: Running Spark version 1.3.0
15/03/24 04:35:07 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:35:07 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:35:07 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:35:08 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:35:08 INFO Remoting: Starting remoting
15/03/24 04:35:08 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkDriver@SOME-HOST-NAME:48318]
15/03/24 04:35:08 INFO util.Utils: Successfully started service
'sparkDriver' on port 48318.
15/03/24 04:35:08 INFO spark.SparkEnv: Registering MapOutputTracker
15/03/24 04:35:08 INFO spark.SparkEnv: Registering BlockManagerMaster
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-57c6b19a-3501-4787-89ce-21ba27373cae
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/2/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-8e0642c7-4077-46c1-9cc9-d716cf938d52
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/3/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-19fe5fd4-6eba-4991-8869-34e47796339b
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/4/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-a7e9261f-0352-42ce-97f3-07ec9830721f
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/5/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-350815b4-b4cd-4ad5-8c49-0c40aa58ee8f
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/6/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-fea0f052-6845-4679-a1e5-ecd7876bac9f
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/7/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-75e81a89-06e5-4508-a53f-320f3061c903
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/9/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-bf4a37e3-98fd-43fe-84c7-72f752a0d28c
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/10/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-a1f2d439-4bf0-4c58-88e7-02491727f548
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/11/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-36b222c6-a06b-4b93-b9c6-6252dee64c4c
15/03/24 04:35:08 INFO storage.DiskBlockManager: Created local directory at
/hadoop/12/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-5e68187a-9e3c-477f-8325-bfdf038a1564
15/03/24 04:35:08 INFO storage.MemoryStore: MemoryStore started with
capacity 3.8 GB
15/03/24 04:35:08 INFO spark.HttpFileServer: HTTP File server directory is
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/httpd-ae81c626-999c-42d7-8d68-7e3c6a8a37f0
15/03/24 04:35:08 INFO spark.HttpServer: Starting HTTP Server
15/03/24 04:35:08 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/24 04:35:08 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:48394
15/03/24 04:35:08 INFO util.Utils: Successfully started service 'HTTP file
server' on port 48394.
15/03/24 04:35:08 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/03/24 04:35:08 INFO ui.JettyUtils: Adding filter:
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/03/24 04:35:09 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/24 04:35:09 INFO server.AbstractConnector: Started
SelectChannelConnector@0.0.0.0:35930
15/03/24 04:35:09 INFO util.Utils: Successfully started service 'SparkUI'
on port 35930.
15/03/24 04:35:09 INFO ui.SparkUI: Started SparkUI at
http://SOME-HOST-NAME:35930
15/03/24 04:35:09 INFO cluster.YarnClusterScheduler: Created
YarnClusterScheduler
15/03/24 04:35:09 INFO netty.NettyBlockTransferService: Server created on
40829
15/03/24 04:35:09 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/03/24 04:35:09 INFO storage.BlockManagerMasterActor: Registering block
manager SOME-HOST-NAME:40829 with 3.8 GB RAM, BlockManagerId(<driver>,
SOME-HOST-NAME, 40829)
15/03/24 04:35:09 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/24 04:35:09 INFO yarn.ApplicationMaster: Listen to driver:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/YarnScheduler
15/03/24 04:35:09 INFO cluster.YarnClusterSchedulerBackend:
ApplicationMaster registered as
Actor[akka://sparkDriver/user/YarnAM#-1175613036]
15/03/24 04:35:09 INFO yarn.YarnRMClient: Registering the ApplicationMaster
15/03/24 04:35:10 INFO yarn.YarnAllocator: Will request 3 executor
containers, each with 1 cores and 2432 MB memory including 384 MB overhead
15/03/24 04:35:10 INFO yarn.YarnAllocator: Container request (host: Any,
capability: <memory:2432, vCores:1>)
15/03/24 04:35:10 INFO yarn.YarnAllocator: Container request (host: Any,
capability: <memory:2432, vCores:1>)
15/03/24 04:35:10 INFO yarn.YarnAllocator: Container request (host: Any,
capability: <memory:2432, vCores:1>)
15/03/24 04:35:10 INFO yarn.ApplicationMaster: Started progress reporter
thread - sleep time : 5000
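
(For reference, the 2432 MB container size above is the 2048 MB executor heap
plus YARN memory overhead. A sketch of how Spark 1.3 derives it, assuming the
documented default for spark.yarn.executor.memoryOverhead of 10% of executor
memory, floored at 384 MB:

    // Sketch of the YARN container request size, assuming Spark 1.3 defaults.
    val executorMemoryMb = 2048                                     // --executor-memory 2g
    val overheadMb = math.max((executorMemoryMb * 0.10).toInt, 384) // 384, since 10% < 384 MB
    val containerMb = executorMemoryMb + overheadMb                 // 2432, as requested above
)
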
15/03/24 04:35:10 INFO impl.AMRMClientImpl: Received new token for :
CONTAINER-HOST-NAME:60288
15/03/24 04:35:11 INFO impl.AMRMClientImpl: Received new token for :
phxdpehdc9dn2200.stratus.phx.ebay.com:37865
15/03/24 04:35:11 INFO impl.AMRMClientImpl: Received new token for :
phxaishdc9dn1783.stratus.phx.ebay.com:52167
15/03/24 04:35:11 INFO yarn.YarnAllocator: Launching container
container_1426715280024_68434_01_000003 for on host CONTAINER-HOST-NAME
15/03/24 04:35:11 INFO yarn.YarnAllocator: Launching ExecutorRunnable.
driverUrl: akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/CoarseGrainedScheduler,
 executorHostname: CONTAINER-HOST-NAME
15/03/24 04:35:11 INFO yarn.YarnAllocator: Launching container
container_1426715280024_68434_01_000004 for on host
phxdpehdc9dn2200.stratus.phx.ebay.com
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Starting Executor Container
15/03/24 04:35:11 INFO yarn.YarnAllocator: Launching ExecutorRunnable.
driverUrl: akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/CoarseGrainedScheduler,
 executorHostname: phxdpehdc9dn2200.stratus.phx.ebay.com
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Starting Executor Container
15/03/24 04:35:11 INFO yarn.YarnAllocator: Launching container
container_1426715280024_68434_01_000005 for on host
phxaishdc9dn1783.stratus.phx.ebay.com
15/03/24 04:35:11 INFO impl.ContainerManagementProtocolProxy:
yarn.client.max-nodemanagers-proxies : 500
15/03/24 04:35:11 INFO yarn.YarnAllocator: Launching ExecutorRunnable.
driverUrl: akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/CoarseGrainedScheduler,
 executorHostname: phxaishdc9dn1783.stratus.phx.ebay.com
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up
ContainerLaunchContext
15/03/24 04:35:11 INFO yarn.YarnAllocator: Received 3 containers from YARN,
launching executors on 3 of them.
15/03/24 04:35:11 INFO impl.ContainerManagementProtocolProxy:
yarn.client.max-nodemanagers-proxies : 500
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Starting Executor Container
15/03/24 04:35:11 INFO impl.ContainerManagementProtocolProxy:
yarn.client.max-nodemanagers-proxies : 500
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up
ContainerLaunchContext
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up
ContainerLaunchContext
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Preparing Local resources
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Preparing Local resources
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Preparing Local resources
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Prepared Local resources
Map(__app__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port:
8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar"
} size: 144485 timestamp: 1427196892676 type: FILE visibility: PRIVATE,
__spark__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port: 8020
file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar"
} size: 159319006 timestamp: 1427196892503 type: FILE visibility: PRIVATE,
datanucleus-core-3.2.10.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar"
} size: 1890075 timestamp: 1427196892807 type: FILE visibility: PRIVATE,
datanucleus-api-jdo-3.2.6.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar"
} size: 339666 timestamp: 1427196892748 type: FILE visibility: PRIVATE,
spark-avro_2.10-1.0.0.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME"
port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar"
} size: 73413 timestamp: 1427196892708 type: FILE visibility: PRIVATE,
datanucleus-rdbms-3.2.9.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar"
} size: 1809447 timestamp: 1427196892892 type: FILE visibility: PRIVATE)
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Prepared Local resources
Map(__app__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port:
8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar"
} size: 144485 timestamp: 1427196892676 type: FILE visibility: PRIVATE,
__spark__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port: 8020
file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar"
} size: 159319006 timestamp: 1427196892503 type: FILE visibility: PRIVATE,
datanucleus-core-3.2.10.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar"
} size: 1890075 timestamp: 1427196892807 type: FILE visibility: PRIVATE,
datanucleus-api-jdo-3.2.6.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar"
} size: 339666 timestamp: 1427196892748 type: FILE visibility: PRIVATE,
spark-avro_2.10-1.0.0.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME"
port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar"
} size: 73413 timestamp: 1427196892708 type: FILE visibility: PRIVATE,
datanucleus-rdbms-3.2.9.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar"
} size: 1809447 timestamp: 1427196892892 type: FILE visibility: PRIVATE)
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Prepared Local resources
Map(__app__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port:
8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar"
} size: 144485 timestamp: 1427196892676 type: FILE visibility: PRIVATE,
__spark__.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME" port: 8020
file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar"
} size: 159319006 timestamp: 1427196892503 type: FILE visibility: PRIVATE,
datanucleus-core-3.2.10.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar"
} size: 1890075 timestamp: 1427196892807 type: FILE visibility: PRIVATE,
datanucleus-api-jdo-3.2.6.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar"
} size: 339666 timestamp: 1427196892748 type: FILE visibility: PRIVATE,
spark-avro_2.10-1.0.0.jar -> resource { scheme: "hdfs" host: "NN-HOST-NAME"
port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar"
} size: 73413 timestamp: 1427196892708 type: FILE visibility: PRIVATE,
datanucleus-rdbms-3.2.9.jar -> resource { scheme: "hdfs" host:
"NN-HOST-NAME" port: 8020 file:
"/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar"
} size: 1809447 timestamp: 1427196892892 type: FILE visibility: PRIVATE)
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up executor with
environment: Map(CLASSPATH ->
{{PWD}}<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/*<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,
SPARK_LOG_URL_STDERR ->
http://phxdpehdc9dn2200.stratus.phx.ebay.com:50060/node/containerlogs/container_1426715280024_68434_01_000004/dvasthimal/stderr?start=0,
SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1426715280024_68434,
SPARK_YARN_CACHE_FILES_FILE_SIZES ->
159319006,144485,73413,339666,1890075,1809447, SPARK_USER -> dvasthimal,
SPARK_YARN_CACHE_FILES_VISIBILITIES ->
PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE, SPARK_YARN_MODE -> true,
SPARK_YARN_CACHE_FILES_TIME_STAMPS ->
1427196892503,1427196892676,1427196892708,1427196892748,1427196892807,1427196892892,
SPARK_LOG_URL_STDOUT ->
http://phxdpehdc9dn2200.stratus.phx.ebay.com:50060/node/containerlogs/container_1426715280024_68434_01_000004/dvasthimal/stdout?start=0,
SPARK_YARN_CACHE_FILES ->
hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar#__spark__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar#__app__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar#spark-avro_2.10-1.0.0.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar#datanucleus-api-jdo-3.2.6.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar#datanucleus-core-3.2.10.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar#datanucleus-rdbms-3.2.9.jar)
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up executor with
environment: Map(CLASSPATH ->
{{PWD}}<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/*<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,
SPARK_LOG_URL_STDERR ->
http://CONTAINER-HOST-NAME:50060/node/containerlogs/container_1426715280024_68434_01_000003/dvasthimal/stderr?start=0,
SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1426715280024_68434,
SPARK_YARN_CACHE_FILES_FILE_SIZES ->
159319006,144485,73413,339666,1890075,1809447, SPARK_USER -> dvasthimal,
SPARK_YARN_CACHE_FILES_VISIBILITIES ->
PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE, SPARK_YARN_MODE -> true,
SPARK_YARN_CACHE_FILES_TIME_STAMPS ->
1427196892503,1427196892676,1427196892708,1427196892748,1427196892807,1427196892892,
SPARK_LOG_URL_STDOUT ->
http://CONTAINER-HOST-NAME:50060/node/containerlogs/container_1426715280024_68434_01_000003/dvasthimal/stdout?start=0,
SPARK_YARN_CACHE_FILES ->
hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar#__spark__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar#__app__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar#spark-avro_2.10-1.0.0.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar#datanucleus-api-jdo-3.2.6.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar#datanucleus-core-3.2.10.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar#datanucleus-rdbms-3.2.9.jar)
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up executor with
commands: List({{JAVA_HOME}}/bin/java, -server,
-XX:OnOutOfMemoryError='kill %p', -Xms2048m, -Xmx2048m,
-Djava.io.tmpdir={{PWD}}/tmp, '-Dspark.driver.port=48318',
'-Dspark.ui.port=0', -Dspark.yarn.app.container.log.dir=<LOG_DIR>,
org.apache.spark.executor.CoarseGrainedExecutorBackend, --driver-url,
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/CoarseGrainedScheduler,
--executor-id, 2, --hostname, phxdpehdc9dn2200.stratus.phx.ebay.com,
--cores, 1, --app-id, application_1426715280024_68434, --user-class-path,
file:$PWD/__app__.jar, --user-class-path,
file:$PWD/spark-avro_2.10-1.0.0.jar, --user-class-path,
file:$PWD/datanucleus-api-jdo-3.2.6.jar, --user-class-path,
file:$PWD/datanucleus-core-3.2.10.jar, --user-class-path,
file:$PWD/datanucleus-rdbms-3.2.9.jar, 1>, <LOG_DIR>/stdout, 2>,
<LOG_DIR>/stderr)
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up executor with
commands: List({{JAVA_HOME}}/bin/java, -server,
-XX:OnOutOfMemoryError='kill %p', -Xms2048m, -Xmx2048m,
-Djava.io.tmpdir={{PWD}}/tmp, '-Dspark.driver.port=48318',
'-Dspark.ui.port=0', -Dspark.yarn.app.container.log.dir=<LOG_DIR>,
org.apache.spark.executor.CoarseGrainedExecutorBackend, --driver-url,
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/CoarseGrainedScheduler,
--executor-id, 1, --hostname, CONTAINER-HOST-NAME, --cores, 1, --app-id,
application_1426715280024_68434, --user-class-path, file:$PWD/__app__.jar,
--user-class-path, file:$PWD/spark-avro_2.10-1.0.0.jar, --user-class-path,
file:$PWD/datanucleus-api-jdo-3.2.6.jar, --user-class-path,
file:$PWD/datanucleus-core-3.2.10.jar, --user-class-path,
file:$PWD/datanucleus-rdbms-3.2.9.jar, 1>, <LOG_DIR>/stdout, 2>,
<LOG_DIR>/stderr)
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up executor with
environment: Map(CLASSPATH ->
{{PWD}}<CPS>{{PWD}}/__spark__.jar<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/*<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,
SPARK_LOG_URL_STDERR ->
http://phxaishdc9dn1783.stratus.phx.ebay.com:50060/node/containerlogs/container_1426715280024_68434_01_000005/dvasthimal/stderr?start=0,
SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1426715280024_68434,
SPARK_YARN_CACHE_FILES_FILE_SIZES ->
159319006,144485,73413,339666,1890075,1809447, SPARK_USER -> dvasthimal,
SPARK_YARN_CACHE_FILES_VISIBILITIES ->
PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE,PRIVATE, SPARK_YARN_MODE -> true,
SPARK_YARN_CACHE_FILES_TIME_STAMPS ->
1427196892503,1427196892676,1427196892708,1427196892748,1427196892807,1427196892892,
SPARK_LOG_URL_STDOUT ->
http://phxaishdc9dn1783.stratus.phx.ebay.com:50060/node/containerlogs/container_1426715280024_68434_01_000005/dvasthimal/stdout?start=0,
SPARK_YARN_CACHE_FILES ->
hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-assembly-1.3.0-hadoop2.4.0.jar#__spark__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark_reporting-1.0-SNAPSHOT.jar#__app__.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/spark-avro_2.10-1.0.0.jar#spark-avro_2.10-1.0.0.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-api-jdo-3.2.6.jar#datanucleus-api-jdo-3.2.6.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-core-3.2.10.jar#datanucleus-core-3.2.10.jar,hdfs://NN-HOST-NAME:8020/user/dvasthimal/.sparkStaging/application_1426715280024_68434/datanucleus-rdbms-3.2.9.jar#datanucleus-rdbms-3.2.9.jar)
15/03/24 04:35:11 INFO yarn.ExecutorRunnable: Setting up executor with
commands: List({{JAVA_HOME}}/bin/java, -server,
-XX:OnOutOfMemoryError='kill %p', -Xms2048m, -Xmx2048m,
-Djava.io.tmpdir={{PWD}}/tmp, '-Dspark.driver.port=48318',
'-Dspark.ui.port=0', -Dspark.yarn.app.container.log.dir=<LOG_DIR>,
org.apache.spark.executor.CoarseGrainedExecutorBackend, --driver-url,
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/CoarseGrainedScheduler,
--executor-id, 3, --hostname, phxaishdc9dn1783.stratus.phx.ebay.com,
--cores, 1, --app-id, application_1426715280024_68434, --user-class-path,
file:$PWD/__app__.jar, --user-class-path,
file:$PWD/spark-avro_2.10-1.0.0.jar, --user-class-path,
file:$PWD/datanucleus-api-jdo-3.2.6.jar, --user-class-path,
file:$PWD/datanucleus-core-3.2.10.jar, --user-class-path,
file:$PWD/datanucleus-rdbms-3.2.9.jar, 1>, <LOG_DIR>/stdout, 2>,
<LOG_DIR>/stderr)
15/03/24 04:35:11 INFO impl.ContainerManagementProtocolProxy: Opening proxy
: CONTAINER-HOST-NAME:60288
15/03/24 04:35:11 INFO impl.ContainerManagementProtocolProxy: Opening proxy
: phxdpehdc9dn2200.stratus.phx.ebay.com:37865
15/03/24 04:35:11 INFO impl.ContainerManagementProtocolProxy: Opening proxy
: phxaishdc9dn1783.stratus.phx.ebay.com:52167
15/03/24 04:35:20 INFO cluster.YarnClusterSchedulerBackend: Registered
executor: Actor[akka.tcp://
sparkExecutor@phxdpehdc9dn2200.stratus.phx.ebay.com:34641/user/Executor#-418876637]
with ID 2
15/03/24 04:35:20 INFO storage.BlockManagerMasterActor: Registering block
manager phxdpehdc9dn2200.stratus.phx.ebay.com:48175 with 1060.3 MB RAM,
BlockManagerId(2, phxdpehdc9dn2200.stratus.phx.ebay.com, 48175)
15/03/24 04:35:21 INFO cluster.YarnClusterSchedulerBackend: Registered
executor: Actor[akka.tcp://
sparkExecutor@phxaishdc9dn1783.stratus.phx.ebay.com:38557/user/Executor#-1725106695]
with ID 3
15/03/24 04:35:21 INFO storage.BlockManagerMasterActor: Registering block
manager phxaishdc9dn1783.stratus.phx.ebay.com:47980 with 1060.3 MB RAM,
BlockManagerId(3, phxaishdc9dn1783.stratus.phx.ebay.com, 47980)
15/03/24 04:35:24 INFO cluster.YarnClusterSchedulerBackend: Registered
executor: Actor[akka.tcp://sparkExecutor@CONTAINER-HOST-NAME:54112/user/Executor#2102882162]
with ID 1
15/03/24 04:35:24 INFO cluster.YarnClusterSchedulerBackend:
SchedulerBackend is ready for scheduling beginning after reached
minRegisteredResourcesRatio: 0.8
15/03/24 04:35:24 INFO cluster.YarnClusterScheduler:
YarnClusterScheduler.postStartHook done
15/03/24 04:35:25 INFO storage.BlockManagerMasterActor: Registering block
manager CONTAINER-HOST-NAME:59851 with 1060.3 MB RAM, BlockManagerId(1,
CONTAINER-HOST-NAME, 59851)
15/03/24 04:35:26 INFO metastore.HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/03/24 04:35:26 INFO metastore.ObjectStore: ObjectStore, initialize called
15/03/24 04:35:27 INFO DataNucleus.Persistence: Property
datanucleus.cache.level2 unknown - will be ignored
15/03/24 04:35:27 INFO DataNucleus.Persistence: Property
hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/03/24 04:35:38 INFO metastore.ObjectStore: Setting MetaStore object pin
classes with
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/03/24 04:35:38 INFO metastore.MetaStoreDirectSql: MySQL check failed,
assuming we are not on mysql: Lexical error at line 1, column 5.
Encountered: "@" (64), after : "".
15/03/24 04:35:41 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
"embedded-only" so does not have its own datastore table.
15/03/24 04:35:41 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
15/03/24 04:35:49 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
"embedded-only" so does not have its own datastore table.
15/03/24 04:35:49 INFO DataNucleus.Datastore: The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
15/03/24 04:35:51 INFO metastore.ObjectStore: Initialized ObjectStore
15/03/24 04:35:51 WARN metastore.ObjectStore: Version information not found
in metastore. hive.metastore.schema.verification is not enabled so
recording the schema version 0.13.1aa
15/03/24 04:35:52 INFO metastore.HiveMetaStore: Added admin role in
metastore
15/03/24 04:35:52 INFO metastore.HiveMetaStore: Added public role in
metastore
15/03/24 04:35:52 INFO metastore.HiveMetaStore: No user is added in admin
role, since config is empty
15/03/24 04:35:53 INFO session.SessionState: No Tez session required at
this point. hive.execution.engine=mr.
15/03/24 04:35:53 INFO session.SessionState: No Tez session required at
this point. hive.execution.engine=mr.
15/03/24 04:35:53 INFO parse.ParseDriver: Parsing command: select count(*)
from success_events.sojsuccessevents1
Exception in thread "Driver"
Exception: java.lang.OutOfMemoryError thrown from the
UncaughtExceptionHandler in thread "Driver"

LogType: stdout
LogLength: 0
Log Contents:



Container: container_1426715280024_68434_01_000004 on
phxdpehdc9dn2200.stratus.phx.ebay.com_37865
===================================================================================================
LogType: stderr
LogLength: 5949
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/hadoop/1/scratch/local/usercache/dvasthimal/filecache/13/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/24 04:35:17 INFO executor.CoarseGrainedExecutorBackend: Registered
signal handlers for [TERM, HUP, INT]
15/03/24 04:35:18 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:35:18 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:35:18 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:35:18 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:35:18 INFO Remoting: Starting remoting
15/03/24 04:35:19 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://driverPropsFetcher@phxdpehdc9dn2200.stratus.phx.ebay.com:55296]
15/03/24 04:35:19 INFO util.Utils: Successfully started service
'driverPropsFetcher' on port 55296.
15/03/24 04:35:19 INFO spark.SecurityManager: Changing view acls to:
dvasthimal
15/03/24 04:35:19 INFO spark.SecurityManager: Changing modify acls to:
dvasthimal
15/03/24 04:35:19 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dvasthimal); users with modify permissions: Set(dvasthimal)
15/03/24 04:35:19 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Shutting down remote daemon.
15/03/24 04:35:19 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remote daemon shut down; proceeding with flushing remote transports.
15/03/24 04:35:19 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/24 04:35:19 INFO remote.RemoteActorRefProvider$RemotingTerminator:
Remoting shut down.
15/03/24 04:35:19 INFO Remoting: Starting remoting
15/03/24 04:35:19 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkExecutor@phxdpehdc9dn2200.stratus.phx.ebay.com:34641]
15/03/24 04:35:19 INFO util.Utils: Successfully started service
'sparkExecutor' on port 34641.
15/03/24 04:35:19 INFO util.AkkaUtils: Connecting to MapOutputTracker:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/MapOutputTracker
15/03/24 04:35:19 INFO util.AkkaUtils: Connecting to BlockManagerMaster:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/BlockManagerMaster
15/03/24 04:35:19 INFO storage.DiskBlockManager: Created local directory at
/hadoop/1/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-9bf08ed6-420d-42ea-818f-2c53abf563dd
15/03/24 04:35:19 INFO storage.DiskBlockManager: Created local directory at
/hadoop/2/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-323086d0-bdc3-472e-88ae-d4e345b955c6
15/03/24 04:35:19 INFO storage.DiskBlockManager: Created local directory at
/hadoop/3/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-fb17fb39-4032-4ce1-8a0f-0a2596a70343
15/03/24 04:35:19 INFO storage.DiskBlockManager: Created local directory at
/hadoop/4/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-acd38dec-8ef6-4278-b510-dd6872273760
15/03/24 04:35:19 INFO storage.DiskBlockManager: Created local directory at
/hadoop/5/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-fd9b7d17-668a-44a3-bcec-b330ef14b703
15/03/24 04:35:19 INFO storage.DiskBlockManager: Created local directory at
/hadoop/6/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-e47f7957-937f-4b76-aa97-d4288f5c85c9
15/03/24 04:35:19 INFO storage.DiskBlockManager: Created local directory at
/hadoop/7/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-4cc5121c-9731-46b5-9021-7b33edbd1261
15/03/24 04:35:19 INFO storage.DiskBlockManager: Created local directory at
/hadoop/8/scratch/local/usercache/dvasthimal/appcache/application_1426715280024_68434/blockmgr-fff27bbe-11d3-4b68-a6ed-f950f47823ab
15/03/24 04:35:19 INFO storage.MemoryStore: MemoryStore started with
capacity 1060.3 MB
15/03/24 04:35:20 INFO util.AkkaUtils: Connecting to
OutputCommitCoordinator: akka.tcp://sparkDriver@SOME-HOST-NAME
:48318/user/OutputCommitCoordinator
15/03/24 04:35:20 INFO executor.CoarseGrainedExecutorBackend: Connecting to
driver: akka.tcp://sparkDriver@SOME-HOST-NAME
:48318/user/CoarseGrainedScheduler
15/03/24 04:35:20 INFO executor.CoarseGrainedExecutorBackend: Successfully
registered with driver
15/03/24 04:35:20 INFO executor.Executor: Starting executor ID 2 on host
phxdpehdc9dn2200.stratus.phx.ebay.com
15/03/24 04:35:20 INFO netty.NettyBlockTransferService: Server created on
48175
15/03/24 04:35:20 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/03/24 04:35:20 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/24 04:35:20 INFO util.AkkaUtils: Connecting to HeartbeatReceiver:
akka.tcp://sparkDriver@SOME-HOST-NAME:48318/user/HeartbeatReceiver
15/03/24 04:36:00 ERROR executor.CoarseGrainedExecutorBackend: Driver
Disassociated [akka.tcp://
sparkExecutor@phxdpehdc9dn2200.stratus.phx.ebay.com:34641] ->
[akka.tcp://sparkDriver@SOME-HOST-NAME:48318] disassociated! Shutting down.
15/03/24 04:36:00 WARN remote.ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkDriver@SOME-HOST-NAME:48318] has failed,
address is now gated for [5000] ms. Reason is: [Disassociated].

LogType: stdout
LogLength: 0
Log Contents:

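One cheap thing to rule out before chasing memory settings: the parse log
shows the query running against success_events.sojsuccessevents1, and
HiveContext resolves a dotted identifier as database.table. If that name was
registered as a temp table under the literal dotted string, an unqualified
name sidesteps the ambiguity. A minimal sketch, assuming a HiveContext hc and
the avro input path in input:

    import com.databricks.spark.avro._  // spark-avro, as on the classpath above

    val df = hc.avroFile(input)               // hc and input are assumptions here
    df.registerTempTable("sojsuccessevents1") // no '.' in the temp table name
    println(hc.sql("select count(*) from sojsuccessevents1").collect().mkString(","))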


Regards,
Deepak

-- 
Deepak