Posted to dev@kylin.apache.org by suresh m <su...@gmail.com> on 2016/11/25 10:22:55 UTC

Does Kylin not support Hive ORC tables with ACID properties?

Hi,

I am facing an issue where I can build a cube from text-format tables but
am unable to build a cube from ORC tables.

Please let me know whether Kylin has any issues with the ORC format.

Hive has the limitation that text-format tables cannot have ACID properties
enabled, since the text format does not support ACID. ACID properties are
important for handling my data, and I can get them with ORC, but Kylin
throws errors with the ORC format.
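
For reference, a transactional ORC table of the kind described here is
typically declared roughly as in the sketch below. This is only an
illustration: the table and column names are placeholders, and the cluster
also needs Hive transactions enabled (e.g. hive.support.concurrency=true and
hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager).

    -- Hypothetical example only: an ORC table with ACID (transactional)
    -- properties enabled; bucketing is required for Hive ACID tables.
    CREATE TABLE sales_fact (
      id     BIGINT,
      item   STRING,
      amount DOUBLE
    )
    CLUSTERED BY (id) INTO 4 BUCKETS
    STORED AS ORC
    TBLPROPERTIES ('transactional' = 'true');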


Regards,
Suresh

Re: Does Kylin not support Hive ORC tables with ACID properties?

Posted by ShaoFeng Shi <sh...@apache.org>.
At the beginning of the cube build Kylin creates an intermediate flat table
and then pulls data from the source tables into it. You can get the
intermediate table name from the job output, and then use the hive cli to
check whether the data in it is correct.
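
For example (a sketch only; substitute the actual intermediate table name
printed in the job output for the placeholder used below):

    hive -e "
      -- placeholder name; copy the real kylin_intermediate_* table name
      -- from the cube build job output
      DESCRIBE FORMATTED default.kylin_intermediate_my_cube_desc;
      SELECT COUNT(*) FROM default.kylin_intermediate_my_cube_desc;
      SELECT * FROM default.kylin_intermediate_my_cube_desc LIMIT 10;
    "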

2016-11-29 14:17 GMT+08:00 suresh m <su...@gmail.com>:

> Hi ShaoFeng Shi,
>
> Thanks for your reply, but my case is different. I am able to create and
> build the cube with the same data using text-formatted tables, but when I
> try to build the cube with ORC-formatted tables (same data) I face this issue.
>
> Regards,
> Suresh
>
> On Mon, Nov 28, 2016 at 7:31 PM, ShaoFeng Shi <sh...@apache.org>
> wrote:
>
> > Hi Suresh,
> >
> > Another user also ran into a similar problem, and I replied in the user@
> > group; a minute ago I forwarded it to the dev@ group. Please take a look
> > and let me know whether it is the same:
> > http://apache-kylin.74782.x6.nabble.com/Fwd-Re-org-apache-
> > kylin-dict-TrieDictionary-Not-a-valid-value-td6428.html
> >
> > 2016-11-28 19:38 GMT+08:00 suresh m <su...@gmail.com>:
> >
> > > Can someone look at the log and help me understand the exact issue I am
> > > facing with ORC-formatted tables? Why am I unable to build the cube
> > > successfully with ORC-formatted tables?
> > >
> > > On Mon, Nov 28, 2016 at 10:47 AM, suresh m <su...@gmail.com>
> > wrote:
> > >
> > > > Please find the details as requested,
> > > >
> > > > Log Type: syslog
> > > >
> > > > Log Upload Time: Fri Nov 25 15:16:35 +0530 2016
> > > >
> > > > Log Length: 107891
> > > >
> > > > 2016-11-25 15:15:35,185 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > Created MRAppMaster for application appattempt_1479580915733_0167_
> 000001
> > > > 2016-11-25 15:15:35,592 WARN [main] org.apache.hadoop.util.
> > NativeCodeLoader:
> > > Unable to load native-hadoop library for your platform... using
> > > builtin-java classes where applicable
> > > > 2016-11-25 15:15:35,630 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > Executing with tokens:
> > > > 2016-11-25 15:15:35,956 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId {
> application_id
> > {
> > > id: 167 cluster_timestamp: 1479580915733 } attemptId: 1 } keyId:
> > 2128280969)
> > > > 2016-11-25 15:15:35,974 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > Using mapred newApiCommitter.
> > > > 2016-11-25 15:15:35,976 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > OutputCommitter set in config null
> > > > 2016-11-25 15:15:36,029 INFO [main] org.apache.hadoop.mapreduce.
> > > lib.output.FileOutputCommitter: File Output Committer Algorithm
> version
> > > is 1
> > > > 2016-11-25 15:15:36,029 INFO [main] org.apache.hadoop.mapreduce.
> > > lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup
> > > _temporary folders under output directory:false, ignore cleanup
> failures:
> > > false
> > > > 2016-11-25 15:15:36,692 WARN [main] org.apache.hadoop.hdfs.
> > shortcircuit.DomainSocketFactory:
> > > The short-circuit local reads feature cannot be used because libhadoop
> > > cannot be loaded.
> > > > 2016-11-25 15:15:36,702 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > OutputCommitter is org.apache.hadoop.mapreduce.
> > > lib.output.FileOutputCommitter
> > > > 2016-11-25 15:15:36,891 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.jobhistory.EventType for
> > > class org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler
> > > > 2016-11-25 15:15:36,891 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.
> > v2.app.job.event.JobEventType
> > > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > > JobEventDispatcher
> > > > 2016-11-25 15:15:36,892 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.
> > v2.app.job.event.TaskEventType
> > > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > > TaskEventDispatcher
> > > > 2016-11-25 15:15:36,893 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.v2.app.job.event.
> > TaskAttemptEventType
> > > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > > TaskAttemptEventDispatcher
> > > > 2016-11-25 15:15:36,893 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.v2.app.commit.
> > CommitterEventType
> > > for class org.apache.hadoop.mapreduce.v2.app.commit.
> > CommitterEventHandler
> > > > 2016-11-25 15:15:36,894 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.
> > v2.app.speculate.Speculator$EventType
> > > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > > SpeculatorEventDispatcher
> > > > 2016-11-25 15:15:36,894 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.
> > > v2.app.rm.ContainerAllocator$EventType for class
> > > org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> ContainerAllocatorRouter
> > > > 2016-11-25 15:15:36,895 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.v2.app.launcher.
> > ContainerLauncher$EventType
> > > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > > ContainerLauncherRouter
> > > > 2016-11-25 15:15:36,923 INFO [main] org.apache.hadoop.mapreduce.
> > v2.jobhistory.JobHistoryUtils:
> > > Default file system is set solely by core-default.xml therefore -
> > ignoring
> > > > 2016-11-25 15:15:36,945 INFO [main] org.apache.hadoop.mapreduce.
> > v2.jobhistory.JobHistoryUtils:
> > > Default file system is set solely by core-default.xml therefore -
> > ignoring
> > > > 2016-11-25 15:15:36,967 INFO [main] org.apache.hadoop.mapreduce.
> > v2.jobhistory.JobHistoryUtils:
> > > Default file system is set solely by core-default.xml therefore -
> > ignoring
> > > > 2016-11-25 15:15:37,029 INFO [main] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Emitting job history data to the
> > > timeline server is not enabled
> > > > 2016-11-25 15:15:37,064 INFO [main] org.apache.hadoop.yarn.event.
> > AsyncDispatcher:
> > > Registering class org.apache.hadoop.mapreduce.v2.app.job.event.
> > JobFinishEvent$Type
> > > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > > JobFinishEventHandler
> > > > 2016-11-25 15:15:37,204 WARN [main] org.apache.hadoop.metrics2.
> > impl.MetricsConfig:
> > > Cannot locate configuration: tried hadoop-metrics2-mrappmaster.
> > > properties,hadoop-metrics2.properties
> > > > 2016-11-25 15:15:37,300 INFO [main] org.apache.hadoop.metrics2.
> > impl.MetricsSystemImpl:
> > > Scheduled snapshot period at 10 second(s).
> > > > 2016-11-25 15:15:37,300 INFO [main] org.apache.hadoop.metrics2.
> > impl.MetricsSystemImpl:
> > > MRAppMaster metrics system started
> > > > 2016-11-25 15:15:37,308 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.job.impl.JobImpl:
> > > Adding job token for job_1479580915733_0167 to jobTokenSecretManager
> > > > 2016-11-25 15:15:37,452 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.job.impl.JobImpl:
> > > Not uberizing job_1479580915733_0167 because: not enabled; too much
> RAM;
> > > > 2016-11-25 15:15:37,468 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.job.impl.JobImpl:
> > > Input size for job job_1479580915733_0167 = 223589. Number of splits =
> 1
> > > > 2016-11-25 15:15:37,469 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.job.impl.JobImpl:
> > > Number of reduces for job job_1479580915733_0167 = 1
> > > > 2016-11-25 15:15:37,469 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.job.impl.JobImpl:
> > > job_1479580915733_0167Job Transitioned from NEW to INITED
> > > > 2016-11-25 15:15:37,470 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > MRAppMaster launching normal, non-uberized, multi-container job
> > > job_1479580915733_0167.
> > > > 2016-11-25 15:15:37,493 INFO [main] org.apache.hadoop.ipc.
> > CallQueueManager:
> > > Using callQueue: class java.util.concurrent.LinkedBlockingQueue
> > > scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
> > > > 2016-11-25 15:15:37,506 INFO [Socket Reader #1 for port 60945]
> > > org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 60945
> > > > 2016-11-25 15:15:37,525 INFO [main] org.apache.hadoop.yarn.
> > > factories.impl.pb.RpcServerFactoryPBImpl: Adding protocol
> > > org.apache.hadoop.mapreduce.v2.api.MRClientProtocolPB to the server
> > > > 2016-11-25 15:15:37,527 INFO [IPC Server Responder]
> > > org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> > > > 2016-11-25 15:15:37,529 INFO [IPC Server listener on 60945]
> > > org.apache.hadoop.ipc.Server: IPC Server listener on 60945: starting
> > > > 2016-11-25 15:15:37,529 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.client.MRClientService:
> > > Instantiated MRClientService at hadoopclusterslic73.ad.
> > > infosys.com/10.122.97.73:60945
> > > > 2016-11-25 15:15:37,614 INFO [main] org.mortbay.log: Logging to
> > > org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> > > org.mortbay.log.Slf4jLog
> > > > 2016-11-25 15:15:37,623 INFO [main] org.apache.hadoop.security.
> > > authentication.server.AuthenticationFilter: Unable to initialize
> > > FileSignerSecretProvider, falling back to use random secrets.
> > > > 2016-11-25 15:15:37,628 WARN [main] org.apache.hadoop.http.
> > HttpRequestLog:
> > > Jetty request log can only be enabled using Log4j
> > > > 2016-11-25 15:15:37,636 INFO [main] org.apache.hadoop.http.
> > HttpServer2:
> > > Added global filter 'safety' (class=org.apache.hadoop.http.
> HttpServer2$
> > > QuotingInputFilter)
> > > > 2016-11-25 15:15:37,688 INFO [main] org.apache.hadoop.http.
> > HttpServer2:
> > > Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.
> > > server.webproxy.amfilter.AmIpFilter) to context mapreduce
> > > > 2016-11-25 15:15:37,688 INFO [main] org.apache.hadoop.http.
> > HttpServer2:
> > > Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.
> > > server.webproxy.amfilter.AmIpFilter) to context static
> > > > 2016-11-25 15:15:37,691 INFO [main] org.apache.hadoop.http.
> > HttpServer2:
> > > adding path spec: /mapreduce/*
> > > > 2016-11-25 15:15:37,692 INFO [main] org.apache.hadoop.http.
> > HttpServer2:
> > > adding path spec: /ws/*
> > > > 2016-11-25 15:15:38,181 INFO [main] org.apache.hadoop.yarn.webapp.
> > WebApps:
> > > Registered webapp guice modules
> > > > 2016-11-25 15:15:38,183 INFO [main] org.apache.hadoop.http.
> > HttpServer2:
> > > Jetty bound to port 34311
> > > > 2016-11-25 15:15:38,183 INFO [main] org.mortbay.log: jetty-6.1.26.hwx
> > > > 2016-11-25 15:15:38,263 INFO [main] org.mortbay.log: Extract
> > > jar:file:/hadoop/yarn/local/filecache/16/mapreduce.tar.gz/
> > > hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.2.5.
> > > 0.0-1245.jar!/webapps/mapreduce to /hadoop/yarn/local/usercache/
> > > hdfs/appcache/application_1479580915733_0167/container_
> > > e125_1479580915733_0167_01_000001/tmp/Jetty_0_0_0_0_
> > > 34311_mapreduce____2ncvaf/webapp
> > > > 2016-11-25 15:15:39,882 INFO [main] org.mortbay.log: Started
> > HttpServer2$
> > > SelectChannelConnectorWithSafeStartup@0.0.0.0:34311
> > > > 2016-11-25 15:15:39,882 INFO [main] org.apache.hadoop.yarn.webapp.
> > WebApps:
> > > Web app mapreduce started at 34311
> > > > 2016-11-25 15:15:39,933 INFO [main] org.apache.hadoop.ipc.
> > CallQueueManager:
> > > Using callQueue: class java.util.concurrent.LinkedBlockingQueue
> > > scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
> > > > 2016-11-25 15:15:39,936 INFO [Socket Reader #1 for port 57220]
> > > org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 57220
> > > > 2016-11-25 15:15:39,943 INFO [IPC Server listener on 57220]
> > > org.apache.hadoop.ipc.Server: IPC Server listener on 57220: starting
> > > > 2016-11-25 15:15:39,953 INFO [IPC Server Responder]
> > > org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> > > > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerRequestor: nodeBlacklistingEnabled:true
> > > > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerRequestor: maxTaskFailuresPerNode is 3
> > > > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerRequestor: blacklistDisablePercent is 33
> > > > 2016-11-25 15:15:40,082 WARN [main] org.apache.hadoop.ipc.Client:
> > Failed
> > > to connect to server: hadoopclusterslic71.ad.
> > infosys.com/10.122.97.71:8030:
> > > retries get failed due to exceeded maximum allowed retries number: 0
> > > > java.net.ConnectException: Connection refused
> > > >       at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> > > >       at sun.nio.ch.SocketChannelImpl.finishConnect(
> > > SocketChannelImpl.java:717)
> > > >       at org.apache.hadoop.net.SocketIOWithTimeout.connect(
> > > SocketIOWithTimeout.java:206)
> > > >       at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
> > > >       at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
> > > >       at org.apache.hadoop.ipc.Client$Connection.setupConnection(
> > > Client.java:650)
> > > >       at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(
> > > Client.java:745)
> > > >       at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.
> > > java:397)
> > > >       at org.apache.hadoop.ipc.Client.getConnection(Client.java:
> 1618)
> > > >       at org.apache.hadoop.ipc.Client.call(Client.java:1449)
> > > >       at org.apache.hadoop.ipc.Client.call(Client.java:1396)
> > > >       at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.
> > > invoke(ProtobufRpcEngine.java:233)
> > > >       at com.sun.proxy.$Proxy80.registerApplicationMaster(Unknown
> > > Source)
> > > >       at org.apache.hadoop.yarn.api.impl.pb.client.
> > > ApplicationMasterProtocolPBClientImpl.registerApplicationMaster(
> > > ApplicationMasterProtocolPBClientImpl.java:106)
> > > >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >       at sun.reflect.NativeMethodAccessorImpl.invoke(
> > > NativeMethodAccessorImpl.java:62)
> > > >       at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> > > DelegatingMethodAccessorImpl.java:43)
> > > >       at java.lang.reflect.Method.invoke(Method.java:497)
> > > >       at org.apache.hadoop.io.retry.RetryInvocationHandler.
> > invokeMethod(
> > > RetryInvocationHandler.java:278)
> > > >       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
> > > RetryInvocationHandler.java:194)
> > > >       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
> > > RetryInvocationHandler.java:176)
> > > >       at com.sun.proxy.$Proxy81.registerApplicationMaster(Unknown
> > > Source)
> > > >       at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.
> > > register(RMCommunicator.java:160)
> > > >       at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.
> > > serviceStart(RMCommunicator.java:121)
> > > >       at org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator.
> > > serviceStart(RMContainerAllocator.java:250)
> > > >       at org.apache.hadoop.service.AbstractService.start(
> > > AbstractService.java:193)
> > > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > > ContainerAllocatorRouter.serviceStart(MRAppMaster.java:881)
> > > >       at org.apache.hadoop.service.AbstractService.start(
> > > AbstractService.java:193)
> > > >       at org.apache.hadoop.service.CompositeService.serviceStart(
> > > CompositeService.java:120)
> > > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.
> > > serviceStart(MRAppMaster.java:1151)
> > > >       at org.apache.hadoop.service.AbstractService.start(
> > > AbstractService.java:193)
> > > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$5.run(
> > > MRAppMaster.java:1557)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.
> > > initAndStartAppMaster(MRAppMaster.java:1553)
> > > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(
> > > MRAppMaster.java:1486)
> > > > 2016-11-25 15:15:40,089 INFO [main] org.apache.hadoop.yarn.client.
> > > ConfiguredRMFailoverProxyProvider: Failing over to rm2
> > > > 2016-11-25 15:15:40,183 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMCommunicator:
> > > maxContainerCapability: <memory:28672, vCores:3>
> > > > 2016-11-25 15:15:40,183 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMCommunicator:
> > > queue: default
> > > > 2016-11-25 15:15:40,186 INFO [main] org.apache.hadoop.mapreduce.
> > > v2.app.launcher.ContainerLauncherImpl: Upper limit on the thread pool
> > > size is 500
> > > > 2016-11-25 15:15:40,186 INFO [main] org.apache.hadoop.mapreduce.
> > > v2.app.launcher.ContainerLauncherImpl: The thread pool initial size is
> > 10
> > > > 2016-11-25 15:15:40,189 INFO [main] org.apache.hadoop.yarn.client.
> > > api.impl.ContainerManagementProtocolProxy: yarn.client.max-cached-
> > nodemanagers-proxies
> > > : 0
> > > > 2016-11-25 15:15:40,202 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > > job_1479580915733_0167Job Transitioned from INITED to SETUP
> > > > 2016-11-25 15:15:40,212 INFO [CommitterEvent Processor #0]
> > > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > > Processing the event EventType: JOB_SETUP
> > > > 2016-11-25 15:15:40,226 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > > job_1479580915733_0167Job Transitioned from SETUP to RUNNING
> > > > 2016-11-25 15:15:40,291 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved
> > hadoopclusterslic72.ad.
> > > infosys.com to /default-rack
> > > > 2016-11-25 15:15:40,328 INFO [eventHandlingThread]
> > > org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Event
> > > Writer setup for JobId: job_1479580915733_0167, File:
> > > hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_
> > > 0167/job_1479580915733_0167_1.jhist
> > > > 2016-11-25 15:15:40,351 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved
> > hadoopclusterslic73.ad.
> > > infosys.com to /default-rack
> > > > 2016-11-25 15:15:40,357 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > > task_1479580915733_0167_m_000000 Task Transitioned from NEW to
> SCHEDULED
> > > > 2016-11-25 15:15:40,358 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > > task_1479580915733_0167_r_000000 Task Transitioned from NEW to
> SCHEDULED
> > > > 2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> NEW
> > > to UNASSIGNED
> > > > 2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from
> NEW
> > > to UNASSIGNED
> > > > 2016-11-25 15:15:40,401 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerAllocator: mapResourceRequest:<memory:3072,
> > vCores:1>
> > > > 2016-11-25 15:15:40,416 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerAllocator: reduceResourceRequest:<memory:4096,
> > > vCores:1>
> > > > 2016-11-25 15:15:41,191 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> > > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
> AssignedMaps:0
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:0 ContRel:0
> > > HostLocal:0 RackLocal:0
> > > > 2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > > getResources() for application_1479580915733_0167: ask=4 release= 0
> > > newContainers=0 finishedContainers=0 resourcelimit=<memory:38912,
> > vCores:1>
> > > knownNMs=2
> > > > 2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> > Recalculating
> > > schedule, headroom=<memory:38912, vCores:1>
> > > > 2016-11-25 15:15:41,226 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
> slow
> > > start threshold not met. completedMapsForReduceSlowstart 1
> > > > 2016-11-25 15:15:42,235 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
> > allocated
> > > containers 1
> > > > 2016-11-25 15:15:42,237 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > > container container_e125_1479580915733_0167_01_000002 to
> > > attempt_1479580915733_0167_m_000000_0
> > > > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> > Recalculating
> > > schedule, headroom=<memory:34816, vCores:0>
> > > > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
> slow
> > > start threshold not met. completedMapsForReduceSlowstart 1
> > > > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > > Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0
> AssignedMaps:1
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:15:42,286 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved
> > hadoopclusterslic73.ad.
> > > infosys.com to /default-rack
> > > > 2016-11-25 15:15:42,311 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The
> job-jar
> > > file on the remote FS is hdfs://SLICHDP/user/hdfs/.
> > > staging/job_1479580915733_0167/job.jar
> > > > 2016-11-25 15:15:42,315 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The
> > job-conf
> > > file on the remote FS is /user/hdfs/.staging/job_
> > > 1479580915733_0167/job.xml
> > > > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Adding #0
> > > tokens and #1 secret keys for NM use for launching container
> > > > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Size of
> > > containertokens_dob is 1
> > > > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Putting
> > > shuffle token in serviceData
> > > > 2016-11-25 15:15:42,441 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > > UNASSIGNED to ASSIGNED
> > > > 2016-11-25 15:15:42,455 INFO [ContainerLauncher #0]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> > > container_e125_1479580915733_0167_01_000002 taskAttempt
> > > attempt_1479580915733_0167_m_000000_0
> > > > 2016-11-25 15:15:42,457 INFO [ContainerLauncher #0]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Launching attempt_1479580915733_0167_m_000000_0
> > > > 2016-11-25 15:15:42,457 INFO [ContainerLauncher #0]
> > > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolPro
> xy:
> > > Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
> > > > 2016-11-25 15:15:42,531 INFO [ContainerLauncher #0]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Shuffle port returned by ContainerManager for
> > attempt_1479580915733_0167_m_000000_0
> > > : 13562
> > > > 2016-11-25 15:15:42,533 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > TaskAttempt:
> > > [attempt_1479580915733_0167_m_000000_0] using containerId:
> > > [container_e125_1479580915733_0167_01_000002 on NM: [
> > > hadoopclusterslic73.ad.infosys.com:45454]
> > > > 2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > > ASSIGNED to RUNNING
> > > > 2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > > task_1479580915733_0167_m_000000 Task Transitioned from SCHEDULED to
> > > RUNNING
> > > > 2016-11-25 15:15:43,241 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > > getResources() for application_1479580915733_0167: ask=4 release= 0
> > > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> > vCores:0>
> > > knownNMs=2
> > > > 2016-11-25 15:15:44,790 INFO [Socket Reader #1 for port 57220]
> > > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> > > job_1479580915733_0167 (auth:SIMPLE)
> > > > 2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> > > jvm_1479580915733_0167_m_137438953472002 asked for a task
> > > > 2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> > > jvm_1479580915733_0167_m_137438953472002 given task:
> > > attempt_1479580915733_0167_m_000000_0
> > > > 2016-11-25 15:15:51,923 INFO [IPC Server handler 12 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> > TaskAttempt
> > > attempt_1479580915733_0167_m_000000_0 is : 0.667
> > > > 2016-11-25 15:15:52,099 INFO [IPC Server handler 5 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> > TaskAttempt
> > > attempt_1479580915733_0167_m_000000_0 is : 0.667
> > > > 2016-11-25 15:15:52,137 ERROR [IPC Server handler 12 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> > > attempt_1479580915733_0167_m_000000_0 - exited : java.io.IOException:
> > > Failed to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:15:52,138 INFO [IPC Server handler 12 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
> > from
> > > attempt_1479580915733_0167_m_000000_0: Error: java.io.IOException:
> > Failed
> > > to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:15:52,141 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> Diagnostics
> > > report from attempt_1479580915733_0167_m_000000_0: Error:
> > > java.io.IOException: Failed to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:15:52,142 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > > RUNNING to FAIL_CONTAINER_CLEANUP
> > > > 2016-11-25 15:15:52,155 INFO [ContainerLauncher #1]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> > > container_e125_1479580915733_0167_01_000002 taskAttempt
> > > attempt_1479580915733_0167_m_000000_0
> > > > 2016-11-25 15:15:52,156 INFO [ContainerLauncher #1]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > KILLING attempt_1479580915733_0167_m_000000_0
> > > > 2016-11-25 15:15:52,156 INFO [ContainerLauncher #1]
> > > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolPro
> xy:
> > > Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
> > > > 2016-11-25 15:15:52,195 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > > FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > > > 2016-11-25 15:15:52,204 INFO [CommitterEvent Processor #1]
> > > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > > Processing the event EventType: TASK_ABORT
> > > > 2016-11-25 15:15:52,215 WARN [CommitterEvent Processor #1]
> > > org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> > > delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-
> > > 4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_
> > temporary/attempt_
> > > 1479580915733_0167_m_000000_0
> > > > 2016-11-25 15:15:52,218 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > > FAIL_TASK_CLEANUP to FAILED
> > > > 2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved
> > hadoopclusterslic72.ad.
> > > infosys.com to /default-rack
> > > > 2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved
> > hadoopclusterslic73.ad.
> > > infosys.com to /default-rack
> > > > 2016-11-25 15:15:52,226 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> NEW
> > > to UNASSIGNED
> > > > 2016-11-25 15:15:52,226 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerRequestor: 1 failures on node
> > hadoopclusterslic73.ad.
> > > infosys.com
> > > > 2016-11-25 15:15:52,230 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_
> > 000000_1
> > > to list of failed maps
> > > > 2016-11-25 15:15:52,291 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> > > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
> AssignedMaps:1
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > > getResources() for application_1479580915733_0167: ask=1 release= 0
> > > newContainers=0 finishedContainers=1 resourcelimit=<memory:38912,
> > vCores:1>
> > > knownNMs=2
> > > > 2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> > > completed container container_e125_1479580915733_0167_01_000002
> > > > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> > Recalculating
> > > schedule, headroom=<memory:38912, vCores:1>
> > > > 2016-11-25 15:15:52,300 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> Diagnostics
> > > report from attempt_1479580915733_0167_m_000000_0: Container killed by
> > > the ApplicationMaster.
> > > > Container killed on request. Exit code is 143
> > > > Container exited with a non-zero exit code 143
> > > >
> > > > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
> slow
> > > start threshold not met. completedMapsForReduceSlowstart 1
> > > > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
> AssignedMaps:0
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
> > allocated
> > > containers 1
> > > > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> > > container Container: [ContainerId: container_e125_1479580915733_
> > 0167_01_000003,
> > > NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> > > hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> > > vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> > > 10.122.97.72:45454 }, ] to fast fail map
> > > > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > from
> > > earlierFailedMaps
> > > > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > > container container_e125_1479580915733_0167_01_000003 to
> > > attempt_1479580915733_0167_m_000000_1
> > > > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> > Recalculating
> > > schedule, headroom=<memory:34816, vCores:0>
> > > > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
> slow
> > > start threshold not met. completedMapsForReduceSlowstart 1
> > > > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > > Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0
> AssignedMaps:1
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved
> > hadoopclusterslic72.ad.
> > > infosys.com to /default-rack
> > > > 2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > > UNASSIGNED to ASSIGNED
> > > > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> > > container_e125_1479580915733_0167_01_000003 taskAttempt
> > > attempt_1479580915733_0167_m_000000_1
> > > > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Launching attempt_1479580915733_0167_m_000000_1
> > > > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> > > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolPro
> xy:
> > > Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > > 2016-11-25 15:15:53,318 INFO [ContainerLauncher #2]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Shuffle port returned by ContainerManager for
> > attempt_1479580915733_0167_m_000000_1
> > > : 13562
> > > > 2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > TaskAttempt:
> > > [attempt_1479580915733_0167_m_000000_1] using containerId:
> > > [container_e125_1479580915733_0167_01_000003 on NM: [
> > > hadoopclusterslic72.ad.infosys.com:45454]
> > > > 2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > > ASSIGNED to RUNNING
> > > > 2016-11-25 15:15:54,309 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > > getResources() for application_1479580915733_0167: ask=1 release= 0
> > > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> > vCores:0>
> > > knownNMs=2
> > > > 2016-11-25 15:15:55,797 INFO [Socket Reader #1 for port 57220]
> > > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> > > job_1479580915733_0167 (auth:SIMPLE)
> > > > 2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> > > jvm_1479580915733_0167_m_137438953472003 asked for a task
> > > > 2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> > > jvm_1479580915733_0167_m_137438953472003 given task:
> > > attempt_1479580915733_0167_m_000000_1
> > > > 2016-11-25 15:16:02,857 INFO [IPC Server handler 8 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> > TaskAttempt
> > > attempt_1479580915733_0167_m_000000_1 is : 0.667
> > > > 2016-11-25 15:16:03,332 INFO [IPC Server handler 13 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> > TaskAttempt
> > > attempt_1479580915733_0167_m_000000_1 is : 0.667
> > > > 2016-11-25 15:16:03,347 ERROR [IPC Server handler 8 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> > > attempt_1479580915733_0167_m_000000_1 - exited : java.io.IOException:
> > > Failed to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:03,347 INFO [IPC Server handler 8 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
> > from
> > > attempt_1479580915733_0167_m_000000_1: Error: java.io.IOException:
> > Failed
> > > to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:03,349 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> Diagnostics
> > > report from attempt_1479580915733_0167_m_000000_1: Error:
> > > java.io.IOException: Failed to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:03,350 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > > RUNNING to FAIL_CONTAINER_CLEANUP
> > > > 2016-11-25 15:16:03,351 INFO [ContainerLauncher #3]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> > > container_e125_1479580915733_0167_01_000003 taskAttempt
> > > attempt_1479580915733_0167_m_000000_1
> > > > 2016-11-25 15:16:03,355 INFO [ContainerLauncher #3]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > KILLING attempt_1479580915733_0167_m_000000_1
> > > > 2016-11-25 15:16:03,355 INFO [ContainerLauncher #3]
> > > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > > 2016-11-25 15:16:03,369 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > > FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > > > 2016-11-25 15:16:03,369 INFO [CommitterEvent Processor #2]
> > > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > > Processing the event EventType: TASK_ABORT
> > > > 2016-11-25 15:16:03,375 WARN [CommitterEvent Processor #2]
> > > org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> > > delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_1
> > > > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > > FAIL_TASK_CLEANUP to FAILED
> > > > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> > > > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.infosys.com to /default-rack
> > > > 2016-11-25 15:16:03,376 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerRequestor: 1 failures on node hadoopclusterslic72.ad.infosys.com
> > > > 2016-11-25 15:16:03,376 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from NEW to UNASSIGNED
> > > > 2016-11-25 15:16:03,380 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_000000_2 to list of failed maps
> > > > 2016-11-25 15:16:04,341 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> > > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
> AssignedMaps:1
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > > getResources() for application_1479580915733_0167: ask=1 release= 0
> > > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> > vCores:0>
> > > knownNMs=2
> > > > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> > Recalculating
> > > schedule, headroom=<memory:34816, vCores:0>
> > > > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
> slow
> > > start threshold not met. completedMapsForReduceSlowstart 1
> > > > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> > > completed container container_e125_1479580915733_0167_01_000003
> > > > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
> > allocated
> > > containers 1
> > > > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> > > container Container: [ContainerId: container_e125_1479580915733_0167_01_000003,
> > > NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> > > hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> > > vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> > > 10.122.97.72:45454 }, ] to fast fail map
> > > > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > from
> > > earlierFailedMaps
> > > > 2016-11-25 15:16:05,352 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> Diagnostics
> > > report from attempt_1479580915733_0167_m_000000_1: Container killed by
> > > the ApplicationMaster.
> > > > Container killed on request. Exit code is 143
> > > > Container exited with a non-zero exit code 143
> > > >
> > > > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > > container container_e125_1479580915733_0167_01_000004 to
> > > attempt_1479580915733_0167_m_000000_2
> > > > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> > Recalculating
> > > schedule, headroom=<memory:34816, vCores:0>
> > > > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
> slow
> > > start threshold not met. completedMapsForReduceSlowstart 1
> > > > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > > Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0
> AssignedMaps:1
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:16:05,353 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> > > > 2016-11-25 15:16:05,354 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > > UNASSIGNED to ASSIGNED
> > > > 2016-11-25 15:16:05,356 INFO [ContainerLauncher #4]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> > > container_e125_1479580915733_0167_01_000004 taskAttempt
> > > attempt_1479580915733_0167_m_000000_2
> > > > 2016-11-25 15:16:05,356 INFO [ContainerLauncher #4]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Launching attempt_1479580915733_0167_m_000000_2
> > > > 2016-11-25 15:16:05,357 INFO [ContainerLauncher #4]
> > > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > > 2016-11-25 15:16:05,371 INFO [ContainerLauncher #4]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_2 : 13562
> > > > 2016-11-25 15:16:05,371 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > TaskAttempt:
> > > [attempt_1479580915733_0167_m_000000_2] using containerId:
> > > [container_e125_1479580915733_0167_01_000004 on NM: [
> > > hadoopclusterslic72.ad.infosys.com:45454]
> > > > 2016-11-25 15:16:05,372 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > > ASSIGNED to RUNNING
> > > > 2016-11-25 15:16:06,362 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > > getResources() for application_1479580915733_0167: ask=1 release= 0
> > > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> > vCores:0>
> > > knownNMs=2
> > > > 2016-11-25 15:16:07,537 INFO [Socket Reader #1 for port 57220]
> > > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> > > job_1479580915733_0167 (auth:SIMPLE)
> > > > 2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> > > jvm_1479580915733_0167_m_137438953472004 asked for a task
> > > > 2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> > > jvm_1479580915733_0167_m_137438953472004 given task:
> > > attempt_1479580915733_0167_m_000000_2
> > > > 2016-11-25 15:16:14,753 INFO [IPC Server handler 6 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> > TaskAttempt
> > > attempt_1479580915733_0167_m_000000_2 is : 0.667
> > > > 2016-11-25 15:16:15,241 INFO [IPC Server handler 13 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> > TaskAttempt
> > > attempt_1479580915733_0167_m_000000_2 is : 0.667
> > > > 2016-11-25 15:16:15,258 ERROR [IPC Server handler 8 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> > > attempt_1479580915733_0167_m_000000_2 - exited : java.io.IOException:
> > > Failed to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:15,258 INFO [IPC Server handler 8 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
> > from
> > > attempt_1479580915733_0167_m_000000_2: Error: java.io.IOException:
> > Failed
> > > to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:15,261 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> Diagnostics
> > > report from attempt_1479580915733_0167_m_000000_2: Error:
> > > java.io.IOException: Failed to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:15,273 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > > RUNNING to FAIL_CONTAINER_CLEANUP
> > > > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> > > container_e125_1479580915733_0167_01_000004 taskAttempt
> > > attempt_1479580915733_0167_m_000000_2
> > > > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > KILLING attempt_1479580915733_0167_m_000000_2
> > > > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> > > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > > 2016-11-25 15:16:15,289 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > > FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > > > 2016-11-25 15:16:15,292 INFO [CommitterEvent Processor #3]
> > > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > > Processing the event EventType: TASK_ABORT
> > > > 2016-11-25 15:16:15,300 WARN [CommitterEvent Processor #3]
> > > org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> > > delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_2
> > > > 2016-11-25 15:16:15,300 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > > FAIL_TASK_CLEANUP to FAILED
> > > > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> > > > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.infosys.com to /default-rack
> > > > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from NEW to UNASSIGNED
> > > > 2016-11-25 15:16:15,301 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerRequestor: 2 failures on node hadoopclusterslic72.ad.infosys.com
> > > > 2016-11-25 15:16:15,307 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_000000_3 to list of failed maps
> > > > 2016-11-25 15:16:15,412 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> > > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
> AssignedMaps:1
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > > getResources() for application_1479580915733_0167: ask=1 release= 0
> > > newContainers=0 finishedContainers=1 resourcelimit=<memory:38912,
> > vCores:1>
> > > knownNMs=2
> > > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> > > completed container container_e125_1479580915733_0167_01_000004
> > > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> > Recalculating
> > > schedule, headroom=<memory:38912, vCores:1>
> > > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
> slow
> > > start threshold not met. completedMapsForReduceSlowstart 1
> > > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
> AssignedMaps:0
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:16:15,421 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> Diagnostics
> > > report from attempt_1479580915733_0167_m_000000_2: Container killed by
> > > the ApplicationMaster.
> > > > Container killed on request. Exit code is 143
> > > > Container exited with a non-zero exit code 143
> > > >
> > > > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
> > allocated
> > > containers 1
> > > > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> > > container Container: [ContainerId: container_e125_1479580915733_0167_01_000005,
> > > NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> > > hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> > > vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> > > 10.122.97.72:45454 }, ] to fast fail map
> > > > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > from
> > > earlierFailedMaps
> > > > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > > container container_e125_1479580915733_0167_01_000005 to
> > > attempt_1479580915733_0167_m_000000_3
> > > > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> > Recalculating
> > > schedule, headroom=<memory:34816, vCores:0>
> > > > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
> slow
> > > start threshold not met. completedMapsForReduceSlowstart 1
> > > > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > > Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0
> AssignedMaps:1
> > > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0
> > > HostLocal:1 RackLocal:0
> > > > 2016-11-25 15:16:16,433 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> > > > 2016-11-25 15:16:16,434 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > > UNASSIGNED to ASSIGNED
> > > > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> > > container_e125_1479580915733_0167_01_000005 taskAttempt
> > > attempt_1479580915733_0167_m_000000_3
> > > > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Launching attempt_1479580915733_0167_m_000000_3
> > > > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> > > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > > 2016-11-25 15:16:16,516 INFO [ContainerLauncher #6]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_3 : 13562
> > > > 2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > TaskAttempt:
> > > [attempt_1479580915733_0167_m_000000_3] using containerId:
> > > [container_e125_1479580915733_0167_01_000005 on NM: [
> > > hadoopclusterslic72.ad.infosys.com:45454]
> > > > 2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > > ASSIGNED to RUNNING
> > > > 2016-11-25 15:16:17,436 INFO [RMCommunicator Allocator]
> > > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > > getResources() for application_1479580915733_0167: ask=1 release= 0
> > > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> > vCores:0>
> > > knownNMs=2
> > > > 2016-11-25 15:16:19,664 INFO [Socket Reader #1 for port 57220]
> > > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> > > job_1479580915733_0167 (auth:SIMPLE)
> > > > 2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> > > jvm_1479580915733_0167_m_137438953472005 asked for a task
> > > > 2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> > > jvm_1479580915733_0167_m_137438953472005 given task:
> > > attempt_1479580915733_0167_m_000000_3
> > > > 2016-11-25 15:16:27,222 INFO [IPC Server handler 13 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> > TaskAttempt
> > > attempt_1479580915733_0167_m_000000_3 is : 0.667
> > > > 2016-11-25 15:16:27,952 INFO [IPC Server handler 7 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> > TaskAttempt
> > > attempt_1479580915733_0167_m_000000_3 is : 0.667
> > > > 2016-11-25 15:16:27,971 ERROR [IPC Server handler 11 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> > > attempt_1479580915733_0167_m_000000_3 - exited : java.io.IOException:
> > > Failed to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:27,971 INFO [IPC Server handler 11 on 57220]
> > > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
> > from
> > > attempt_1479580915733_0167_m_000000_3: Error: java.io.IOException:
> > Failed
> > > to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:27,974 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> Diagnostics
> > > report from attempt_1479580915733_0167_m_000000_3: Error:
> > > java.io.IOException: Failed to build cube in mapper 0
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:145)
> > > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> > java:787)
> > > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
> java:168)
> > > >       at java.security.AccessController.doPrivileged(Native Method)
> > > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > > UserGroupInformation.java:1724)
> > > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > > Caused by: java.util.concurrent.ExecutionException:
> > > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > > java.lang.IllegalArgumentException: Value not exists!
> > > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > > cleanup(InMemCuboidMapper.java:143)
> > > >       ... 8 more
> > > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > > java.io.IOException: java.lang.IllegalArgumentException: Value not
> > exists!
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:82)
> > > >       at java.util.concurrent.Executors$RunnableAdapter.
> > > call(Executors.java:511)
> > > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > > ThreadPoolExecutor.java:1142)
> > > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > > ThreadPoolExecutor.java:617)
> > > >       at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> > IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:126)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > > build(DoggedCubeBuilder.java:73)
> > > >       at org.apache.kylin.cube.inmemcubing.
> AbstractInMemCubeBuilder$1.
> > > run(AbstractInMemCubeBuilder.java:80)
> > > >       ... 5 more
> > > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > > Value not exists!
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > BuildOnce.build(DoggedCubeBuilder.java:114)
> > > >       ... 7 more
> > > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > > >       at org.apache.kylin.common.util.Dictionary.
> getIdFromValueBytes(
> > > Dictionary.java:162)
> > > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > > TrieDictionary.java:167)
> > > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > > Dictionary.java:98)
> > > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:121)
> > > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > > encodeColumnValue(CubeCodeSystem.java:110)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:93)
> > > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> > java:81)
> > > >       at org.apache.kylin.cube.inmemcubing.
> > > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > > .java:74)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:542)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > > InputConverter$1.next(InMemCubeBuilder.java:523)
> > > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > > GTAggregateScanner.java:139)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > createBaseCuboid(InMemCubeBuilder.java:339)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:166)
> > > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > > build(InMemCubeBuilder.java:135)
> > > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > > SplitThread.run(DoggedCubeBuilder.java:282)
> > > >
> > > > 2016-11-25 15:16:27,975 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > > RUNNING to FAIL_CONTAINER_CLEANUP
> > > > 2016-11-25 15:16:27,976 INFO [ContainerLauncher #7]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> > > container_e125_1479580915733_0167_01_000005 taskAttempt
> > > attempt_1479580915733_0167_m_000000_3
> > > > 2016-11-25 15:16:27,997 INFO [ContainerLauncher #7]
> > > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > > KILLING attempt_1479580915733_0167_m_000000_3
> > > > 2016-11-25 15:16:27,997 INFO [ContainerLauncher #7]
> > > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolPro
> xy:
> > > Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > > 2016-11-25 15:16:28,009 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > > FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > > > 2016-11-25 15:16:28,011 INFO [CommitterEvent Processor #4]
> > > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > > Processing the event EventType: TASK_ABORT
> > > > 2016-11-25 15:16:28,013 WARN [CommitterEvent Processor #4]
> > > org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> > > delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-
> > > 4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_
> > temporary/attempt_
> > > 1479580915733_0167_m_000000_3
> > > > 2016-11-25 15:16:28,014 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > > FAIL_TASK_CLEANUP to FAILED
> > > > 2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > > task_1479580915733_0167_m_000000 Task Transitioned from RUNNING to
> > FAILED
> > > > 2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Num completed
> > Tasks:
> > > 1
> > > > 2016-11-25 15:16:28,027 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Job failed as
> tasks
> > > failed. failedMaps:1 failedReduces:0
> > > > 2016-11-25 15:16:28,027 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerRequestor: 3 failures on node
> > hadoopclusterslic72.ad.
> > > infosys.com
> > > > 2016-11-25 15:16:28,027 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerRequestor: Blacklisted host
> hadoopclusterslic72.ad.
> > > infosys.com
> > > > 2016-11-25 15:16:28,032 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > > job_1479580915733_0167Job Transitioned from RUNNING to FAIL_WAIT
> > > > 2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > > task_1479580915733_0167_r_000000 Task Transitioned from SCHEDULED to
> > > KILL_WAIT
> > > > 2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > > attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from
> > > UNASSIGNED to KILLED
> > > > 2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > > task_1479580915733_0167_r_000000 Task Transitioned from KILL_WAIT to
> > > KILLED
> > > > 2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > > job_1479580915733_0167Job Transitioned from FAIL_WAIT to FAIL_ABORT
> > > > 2016-11-25 15:16:28,037 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerAllocator: Processing the event EventType:
> > > CONTAINER_DEALLOCATE
> > > > 2016-11-25 15:16:28,037 ERROR [Thread-53]
> org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerAllocator: Could not deallocate container for
> task
> > > attemptId attempt_1479580915733_0167_r_000000_0
> > > > 2016-11-25 15:16:28,043 INFO [CommitterEvent Processor #0]
> > > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > > Processing the event EventType: JOB_ABORT
> > > > 2016-11-25 15:16:28,058 INFO [AsyncDispatcher event handler]
> > > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > > job_1479580915733_0167Job Transitioned from FAIL_ABORT to FAILED
> > > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > We are finishing cleanly so this is the last retry
> > > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > Notify RMCommunicator isAMLastRetry: true
> > > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMCommunicator:
> > > RMCommunicator notified that shouldUnregistered is: true
> > > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > Notify JHEH isAMLastRetry: true
> > > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: JobHistoryEventHandler notified
> that
> > > forceJobCompletion is true
> > > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > Calling stop for all the services
> > > > 2016-11-25 15:16:28,093 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Stopping JobHistoryEventHandler.
> Size
> > > of the outstanding queue size is 2
> > > > 2016-11-25 15:16:28,097 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: In stop, writing event TASK_FAILED
> > > > 2016-11-25 15:16:28,099 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: In stop, writing event JOB_FAILED
> > > > 2016-11-25 15:16:28,177 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Copying
> > hdfs://SLICHDP:8020/user/hdfs/
> > > .staging/job_1479580915733_0167/job_1479580915733_0167_1.jhist to
> > > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> > > 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> > > 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
> > > > 2016-11-25 15:16:28,248 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Copied to done location:
> > > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> > > 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> > > 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
> > > > 2016-11-25 15:16:28,253 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Copying
> > hdfs://SLICHDP:8020/user/hdfs/
> > > .staging/job_1479580915733_0167/job_1479580915733_0167_1_conf.xml to
> > > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> 1479580915733_0167_conf.xml_
> > > tmp
> > > > 2016-11-25 15:16:28,320 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Copied to done location:
> > > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> 1479580915733_0167_conf.xml_
> > > tmp
> > > > 2016-11-25 15:16:28,338 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Moved tmp to done:
> > > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> > 1479580915733_0167.summary_tmp
> > > to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> > 1479580915733_0167.summary
> > > > 2016-11-25 15:16:28,350 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Moved tmp to done:
> > > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> 1479580915733_0167_conf.xml_
> > tmp
> > > to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> > 1479580915733_0167_conf.xml
> > > > 2016-11-25 15:16:28,353 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Moved tmp to done:
> > > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> > > 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> > > 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp to
> > > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> > > 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> > > 1480067188027-0-0-FAILED-default-1480067140199.jhist
> > > > 2016-11-25 15:16:28,353 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > jobhistory.JobHistoryEventHandler: Stopped JobHistoryEventHandler.
> > > super.stop()
> > > > 2016-11-25 15:16:28,357 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMCommunicator:
> > > Setting job diagnostics to Task failed task_1479580915733_0167_m_
> 000000
> > > > Job failed as tasks failed. failedMaps:1 failedReduces:0
> > > >
> > > > 2016-11-25 15:16:28,357 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMCommunicator:
> > > History url is http://hadoopclusterslic73.ad.
> > infosys.com:19888/jobhistory/
> > > job/job_1479580915733_0167
> > > > 2016-11-25 15:16:28,373 INFO
> > > [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator:
> > Waiting
> > > for application to be successfully unregistered.
> > > > 2016-11-25 15:16:29,375 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > > v2.app.rm.RMContainerAllocator: Final Stats: PendingReds:1
> > > ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0
> > > CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:1
> > > RackLocal:0
> > > > 2016-11-25 15:16:29,377 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.MRAppMaster:
> > > Deleting staging directory hdfs://SLICHDP /user/hdfs/.staging/job_
> > > 1479580915733_0167
> > > > 2016-11-25 15:16:29,380 INFO [Thread-74]
> org.apache.hadoop.ipc.Server:
> > > Stopping server on 57220
> > > > 2016-11-25 15:16:29,387 INFO [TaskHeartbeatHandler PingChecker]
> > > org.apache.hadoop.mapreduce.v2.app.TaskHeartbeatHandler:
> > > TaskHeartbeatHandler thread interrupted
> > > > 2016-11-25 15:16:29,387 INFO [IPC Server listener on 57220]
> > > org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 57220
> > > >
> > > >
> > > > On Fri, Nov 25, 2016 at 7:36 PM, ShaoFeng Shi <
> shaofengshi@apache.org>
> > > > wrote:
> > > >
> > > >> Didn't hear of that. Hive table's file format is transparent for
> > Kylin;
> > > >> Even if the table is a view, Kylin can build from it.
> > > >>
> > > >> What's the detail error you got when using ORC table? If you can
> > provide
> > > >> the detail information, that would be better.
> > > >>
> > > >> 2016-11-25 18:22 GMT+08:00 suresh m <su...@gmail.com>:
> > > >>
> > > >> > Hi Facing an issue where i can able to build cube with text format
> > but
> > > >> > unable to building cube with ORC tables.
> > > >> >
> > > >> > Let me know kylin having any issues with ORC format.?
> > > >> >
> > > >> >  Hive having limitation that Text format tables not having
> > possibility
> > > >> to
> > > >> > enabling ACID properties since text format not supporting ACID.
> But
> > > for
> > > >> me
> > > >> > ACID properties is important to handle my data, this i can do with
> > ORC
> > > >> but
> > > >> > kylin throwing errors with ORC format.
> > > >> >
> > > >> >
> > > >> > Regards,
> > > >> > Suresh
> > > >> >
> > > >>
> > > >>
> > > >>
> > > >> --
> > > >> Best regards,
> > > >>
> > > >> Shaofeng Shi 史少锋
> > > >>
> > > >
> > > >
> > >
> >
> >
> >
> > --
> > Best regards,
> >
> > Shaofeng Shi 史少锋
> >
>



-- 
Best regards,

Shaofeng Shi 史少锋
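
The repeated "Value not exists!" in the traces above is raised by the dictionary lookup
(Dictionary.getIdFromValueBytes via TrieDictionary.getIdFromValueImpl): a dimension value
arriving at the in-memory cube build cannot be found in the dictionary that was built in an
earlier step. The sketch below is a minimal, self-contained illustration of that failure mode,
not Kylin's actual implementation; it assumes a plain sorted-array dictionary, and the class
name and sample values are made up. It only shows why a value that is visible in the build
input but was not in the dictionary snapshot (one possibility when the source is an ACID ORC
table whose contents change between steps) cannot be encoded.

import java.util.Arrays;
import java.util.List;
import java.util.TreeSet;

// Illustrative sketch only; names and values are hypothetical, not Kylin source.
public class DictionaryLookupSketch {

    // Immutable sorted dictionary: value -> id by binary search.
    static final class SortedDictionary {
        private final String[] values;

        SortedDictionary(List<String> snapshot) {
            this.values = new TreeSet<>(snapshot).toArray(new String[0]);
        }

        int getIdFromValue(String value) {
            int id = Arrays.binarySearch(values, value);
            if (id < 0) {
                // Mirrors the IllegalArgumentException("Value not exists!") seen in the trace.
                throw new IllegalArgumentException("Value not exists: " + value);
            }
            return id;
        }
    }

    public static void main(String[] args) {
        // Dictionary built from the values that were visible when the dictionary step ran.
        SortedDictionary dict = new SortedDictionary(Arrays.asList("CRITICAL", "MAJOR", "MINOR"));

        System.out.println(dict.getIdFromValue("MAJOR"));   // encodes fine: value was in the snapshot

        // A value present only in the cube-build input (for example a row added or updated
        // after the dictionary was built) cannot be encoded and fails the mapper, as logged above:
        System.out.println(dict.getIdFromValue("WARNING")); // throws IllegalArgumentException
    }
}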

Re: Is kylin not support Hive ORC tables with ACID properties.

Posted by suresh m <su...@gmail.com>.
Hi ShaoFeng Shi,

Thanks for your reply, but my case is different. I am able to create and
build the cube with the same data using text formatted tables, but when I
try to build the cube with ORC formatted tables holding the same data, the build fails.

Regards,
Suresh

On Mon, Nov 28, 2016 at 7:31 PM, ShaoFeng Shi <sh...@apache.org>
wrote:

> Hi Suresh,
>
> Another user also got a similar problem, and I replied in the user@ group;
> just a minute ago I forwarded it to the dev@ group. Please take a look and
> let me know whether it is the same:
> http://apache-kylin.74782.x6.nabble.com/Fwd-Re-org-apache-
> kylin-dict-TrieDictionary-Not-a-valid-value-td6428.html
>
> 2016-11-28 19:38 GMT+08:00 suresh m <su...@gmail.com>:
>
> > Can someone look at the log and help me understand the exact issue I am facing with ORC
> > formatted tables? Why am I unable to build the cube successfully with ORC
> > formatted tables?
> >
> > On Mon, Nov 28, 2016 at 10:47 AM, suresh m <su...@gmail.com>
> wrote:
> >
> > > Please find detail as requested,
> > >
> > > Log Type: syslog
> > >
> > > Log Upload Time: Fri Nov 25 15:16:35 +0530 2016
> > >
> > > Log Length: 107891
> > >
> > > 2016-11-25 15:15:35,185 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > Created MRAppMaster for application appattempt_1479580915733_0167_000001
> > > 2016-11-25 15:15:35,592 WARN [main] org.apache.hadoop.util.
> NativeCodeLoader:
> > Unable to load native-hadoop library for your platform... using
> > builtin-java classes where applicable
> > > 2016-11-25 15:15:35,630 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > Executing with tokens:
> > > 2016-11-25 15:15:35,956 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id
> {
> > id: 167 cluster_timestamp: 1479580915733 } attemptId: 1 } keyId:
> 2128280969)
> > > 2016-11-25 15:15:35,974 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > Using mapred newApiCommitter.
> > > 2016-11-25 15:15:35,976 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > OutputCommitter set in config null
> > > 2016-11-25 15:15:36,029 INFO [main] org.apache.hadoop.mapreduce.
> > lib.output.FileOutputCommitter: File Output Committer Algorithm version
> > is 1
> > > 2016-11-25 15:15:36,029 INFO [main] org.apache.hadoop.mapreduce.
> > lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup
> > _temporary folders under output directory:false, ignore cleanup failures:
> > false
> > > 2016-11-25 15:15:36,692 WARN [main] org.apache.hadoop.hdfs.
> shortcircuit.DomainSocketFactory:
> > The short-circuit local reads feature cannot be used because libhadoop
> > cannot be loaded.
> > > 2016-11-25 15:15:36,702 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > OutputCommitter is org.apache.hadoop.mapreduce.
> > lib.output.FileOutputCommitter
> > > 2016-11-25 15:15:36,891 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.jobhistory.EventType for
> > class org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler
> > > 2016-11-25 15:15:36,891 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.
> v2.app.job.event.JobEventType
> > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > JobEventDispatcher
> > > 2016-11-25 15:15:36,892 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.
> v2.app.job.event.TaskEventType
> > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > TaskEventDispatcher
> > > 2016-11-25 15:15:36,893 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.v2.app.job.event.
> TaskAttemptEventType
> > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > TaskAttemptEventDispatcher
> > > 2016-11-25 15:15:36,893 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.v2.app.commit.
> CommitterEventType
> > for class org.apache.hadoop.mapreduce.v2.app.commit.
> CommitterEventHandler
> > > 2016-11-25 15:15:36,894 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.
> v2.app.speculate.Speculator$EventType
> > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > SpeculatorEventDispatcher
> > > 2016-11-25 15:15:36,894 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.
> > v2.app.rm.ContainerAllocator$EventType for class
> > org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerAllocatorRouter
> > > 2016-11-25 15:15:36,895 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.v2.app.launcher.
> ContainerLauncher$EventType
> > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > ContainerLauncherRouter
> > > 2016-11-25 15:15:36,923 INFO [main] org.apache.hadoop.mapreduce.
> v2.jobhistory.JobHistoryUtils:
> > Default file system is set solely by core-default.xml therefore -
> ignoring
> > > 2016-11-25 15:15:36,945 INFO [main] org.apache.hadoop.mapreduce.
> v2.jobhistory.JobHistoryUtils:
> > Default file system is set solely by core-default.xml therefore -
> ignoring
> > > 2016-11-25 15:15:36,967 INFO [main] org.apache.hadoop.mapreduce.
> v2.jobhistory.JobHistoryUtils:
> > Default file system is set solely by core-default.xml therefore -
> ignoring
> > > 2016-11-25 15:15:37,029 INFO [main] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Emitting job history data to the
> > timeline server is not enabled
> > > 2016-11-25 15:15:37,064 INFO [main] org.apache.hadoop.yarn.event.
> AsyncDispatcher:
> > Registering class org.apache.hadoop.mapreduce.v2.app.job.event.
> JobFinishEvent$Type
> > for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > JobFinishEventHandler
> > > 2016-11-25 15:15:37,204 WARN [main] org.apache.hadoop.metrics2.
> impl.MetricsConfig:
> > Cannot locate configuration: tried hadoop-metrics2-mrappmaster.
> > properties,hadoop-metrics2.properties
> > > 2016-11-25 15:15:37,300 INFO [main] org.apache.hadoop.metrics2.
> impl.MetricsSystemImpl:
> > Scheduled snapshot period at 10 second(s).
> > > 2016-11-25 15:15:37,300 INFO [main] org.apache.hadoop.metrics2.
> impl.MetricsSystemImpl:
> > MRAppMaster metrics system started
> > > 2016-11-25 15:15:37,308 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.job.impl.JobImpl:
> > Adding job token for job_1479580915733_0167 to jobTokenSecretManager
> > > 2016-11-25 15:15:37,452 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.job.impl.JobImpl:
> > Not uberizing job_1479580915733_0167 because: not enabled; too much RAM;
> > > 2016-11-25 15:15:37,468 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.job.impl.JobImpl:
> > Input size for job job_1479580915733_0167 = 223589. Number of splits = 1
> > > 2016-11-25 15:15:37,469 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.job.impl.JobImpl:
> > Number of reduces for job job_1479580915733_0167 = 1
> > > 2016-11-25 15:15:37,469 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.job.impl.JobImpl:
> > job_1479580915733_0167Job Transitioned from NEW to INITED
> > > 2016-11-25 15:15:37,470 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > MRAppMaster launching normal, non-uberized, multi-container job
> > job_1479580915733_0167.
> > > 2016-11-25 15:15:37,493 INFO [main] org.apache.hadoop.ipc.
> CallQueueManager:
> > Using callQueue: class java.util.concurrent.LinkedBlockingQueue
> > scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
> > > 2016-11-25 15:15:37,506 INFO [Socket Reader #1 for port 60945]
> > org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 60945
> > > 2016-11-25 15:15:37,525 INFO [main] org.apache.hadoop.yarn.
> > factories.impl.pb.RpcServerFactoryPBImpl: Adding protocol
> > org.apache.hadoop.mapreduce.v2.api.MRClientProtocolPB to the server
> > > 2016-11-25 15:15:37,527 INFO [IPC Server Responder]
> > org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> > > 2016-11-25 15:15:37,529 INFO [IPC Server listener on 60945]
> > org.apache.hadoop.ipc.Server: IPC Server listener on 60945: starting
> > > 2016-11-25 15:15:37,529 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.client.MRClientService:
> > Instantiated MRClientService at hadoopclusterslic73.ad.
> > infosys.com/10.122.97.73:60945
> > > 2016-11-25 15:15:37,614 INFO [main] org.mortbay.log: Logging to
> > org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> > org.mortbay.log.Slf4jLog
> > > 2016-11-25 15:15:37,623 INFO [main] org.apache.hadoop.security.
> > authentication.server.AuthenticationFilter: Unable to initialize
> > FileSignerSecretProvider, falling back to use random secrets.
> > > 2016-11-25 15:15:37,628 WARN [main] org.apache.hadoop.http.
> HttpRequestLog:
> > Jetty request log can only be enabled using Log4j
> > > 2016-11-25 15:15:37,636 INFO [main] org.apache.hadoop.http.
> HttpServer2:
> > Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$
> > QuotingInputFilter)
> > > 2016-11-25 15:15:37,688 INFO [main] org.apache.hadoop.http.
> HttpServer2:
> > Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.
> > server.webproxy.amfilter.AmIpFilter) to context mapreduce
> > > 2016-11-25 15:15:37,688 INFO [main] org.apache.hadoop.http.
> HttpServer2:
> > Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.
> > server.webproxy.amfilter.AmIpFilter) to context static
> > > 2016-11-25 15:15:37,691 INFO [main] org.apache.hadoop.http.
> HttpServer2:
> > adding path spec: /mapreduce/*
> > > 2016-11-25 15:15:37,692 INFO [main] org.apache.hadoop.http.
> HttpServer2:
> > adding path spec: /ws/*
> > > 2016-11-25 15:15:38,181 INFO [main] org.apache.hadoop.yarn.webapp.
> WebApps:
> > Registered webapp guice modules
> > > 2016-11-25 15:15:38,183 INFO [main] org.apache.hadoop.http.
> HttpServer2:
> > Jetty bound to port 34311
> > > 2016-11-25 15:15:38,183 INFO [main] org.mortbay.log: jetty-6.1.26.hwx
> > > 2016-11-25 15:15:38,263 INFO [main] org.mortbay.log: Extract
> > jar:file:/hadoop/yarn/local/filecache/16/mapreduce.tar.gz/
> > hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.2.5.
> > 0.0-1245.jar!/webapps/mapreduce to /hadoop/yarn/local/usercache/
> > hdfs/appcache/application_1479580915733_0167/container_
> > e125_1479580915733_0167_01_000001/tmp/Jetty_0_0_0_0_
> > 34311_mapreduce____2ncvaf/webapp
> > > 2016-11-25 15:15:39,882 INFO [main] org.mortbay.log: Started
> HttpServer2$
> > SelectChannelConnectorWithSafeStartup@0.0.0.0:34311
> > > 2016-11-25 15:15:39,882 INFO [main] org.apache.hadoop.yarn.webapp.
> WebApps:
> > Web app mapreduce started at 34311
> > > 2016-11-25 15:15:39,933 INFO [main] org.apache.hadoop.ipc.
> CallQueueManager:
> > Using callQueue: class java.util.concurrent.LinkedBlockingQueue
> > scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
> > > 2016-11-25 15:15:39,936 INFO [Socket Reader #1 for port 57220]
> > org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 57220
> > > 2016-11-25 15:15:39,943 INFO [IPC Server listener on 57220]
> > org.apache.hadoop.ipc.Server: IPC Server listener on 57220: starting
> > > 2016-11-25 15:15:39,953 INFO [IPC Server Responder]
> > org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> > > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerRequestor: nodeBlacklistingEnabled:true
> > > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerRequestor: maxTaskFailuresPerNode is 3
> > > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerRequestor: blacklistDisablePercent is 33
> > > 2016-11-25 15:15:40,082 WARN [main] org.apache.hadoop.ipc.Client:
> Failed
> > to connect to server: hadoopclusterslic71.ad.
> infosys.com/10.122.97.71:8030:
> > retries get failed due to exceeded maximum allowed retries number: 0
> > > java.net.ConnectException: Connection refused
> > >       at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> > >       at sun.nio.ch.SocketChannelImpl.finishConnect(
> > SocketChannelImpl.java:717)
> > >       at org.apache.hadoop.net.SocketIOWithTimeout.connect(
> > SocketIOWithTimeout.java:206)
> > >       at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
> > >       at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
> > >       at org.apache.hadoop.ipc.Client$Connection.setupConnection(
> > Client.java:650)
> > >       at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(
> > Client.java:745)
> > >       at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.
> > java:397)
> > >       at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
> > >       at org.apache.hadoop.ipc.Client.call(Client.java:1449)
> > >       at org.apache.hadoop.ipc.Client.call(Client.java:1396)
> > >       at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.
> > invoke(ProtobufRpcEngine.java:233)
> > >       at com.sun.proxy.$Proxy80.registerApplicationMaster(Unknown
> > Source)
> > >       at org.apache.hadoop.yarn.api.impl.pb.client.
> > ApplicationMasterProtocolPBClientImpl.registerApplicationMaster(
> > ApplicationMasterProtocolPBClientImpl.java:106)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke(
> > NativeMethodAccessorImpl.java:62)
> > >       at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> > DelegatingMethodAccessorImpl.java:43)
> > >       at java.lang.reflect.Method.invoke(Method.java:497)
> > >       at org.apache.hadoop.io.retry.RetryInvocationHandler.
> invokeMethod(
> > RetryInvocationHandler.java:278)
> > >       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
> > RetryInvocationHandler.java:194)
> > >       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
> > RetryInvocationHandler.java:176)
> > >       at com.sun.proxy.$Proxy81.registerApplicationMaster(Unknown
> > Source)
> > >       at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.
> > register(RMCommunicator.java:160)
> > >       at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.
> > serviceStart(RMCommunicator.java:121)
> > >       at org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator.
> > serviceStart(RMContainerAllocator.java:250)
> > >       at org.apache.hadoop.service.AbstractService.start(
> > AbstractService.java:193)
> > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> > ContainerAllocatorRouter.serviceStart(MRAppMaster.java:881)
> > >       at org.apache.hadoop.service.AbstractService.start(
> > AbstractService.java:193)
> > >       at org.apache.hadoop.service.CompositeService.serviceStart(
> > CompositeService.java:120)
> > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.
> > serviceStart(MRAppMaster.java:1151)
> > >       at org.apache.hadoop.service.AbstractService.start(
> > AbstractService.java:193)
> > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$5.run(
> > MRAppMaster.java:1557)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.
> > initAndStartAppMaster(MRAppMaster.java:1553)
> > >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(
> > MRAppMaster.java:1486)
> > > 2016-11-25 15:15:40,089 INFO [main] org.apache.hadoop.yarn.client.
> > ConfiguredRMFailoverProxyProvider: Failing over to rm2
> > > 2016-11-25 15:15:40,183 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.rm.RMCommunicator:
> > maxContainerCapability: <memory:28672, vCores:3>
> > > 2016-11-25 15:15:40,183 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.rm.RMCommunicator:
> > queue: default
> > > 2016-11-25 15:15:40,186 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.launcher.ContainerLauncherImpl: Upper limit on the thread pool
> > size is 500
> > > 2016-11-25 15:15:40,186 INFO [main] org.apache.hadoop.mapreduce.
> > v2.app.launcher.ContainerLauncherImpl: The thread pool initial size is
> 10
> > > 2016-11-25 15:15:40,189 INFO [main] org.apache.hadoop.yarn.client.
> > api.impl.ContainerManagementProtocolProxy: yarn.client.max-cached-
> nodemanagers-proxies
> > : 0
> > > 2016-11-25 15:15:40,202 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > job_1479580915733_0167Job Transitioned from INITED to SETUP
> > > 2016-11-25 15:15:40,212 INFO [CommitterEvent Processor #0]
> > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > Processing the event EventType: JOB_SETUP
> > > 2016-11-25 15:15:40,226 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > job_1479580915733_0167Job Transitioned from SETUP to RUNNING
> > > 2016-11-25 15:15:40,291 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic72.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:15:40,328 INFO [eventHandlingThread]
> > org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Event
> > Writer setup for JobId: job_1479580915733_0167, File:
> > hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_
> > 0167/job_1479580915733_0167_1.jhist
> > > 2016-11-25 15:15:40,351 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic73.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:15:40,357 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > task_1479580915733_0167_m_000000 Task Transitioned from NEW to SCHEDULED
> > > 2016-11-25 15:15:40,358 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > task_1479580915733_0167_r_000000 Task Transitioned from NEW to SCHEDULED
> > > 2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from NEW
> > to UNASSIGNED
> > > 2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from NEW
> > to UNASSIGNED
> > > 2016-11-25 15:15:40,401 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerAllocator: mapResourceRequest:<memory:3072,
> vCores:1>
> > > 2016-11-25 15:15:40,416 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerAllocator: reduceResourceRequest:<memory:4096,
> > vCores:1>
> > > 2016-11-25 15:15:41,191 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:0 ContRel:0
> > HostLocal:0 RackLocal:0
> > > 2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > getResources() for application_1479580915733_0167: ask=4 release= 0
> > newContainers=0 finishedContainers=0 resourcelimit=<memory:38912,
> vCores:1>
> > knownNMs=2
> > > 2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> Recalculating
> > schedule, headroom=<memory:38912, vCores:1>
> > > 2016-11-25 15:15:41,226 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> > start threshold not met. completedMapsForReduceSlowstart 1
> > > 2016-11-25 15:15:42,235 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
> allocated
> > containers 1
> > > 2016-11-25 15:15:42,237 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > container container_e125_1479580915733_0167_01_000002 to
> > attempt_1479580915733_0167_m_000000_0
> > > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> Recalculating
> > schedule, headroom=<memory:34816, vCores:0>
> > > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> > start threshold not met. completedMapsForReduceSlowstart 1
> > > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:15:42,286 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic73.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:15:42,311 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The job-jar
> > file on the remote FS is hdfs://SLICHDP/user/hdfs/.
> > staging/job_1479580915733_0167/job.jar
> > > 2016-11-25 15:15:42,315 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The
> job-conf
> > file on the remote FS is /user/hdfs/.staging/job_
> > 1479580915733_0167/job.xml
> > > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Adding #0
> > tokens and #1 secret keys for NM use for launching container
> > > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Size of
> > containertokens_dob is 1
> > > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Putting
> > shuffle token in serviceData
> > > 2016-11-25 15:15:42,441 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > UNASSIGNED to ASSIGNED
> > > 2016-11-25 15:15:42,455 INFO [ContainerLauncher #0]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> > container_e125_1479580915733_0167_01_000002 taskAttempt
> > attempt_1479580915733_0167_m_000000_0
> > > 2016-11-25 15:15:42,457 INFO [ContainerLauncher #0]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Launching attempt_1479580915733_0167_m_000000_0
> > > 2016-11-25 15:15:42,457 INFO [ContainerLauncher #0]
> > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> > Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
> > > 2016-11-25 15:15:42,531 INFO [ContainerLauncher #0]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Shuffle port returned by ContainerManager for
> attempt_1479580915733_0167_m_000000_0
> > : 13562
> > > 2016-11-25 15:15:42,533 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> TaskAttempt:
> > [attempt_1479580915733_0167_m_000000_0] using containerId:
> > [container_e125_1479580915733_0167_01_000002 on NM: [
> > hadoopclusterslic73.ad.infosys.com:45454]
> > > 2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > ASSIGNED to RUNNING
> > > 2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > task_1479580915733_0167_m_000000 Task Transitioned from SCHEDULED to
> > RUNNING
> > > 2016-11-25 15:15:43,241 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > getResources() for application_1479580915733_0167: ask=4 release= 0
> > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> vCores:0>
> > knownNMs=2
> > > 2016-11-25 15:15:44,790 INFO [Socket Reader #1 for port 57220]
> > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> > job_1479580915733_0167 (auth:SIMPLE)
> > > 2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> > jvm_1479580915733_0167_m_137438953472002 asked for a task
> > > 2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> > jvm_1479580915733_0167_m_137438953472002 given task:
> > attempt_1479580915733_0167_m_000000_0
> > > 2016-11-25 15:15:51,923 INFO [IPC Server handler 12 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> TaskAttempt
> > attempt_1479580915733_0167_m_000000_0 is : 0.667
> > > 2016-11-25 15:15:52,099 INFO [IPC Server handler 5 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> TaskAttempt
> > attempt_1479580915733_0167_m_000000_0 is : 0.667
> > > 2016-11-25 15:15:52,137 ERROR [IPC Server handler 12 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> > attempt_1479580915733_0167_m_000000_0 - exited : java.io.IOException:
> > Failed to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > java.io.IOException: java.lang.IllegalArgumentException: Value not
> exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:15:52,138 INFO [IPC Server handler 12 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
> from
> > attempt_1479580915733_0167_m_000000_0: Error: java.io.IOException:
> Failed
> > to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > java.io.IOException: java.lang.IllegalArgumentException: Value not
> exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:15:52,141 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> > report from attempt_1479580915733_0167_m_000000_0: Error:
> > java.io.IOException: Failed to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > java.io.IOException: java.lang.IllegalArgumentException: Value not
> exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:15:52,142 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > RUNNING to FAIL_CONTAINER_CLEANUP
> > > 2016-11-25 15:15:52,155 INFO [ContainerLauncher #1]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> > container_e125_1479580915733_0167_01_000002 taskAttempt
> > attempt_1479580915733_0167_m_000000_0
> > > 2016-11-25 15:15:52,156 INFO [ContainerLauncher #1]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > KILLING attempt_1479580915733_0167_m_000000_0
> > > 2016-11-25 15:15:52,156 INFO [ContainerLauncher #1]
> > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> > Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
> > > 2016-11-25 15:15:52,195 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > > 2016-11-25 15:15:52,204 INFO [CommitterEvent Processor #1]
> > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > Processing the event EventType: TASK_ABORT
> > > 2016-11-25 15:15:52,215 WARN [CommitterEvent Processor #1]
> > org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> > delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_0
> > > 2016-11-25 15:15:52,218 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> > FAIL_TASK_CLEANUP to FAILED
> > > 2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic72.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic73.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:15:52,226 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from NEW
> > to UNASSIGNED
> > > 2016-11-25 15:15:52,226 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerRequestor: 1 failures on node
> hadoopclusterslic73.ad.
> > infosys.com
> > > 2016-11-25 15:15:52,230 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_
> 000000_1
> > to list of failed maps
> > > 2016-11-25 15:15:52,291 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > getResources() for application_1479580915733_0167: ask=1 release= 0
> > newContainers=0 finishedContainers=1 resourcelimit=<memory:38912,
> vCores:1>
> > knownNMs=2
> > > 2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> > completed container container_e125_1479580915733_0167_01_000002
> > > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> Recalculating
> > schedule, headroom=<memory:38912, vCores:1>
> > > 2016-11-25 15:15:52,300 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> > report from attempt_1479580915733_0167_m_000000_0: Container killed by
> > the ApplicationMaster.
> > > Container killed on request. Exit code is 143
> > > Container exited with a non-zero exit code 143
> > >
> > > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> > start threshold not met. completedMapsForReduceSlowstart 1
> > > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
> allocated
> > containers 1
> > > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> > container Container: [ContainerId: container_e125_1479580915733_
> 0167_01_000003,
> > NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> > hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> > vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> > 10.122.97.72:45454 }, ] to fast fail map
> > > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> from
> > earlierFailedMaps
> > > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > container container_e125_1479580915733_0167_01_000003 to
> > attempt_1479580915733_0167_m_000000_1
> > > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> Recalculating
> > schedule, headroom=<memory:34816, vCores:0>
> > > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> > start threshold not met. completedMapsForReduceSlowstart 1
> > > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic72.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > UNASSIGNED to ASSIGNED
> > > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> > container_e125_1479580915733_0167_01_000003 taskAttempt
> > attempt_1479580915733_0167_m_000000_1
> > > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Launching attempt_1479580915733_0167_m_000000_1
> > > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> > Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > 2016-11-25 15:15:53,318 INFO [ContainerLauncher #2]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Shuffle port returned by ContainerManager for
> attempt_1479580915733_0167_m_000000_1
> > : 13562
> > > 2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> TaskAttempt:
> > [attempt_1479580915733_0167_m_000000_1] using containerId:
> > [container_e125_1479580915733_0167_01_000003 on NM: [
> > hadoopclusterslic72.ad.infosys.com:45454]
> > > 2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > ASSIGNED to RUNNING
> > > 2016-11-25 15:15:54,309 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > getResources() for application_1479580915733_0167: ask=1 release= 0
> > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> vCores:0>
> > knownNMs=2
> > > 2016-11-25 15:15:55,797 INFO [Socket Reader #1 for port 57220]
> > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> > job_1479580915733_0167 (auth:SIMPLE)
> > > 2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> > jvm_1479580915733_0167_m_137438953472003 asked for a task
> > > 2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> > jvm_1479580915733_0167_m_137438953472003 given task:
> > attempt_1479580915733_0167_m_000000_1
> > > 2016-11-25 15:16:02,857 INFO [IPC Server handler 8 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> TaskAttempt
> > attempt_1479580915733_0167_m_000000_1 is : 0.667
> > > 2016-11-25 15:16:03,332 INFO [IPC Server handler 13 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> TaskAttempt
> > attempt_1479580915733_0167_m_000000_1 is : 0.667
> > > 2016-11-25 15:16:03,347 ERROR [IPC Server handler 8 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> > attempt_1479580915733_0167_m_000000_1 - exited : java.io.IOException:
> > Failed to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:03,347 INFO [IPC Server handler 8 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
> from
> > attempt_1479580915733_0167_m_000000_1: Error: java.io.IOException:
> Failed
> > to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:03,349 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> > report from attempt_1479580915733_0167_m_000000_1: Error:
> > java.io.IOException: Failed to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:03,350 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > RUNNING to FAIL_CONTAINER_CLEANUP
> > > 2016-11-25 15:16:03,351 INFO [ContainerLauncher #3]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> > container_e125_1479580915733_0167_01_000003 taskAttempt
> > attempt_1479580915733_0167_m_000000_1
> > > 2016-11-25 15:16:03,355 INFO [ContainerLauncher #3]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > KILLING attempt_1479580915733_0167_m_000000_1
> > > 2016-11-25 15:16:03,355 INFO [ContainerLauncher #3]
> > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> > Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > 2016-11-25 15:16:03,369 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > > 2016-11-25 15:16:03,369 INFO [CommitterEvent Processor #2]
> > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > Processing the event EventType: TASK_ABORT
> > > 2016-11-25 15:16:03,375 WARN [CommitterEvent Processor #2]
> > org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> > delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_1
> > > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> > FAIL_TASK_CLEANUP to FAILED
> > > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic72.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic73.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:16:03,376 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerRequestor: 1 failures on node
> hadoopclusterslic72.ad.
> > infosys.com
> > > 2016-11-25 15:16:03,376 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from NEW
> > to UNASSIGNED
> > > 2016-11-25 15:16:03,380 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_
> 000000_2
> > to list of failed maps
> > > 2016-11-25 15:16:04,341 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > getResources() for application_1479580915733_0167: ask=1 release= 0
> > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> vCores:0>
> > knownNMs=2
> > > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> Recalculating
> > schedule, headroom=<memory:34816, vCores:0>
> > > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> > start threshold not met. completedMapsForReduceSlowstart 1
> > > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> > completed container container_e125_1479580915733_0167_01_000003
> > > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
> allocated
> > containers 1
> > > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> > container Container: [ContainerId: container_e125_1479580915733_
> 0167_01_000004,
> > NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> > hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> > vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> > 10.122.97.72:45454 }, ] to fast fail map
> > > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> from
> > earlierFailedMaps
> > > 2016-11-25 15:16:05,352 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> > report from attempt_1479580915733_0167_m_000000_1: Container killed by
> > the ApplicationMaster.
> > > Container killed on request. Exit code is 143
> > > Container exited with a non-zero exit code 143
> > >
> > > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > container container_e125_1479580915733_0167_01_000004 to
> > attempt_1479580915733_0167_m_000000_2
> > > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> Recalculating
> > schedule, headroom=<memory:34816, vCores:0>
> > > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> > start threshold not met. completedMapsForReduceSlowstart 1
> > > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:16:05,353 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic72.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:16:05,354 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > UNASSIGNED to ASSIGNED
> > > 2016-11-25 15:16:05,356 INFO [ContainerLauncher #4]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> > container_e125_1479580915733_0167_01_000004 taskAttempt
> > attempt_1479580915733_0167_m_000000_2
> > > 2016-11-25 15:16:05,356 INFO [ContainerLauncher #4]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Launching attempt_1479580915733_0167_m_000000_2
> > > 2016-11-25 15:16:05,357 INFO [ContainerLauncher #4]
> > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> > Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > 2016-11-25 15:16:05,371 INFO [ContainerLauncher #4]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Shuffle port returned by ContainerManager for
> attempt_1479580915733_0167_m_000000_2
> > : 13562
> > > 2016-11-25 15:16:05,371 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> TaskAttempt:
> > [attempt_1479580915733_0167_m_000000_2] using containerId:
> > [container_e125_1479580915733_0167_01_000004 on NM: [
> > hadoopclusterslic72.ad.infosys.com:45454]
> > > 2016-11-25 15:16:05,372 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > ASSIGNED to RUNNING
> > > 2016-11-25 15:16:06,362 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > getResources() for application_1479580915733_0167: ask=1 release= 0
> > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> vCores:0>
> > knownNMs=2
> > > 2016-11-25 15:16:07,537 INFO [Socket Reader #1 for port 57220]
> > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> > job_1479580915733_0167 (auth:SIMPLE)
> > > 2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> > jvm_1479580915733_0167_m_137438953472004 asked for a task
> > > 2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> > jvm_1479580915733_0167_m_137438953472004 given task:
> > attempt_1479580915733_0167_m_000000_2
> > > 2016-11-25 15:16:14,753 INFO [IPC Server handler 6 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> TaskAttempt
> > attempt_1479580915733_0167_m_000000_2 is : 0.667
> > > 2016-11-25 15:16:15,241 INFO [IPC Server handler 13 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> TaskAttempt
> > attempt_1479580915733_0167_m_000000_2 is : 0.667
> > > 2016-11-25 15:16:15,258 ERROR [IPC Server handler 8 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> > attempt_1479580915733_0167_m_000000_2 - exited : java.io.IOException:
> > Failed to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:15,258 INFO [IPC Server handler 8 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
> from
> > attempt_1479580915733_0167_m_000000_2: Error: java.io.IOException:
> Failed
> > to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:15,261 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> > report from attempt_1479580915733_0167_m_000000_2: Error:
> > java.io.IOException: Failed to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:15,273 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > RUNNING to FAIL_CONTAINER_CLEANUP
> > > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> > container_e125_1479580915733_0167_01_000004 taskAttempt
> > attempt_1479580915733_0167_m_000000_2
> > > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > KILLING attempt_1479580915733_0167_m_000000_2
> > > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> > Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > 2016-11-25 15:16:15,289 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > > 2016-11-25 15:16:15,292 INFO [CommitterEvent Processor #3]
> > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > Processing the event EventType: TASK_ABORT
> > > 2016-11-25 15:16:15,300 WARN [CommitterEvent Processor #3]
> > org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> > delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_2
> > > 2016-11-25 15:16:15,300 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> > FAIL_TASK_CLEANUP to FAILED
> > > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic72.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic73.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from NEW
> > to UNASSIGNED
> > > 2016-11-25 15:16:15,301 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerRequestor: 2 failures on node
> hadoopclusterslic72.ad.
> > infosys.com
> > > 2016-11-25 15:16:15,307 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_
> 000000_3
> > to list of failed maps
> > > 2016-11-25 15:16:15,412 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > getResources() for application_1479580915733_0167: ask=1 release= 0
> > newContainers=0 finishedContainers=1 resourcelimit=<memory:38912,
> vCores:1>
> > knownNMs=2
> > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> > completed container container_e125_1479580915733_0167_01_000004
> > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> Recalculating
> > schedule, headroom=<memory:38912, vCores:1>
> > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> > start threshold not met. completedMapsForReduceSlowstart 1
> > > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:16:15,421 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> > report from attempt_1479580915733_0167_m_000000_2: Container killed by
> > the ApplicationMaster.
> > > Container killed on request. Exit code is 143
> > > Container exited with a non-zero exit code 143
> > >
> > > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
> allocated
> > containers 1
> > > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> > container Container: [ContainerId: container_e125_1479580915733_
> 0167_01_000005,
> > NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> > hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> > vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> > 10.122.97.72:45454 }, ] to fast fail map
> > > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> from
> > earlierFailedMaps
> > > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> > container container_e125_1479580915733_0167_01_000005 to
> > attempt_1479580915733_0167_m_000000_3
> > > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
> Recalculating
> > schedule, headroom=<memory:34816, vCores:0>
> > > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> > start threshold not met. completedMapsForReduceSlowstart 1
> > > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> > Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
> > AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0
> > HostLocal:1 RackLocal:0
> > > 2016-11-25 15:16:16,433 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.yarn.util.RackResolver: Resolved
> hadoopclusterslic72.ad.
> > infosys.com to /default-rack
> > > 2016-11-25 15:16:16,434 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > UNASSIGNED to ASSIGNED
> > > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> > container_e125_1479580915733_0167_01_000005 taskAttempt
> > attempt_1479580915733_0167_m_000000_3
> > > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Launching attempt_1479580915733_0167_m_000000_3
> > > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> > Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > 2016-11-25 15:16:16,516 INFO [ContainerLauncher #6]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Shuffle port returned by ContainerManager for
> attempt_1479580915733_0167_m_000000_3
> > : 13562
> > > 2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> TaskAttempt:
> > [attempt_1479580915733_0167_m_000000_3] using containerId:
> > [container_e125_1479580915733_0167_01_000005 on NM: [
> > hadoopclusterslic72.ad.infosys.com:45454]
> > > 2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > ASSIGNED to RUNNING
> > > 2016-11-25 15:16:17,436 INFO [RMCommunicator Allocator]
> > org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> > getResources() for application_1479580915733_0167: ask=1 release= 0
> > newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
> vCores:0>
> > knownNMs=2
> > > 2016-11-25 15:16:19,664 INFO [Socket Reader #1 for port 57220]
> > SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> > job_1479580915733_0167 (auth:SIMPLE)
> > > 2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> > jvm_1479580915733_0167_m_137438953472005 asked for a task
> > > 2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> > jvm_1479580915733_0167_m_137438953472005 given task:
> > attempt_1479580915733_0167_m_000000_3
> > > 2016-11-25 15:16:27,222 INFO [IPC Server handler 13 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> TaskAttempt
> > attempt_1479580915733_0167_m_000000_3 is : 0.667
> > > 2016-11-25 15:16:27,952 INFO [IPC Server handler 7 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
> TaskAttempt
> > attempt_1479580915733_0167_m_000000_3 is : 0.667
> > > 2016-11-25 15:16:27,971 ERROR [IPC Server handler 11 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> > attempt_1479580915733_0167_m_000000_3 - exited : java.io.IOException:
> > Failed to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > java.io.IOException: java.lang.IllegalArgumentException: Value not
> exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:27,971 INFO [IPC Server handler 11 on 57220]
> > org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
> from
> > attempt_1479580915733_0167_m_000000_3: Error: java.io.IOException:
> Failed
> > to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > java.io.IOException: java.lang.IllegalArgumentException: Value not
> exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:27,974 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> > report from attempt_1479580915733_0167_m_000000_3: Error:
> > java.io.IOException: Failed to build cube in mapper 0
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:145)
> > >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> > >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
> java:787)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> > UserGroupInformation.java:1724)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > > Caused by: java.util.concurrent.ExecutionException:
> > java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> > java.lang.IllegalArgumentException: Value not exists!
> > >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> > >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> > >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> > cleanup(InMemCuboidMapper.java:143)
> > >       ... 8 more
> > > Caused by: java.lang.RuntimeException: java.io.IOException:
> > java.io.IOException: java.lang.IllegalArgumentException: Value not
> exists!
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:82)
> > >       at java.util.concurrent.Executors$RunnableAdapter.
> > call(Executors.java:511)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> > ThreadPoolExecutor.java:1142)
> > >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> > ThreadPoolExecutor.java:617)
> > >       at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: java.io.IOException: java.lang.
> IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:126)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> > build(DoggedCubeBuilder.java:73)
> > >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> > run(AbstractInMemCubeBuilder.java:80)
> > >       ... 5 more
> > > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> > Value not exists!
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.abort(DoggedCubeBuilder.java:194)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.checkException(DoggedCubeBuilder.java:167)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > BuildOnce.build(DoggedCubeBuilder.java:114)
> > >       ... 7 more
> > > Caused by: java.lang.IllegalArgumentException: Value not exists!
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> > Dictionary.java:162)
> > >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> > TrieDictionary.java:167)
> > >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> > Dictionary.java:98)
> > >       at org.apache.kylin.dimension.DictionaryDimEnc$
> > DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:121)
> > >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> > encodeColumnValue(CubeCodeSystem.java:110)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:93)
> > >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.
> java:81)
> > >       at org.apache.kylin.cube.inmemcubing.
> > InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> > .java:74)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:542)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> > InputConverter$1.next(InMemCubeBuilder.java:523)
> > >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> > GTAggregateScanner.java:139)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > createBaseCuboid(InMemCubeBuilder.java:339)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:166)
> > >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> > build(InMemCubeBuilder.java:135)
> > >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> > SplitThread.run(DoggedCubeBuilder.java:282)
> > >
> > > 2016-11-25 15:16:27,975 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > RUNNING to FAIL_CONTAINER_CLEANUP
> > > 2016-11-25 15:16:27,976 INFO [ContainerLauncher #7]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> > container_e125_1479580915733_0167_01_000005 taskAttempt
> > attempt_1479580915733_0167_m_000000_3
> > > 2016-11-25 15:16:27,997 INFO [ContainerLauncher #7]
> > org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> > KILLING attempt_1479580915733_0167_m_000000_3
> > > 2016-11-25 15:16:27,997 INFO [ContainerLauncher #7]
> > org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> > Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > > 2016-11-25 15:16:28,009 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > > 2016-11-25 15:16:28,011 INFO [CommitterEvent Processor #4]
> > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > Processing the event EventType: TASK_ABORT
> > > 2016-11-25 15:16:28,013 WARN [CommitterEvent Processor #4]
> > org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> > delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-
> > 4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_
> temporary/attempt_
> > 1479580915733_0167_m_000000_3
> > > 2016-11-25 15:16:28,014 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> > FAIL_TASK_CLEANUP to FAILED
> > > 2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > task_1479580915733_0167_m_000000 Task Transitioned from RUNNING to
> FAILED
> > > 2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Num completed
> Tasks:
> > 1
> > > 2016-11-25 15:16:28,027 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Job failed as tasks
> > failed. failedMaps:1 failedReduces:0
> > > 2016-11-25 15:16:28,027 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerRequestor: 3 failures on node
> hadoopclusterslic72.ad.
> > infosys.com
> > > 2016-11-25 15:16:28,027 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerRequestor: Blacklisted host hadoopclusterslic72.ad.
> > infosys.com
> > > 2016-11-25 15:16:28,032 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > job_1479580915733_0167Job Transitioned from RUNNING to FAIL_WAIT
> > > 2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > task_1479580915733_0167_r_000000 Task Transitioned from SCHEDULED to
> > KILL_WAIT
> > > 2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> > attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from
> > UNASSIGNED to KILLED
> > > 2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> > task_1479580915733_0167_r_000000 Task Transitioned from KILL_WAIT to
> > KILLED
> > > 2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > job_1479580915733_0167Job Transitioned from FAIL_WAIT to FAIL_ABORT
> > > 2016-11-25 15:16:28,037 INFO [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerAllocator: Processing the event EventType:
> > CONTAINER_DEALLOCATE
> > > 2016-11-25 15:16:28,037 ERROR [Thread-53] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerAllocator: Could not deallocate container for task
> > attemptId attempt_1479580915733_0167_r_000000_0
> > > 2016-11-25 15:16:28,043 INFO [CommitterEvent Processor #0]
> > org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> > Processing the event EventType: JOB_ABORT
> > > 2016-11-25 15:16:28,058 INFO [AsyncDispatcher event handler]
> > org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> > job_1479580915733_0167Job Transitioned from FAIL_ABORT to FAILED
> > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > We are finishing cleanly so this is the last retry
> > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > Notify RMCommunicator isAMLastRetry: true
> > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.rm.RMCommunicator:
> > RMCommunicator notified that shouldUnregistered is: true
> > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > Notify JHEH isAMLastRetry: true
> > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: JobHistoryEventHandler notified that
> > forceJobCompletion is true
> > > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > Calling stop for all the services
> > > 2016-11-25 15:16:28,093 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Stopping JobHistoryEventHandler. Size
> > of the outstanding queue size is 2
> > > 2016-11-25 15:16:28,097 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: In stop, writing event TASK_FAILED
> > > 2016-11-25 15:16:28,099 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: In stop, writing event JOB_FAILED
> > > 2016-11-25 15:16:28,177 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Copying
> hdfs://SLICHDP:8020/user/hdfs/
> > .staging/job_1479580915733_0167/job_1479580915733_0167_1.jhist to
> > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> > 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> > 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
> > > 2016-11-25 15:16:28,248 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Copied to done location:
> > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> > 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> > 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
> > > 2016-11-25 15:16:28,253 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Copying
> hdfs://SLICHDP:8020/user/hdfs/
> > .staging/job_1479580915733_0167/job_1479580915733_0167_1_conf.xml to
> > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_
> > tmp
> > > 2016-11-25 15:16:28,320 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Copied to done location:
> > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_
> > tmp
> > > 2016-11-25 15:16:28,338 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Moved tmp to done:
> > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> 1479580915733_0167.summary_tmp
> > to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> 1479580915733_0167.summary
> > > 2016-11-25 15:16:28,350 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Moved tmp to done:
> > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_
> tmp
> > to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_
> 1479580915733_0167_conf.xml
> > > 2016-11-25 15:16:28,353 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Moved tmp to done:
> > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> > 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> > 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp to
> > hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> > 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> > 1480067188027-0-0-FAILED-default-1480067140199.jhist
> > > 2016-11-25 15:16:28,353 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > jobhistory.JobHistoryEventHandler: Stopped JobHistoryEventHandler.
> > super.stop()
> > > 2016-11-25 15:16:28,357 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.rm.RMCommunicator:
> > Setting job diagnostics to Task failed task_1479580915733_0167_m_000000
> > > Job failed as tasks failed. failedMaps:1 failedReduces:0
> > >
> > > 2016-11-25 15:16:28,357 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.rm.RMCommunicator:
> > History url is http://hadoopclusterslic73.ad.
> infosys.com:19888/jobhistory/
> > job/job_1479580915733_0167
> > > 2016-11-25 15:16:28,373 INFO
> > [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator:
> Waiting
> > for application to be successfully unregistered.
> > > 2016-11-25 15:16:29,375 INFO [Thread-74] org.apache.hadoop.mapreduce.
> > v2.app.rm.RMContainerAllocator: Final Stats: PendingReds:1
> > ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0
> > CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:1
> > RackLocal:0
> > > 2016-11-25 15:16:29,377 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.MRAppMaster:
> > Deleting staging directory hdfs://SLICHDP /user/hdfs/.staging/job_
> > 1479580915733_0167
> > > 2016-11-25 15:16:29,380 INFO [Thread-74] org.apache.hadoop.ipc.Server:
> > Stopping server on 57220
> > > 2016-11-25 15:16:29,387 INFO [TaskHeartbeatHandler PingChecker]
> > org.apache.hadoop.mapreduce.v2.app.TaskHeartbeatHandler:
> > TaskHeartbeatHandler thread interrupted
> > > 2016-11-25 15:16:29,387 INFO [IPC Server listener on 57220]
> > org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 57220
> > >
> > >
> > > On Fri, Nov 25, 2016 at 7:36 PM, ShaoFeng Shi <sh...@apache.org>
> > > wrote:
> > >
> > >> Didn't hear of that. Hive table's file format is transparent for
> Kylin;
> > >> Even if the table is a view, Kylin can build from it.
> > >>
> > >> What's the detail error you got when using ORC table? If you can
> provide
> > >> the detail information, that would be better.
> > >>
> > >> 2016-11-25 18:22 GMT+08:00 suresh m <su...@gmail.com>:
> > >>
> > >> > Hi Facing an issue where i can able to build cube with text format
> but
> > >> > unable to building cube with ORC tables.
> > >> >
> > >> > Let me know kylin having any issues with ORC format.?
> > >> >
> > >> >  Hive having limitation that Text format tables not having
> possibility
> > >> to
> > >> > enabling ACID properties since text format not supporting ACID. But
> > for
> > >> me
> > >> > ACID properties is important to handle my data, this i can do with
> ORC
> > >> but
> > >> > kylin throwing errors with ORC format.
> > >> >
> > >> >
> > >> > Regards,
> > >> > Suresh
> > >> >
> > >>
> > >>
> > >>
> > >> --
> > >> Best regards,
> > >>
> > >> Shaofeng Shi 史少锋
> > >>
> > >
> > >
> >
>
>
>
> --
> Best regards,
>
> Shaofeng Shi 史少锋
>

Re: Is kylin not support Hive ORC tables with ACID properties.

Posted by ShaoFeng Shi <sh...@apache.org>.
Hi Suresh,

Another user also hit a similar problem, and I replied in the user@ group;
just a minute ago I forwarded it to the dev@ group. Please take a look and let me
know whether it is the same issue:
http://apache-kylin.74782.x6.nabble.com/Fwd-Re-org-apache-kylin-dict-TrieDictionary-Not-a-valid-value-td6428.html
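
One quick way to rule out the ACID read path is to stage the transactional
ORC table into a plain, non-transactional copy and build the cube from that
copy, so the dictionary step and the cuboid step read exactly the same data.
A minimal HiveQL sketch, assuming a hypothetical source table
default.orc_alert (substitute your real database and table names):

    -- Hypothetical names; run in the hive CLI or beeline.
    -- Stage the ACID ORC table into a non-transactional snapshot.
    DROP TABLE IF EXISTS default.orc_alert_snapshot;
    CREATE TABLE default.orc_alert_snapshot
    STORED AS ORC
    TBLPROPERTIES ('transactional'='false')
    AS SELECT * FROM default.orc_alert;

    -- Sanity check: the snapshot should match the source row count.
    SELECT COUNT(*) FROM default.orc_alert;
    SELECT COUNT(*) FROM default.orc_alert_snapshot;

If a build from the snapshot table succeeds while the build from the ACID
table keeps failing with "Value not exists!", that would point at the
transactional delta files being read inconsistently between the dictionary
and cube-build steps, rather than at the ORC format itself.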

2016-11-28 19:38 GMT+08:00 suresh m <su...@gmail.com>:

> can some see log and help me what is the exact issue facing with ORC
> formatted tables. Why i am unable build cube successfully with ORC
> formatted tables.
>
> On Mon, Nov 28, 2016 at 10:47 AM, suresh m <su...@gmail.com> wrote:
>
> > Please find detail as requested,
> >
> > Log Type: syslog
> >
> > Log Upload Time: Fri Nov 25 15:16:35 +0530 2016
> >
> > Log Length: 107891
> >
> > 2016-11-25 15:15:35,185 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> Created MRAppMaster for application appattempt_1479580915733_0167_000001
> > 2016-11-25 15:15:35,592 WARN [main] org.apache.hadoop.util.NativeCodeLoader:
> Unable to load native-hadoop library for your platform... using
> builtin-java classes where applicable
> > 2016-11-25 15:15:35,630 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> Executing with tokens:
> > 2016-11-25 15:15:35,956 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id {
> id: 167 cluster_timestamp: 1479580915733 } attemptId: 1 } keyId: 2128280969)
> > 2016-11-25 15:15:35,974 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> Using mapred newApiCommitter.
> > 2016-11-25 15:15:35,976 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> OutputCommitter set in config null
> > 2016-11-25 15:15:36,029 INFO [main] org.apache.hadoop.mapreduce.
> lib.output.FileOutputCommitter: File Output Committer Algorithm version
> is 1
> > 2016-11-25 15:15:36,029 INFO [main] org.apache.hadoop.mapreduce.
> lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup
> _temporary folders under output directory:false, ignore cleanup failures:
> false
> > 2016-11-25 15:15:36,692 WARN [main] org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory:
> The short-circuit local reads feature cannot be used because libhadoop
> cannot be loaded.
> > 2016-11-25 15:15:36,702 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> OutputCommitter is org.apache.hadoop.mapreduce.
> lib.output.FileOutputCommitter
> > 2016-11-25 15:15:36,891 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.jobhistory.EventType for
> class org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler
> > 2016-11-25 15:15:36,891 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.v2.app.job.event.JobEventType
> for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> JobEventDispatcher
> > 2016-11-25 15:15:36,892 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.v2.app.job.event.TaskEventType
> for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> TaskEventDispatcher
> > 2016-11-25 15:15:36,893 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.v2.app.job.event.TaskAttemptEventType
> for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> TaskAttemptEventDispatcher
> > 2016-11-25 15:15:36,893 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventType
> for class org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler
> > 2016-11-25 15:15:36,894 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.v2.app.speculate.Speculator$EventType
> for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> SpeculatorEventDispatcher
> > 2016-11-25 15:15:36,894 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.
> v2.app.rm.ContainerAllocator$EventType for class
> org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerAllocatorRouter
> > 2016-11-25 15:15:36,895 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncher$EventType
> for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> ContainerLauncherRouter
> > 2016-11-25 15:15:36,923 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils:
> Default file system is set solely by core-default.xml therefore -  ignoring
> > 2016-11-25 15:15:36,945 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils:
> Default file system is set solely by core-default.xml therefore -  ignoring
> > 2016-11-25 15:15:36,967 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils:
> Default file system is set solely by core-default.xml therefore -  ignoring
> > 2016-11-25 15:15:37,029 INFO [main] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Emitting job history data to the
> timeline server is not enabled
> > 2016-11-25 15:15:37,064 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher:
> Registering class org.apache.hadoop.mapreduce.v2.app.job.event.JobFinishEvent$Type
> for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> JobFinishEventHandler
> > 2016-11-25 15:15:37,204 WARN [main] org.apache.hadoop.metrics2.impl.MetricsConfig:
> Cannot locate configuration: tried hadoop-metrics2-mrappmaster.
> properties,hadoop-metrics2.properties
> > 2016-11-25 15:15:37,300 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl:
> Scheduled snapshot period at 10 second(s).
> > 2016-11-25 15:15:37,300 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl:
> MRAppMaster metrics system started
> > 2016-11-25 15:15:37,308 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> Adding job token for job_1479580915733_0167 to jobTokenSecretManager
> > 2016-11-25 15:15:37,452 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> Not uberizing job_1479580915733_0167 because: not enabled; too much RAM;
> > 2016-11-25 15:15:37,468 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> Input size for job job_1479580915733_0167 = 223589. Number of splits = 1
> > 2016-11-25 15:15:37,469 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> Number of reduces for job job_1479580915733_0167 = 1
> > 2016-11-25 15:15:37,469 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> job_1479580915733_0167Job Transitioned from NEW to INITED
> > 2016-11-25 15:15:37,470 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> MRAppMaster launching normal, non-uberized, multi-container job
> job_1479580915733_0167.
> > 2016-11-25 15:15:37,493 INFO [main] org.apache.hadoop.ipc.CallQueueManager:
> Using callQueue: class java.util.concurrent.LinkedBlockingQueue
> scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
> > 2016-11-25 15:15:37,506 INFO [Socket Reader #1 for port 60945]
> org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 60945
> > 2016-11-25 15:15:37,525 INFO [main] org.apache.hadoop.yarn.
> factories.impl.pb.RpcServerFactoryPBImpl: Adding protocol
> org.apache.hadoop.mapreduce.v2.api.MRClientProtocolPB to the server
> > 2016-11-25 15:15:37,527 INFO [IPC Server Responder]
> org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> > 2016-11-25 15:15:37,529 INFO [IPC Server listener on 60945]
> org.apache.hadoop.ipc.Server: IPC Server listener on 60945: starting
> > 2016-11-25 15:15:37,529 INFO [main] org.apache.hadoop.mapreduce.v2.app.client.MRClientService:
> Instantiated MRClientService at hadoopclusterslic73.ad.
> infosys.com/10.122.97.73:60945
> > 2016-11-25 15:15:37,614 INFO [main] org.mortbay.log: Logging to
> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> org.mortbay.log.Slf4jLog
> > 2016-11-25 15:15:37,623 INFO [main] org.apache.hadoop.security.
> authentication.server.AuthenticationFilter: Unable to initialize
> FileSignerSecretProvider, falling back to use random secrets.
> > 2016-11-25 15:15:37,628 WARN [main] org.apache.hadoop.http.HttpRequestLog:
> Jetty request log can only be enabled using Log4j
> > 2016-11-25 15:15:37,636 INFO [main] org.apache.hadoop.http.HttpServer2:
> Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$
> QuotingInputFilter)
> > 2016-11-25 15:15:37,688 INFO [main] org.apache.hadoop.http.HttpServer2:
> Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.
> server.webproxy.amfilter.AmIpFilter) to context mapreduce
> > 2016-11-25 15:15:37,688 INFO [main] org.apache.hadoop.http.HttpServer2:
> Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.
> server.webproxy.amfilter.AmIpFilter) to context static
> > 2016-11-25 15:15:37,691 INFO [main] org.apache.hadoop.http.HttpServer2:
> adding path spec: /mapreduce/*
> > 2016-11-25 15:15:37,692 INFO [main] org.apache.hadoop.http.HttpServer2:
> adding path spec: /ws/*
> > 2016-11-25 15:15:38,181 INFO [main] org.apache.hadoop.yarn.webapp.WebApps:
> Registered webapp guice modules
> > 2016-11-25 15:15:38,183 INFO [main] org.apache.hadoop.http.HttpServer2:
> Jetty bound to port 34311
> > 2016-11-25 15:15:38,183 INFO [main] org.mortbay.log: jetty-6.1.26.hwx
> > 2016-11-25 15:15:38,263 INFO [main] org.mortbay.log: Extract
> jar:file:/hadoop/yarn/local/filecache/16/mapreduce.tar.gz/
> hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.2.5.
> 0.0-1245.jar!/webapps/mapreduce to /hadoop/yarn/local/usercache/
> hdfs/appcache/application_1479580915733_0167/container_
> e125_1479580915733_0167_01_000001/tmp/Jetty_0_0_0_0_
> 34311_mapreduce____2ncvaf/webapp
> > 2016-11-25 15:15:39,882 INFO [main] org.mortbay.log: Started HttpServer2$
> SelectChannelConnectorWithSafeStartup@0.0.0.0:34311
> > 2016-11-25 15:15:39,882 INFO [main] org.apache.hadoop.yarn.webapp.WebApps:
> Web app mapreduce started at 34311
> > 2016-11-25 15:15:39,933 INFO [main] org.apache.hadoop.ipc.CallQueueManager:
> Using callQueue: class java.util.concurrent.LinkedBlockingQueue
> scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
> > 2016-11-25 15:15:39,936 INFO [Socket Reader #1 for port 57220]
> org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 57220
> > 2016-11-25 15:15:39,943 INFO [IPC Server listener on 57220]
> org.apache.hadoop.ipc.Server: IPC Server listener on 57220: starting
> > 2016-11-25 15:15:39,953 INFO [IPC Server Responder]
> org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerRequestor: nodeBlacklistingEnabled:true
> > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerRequestor: maxTaskFailuresPerNode is 3
> > 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerRequestor: blacklistDisablePercent is 33
> > 2016-11-25 15:15:40,082 WARN [main] org.apache.hadoop.ipc.Client: Failed
> to connect to server: hadoopclusterslic71.ad.infosys.com/10.122.97.71:8030:
> retries get failed due to exceeded maximum allowed retries number: 0
> > java.net.ConnectException: Connection refused
> >       at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> >       at sun.nio.ch.SocketChannelImpl.finishConnect(
> SocketChannelImpl.java:717)
> >       at org.apache.hadoop.net.SocketIOWithTimeout.connect(
> SocketIOWithTimeout.java:206)
> >       at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
> >       at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
> >       at org.apache.hadoop.ipc.Client$Connection.setupConnection(
> Client.java:650)
> >       at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(
> Client.java:745)
> >       at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.
> java:397)
> >       at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
> >       at org.apache.hadoop.ipc.Client.call(Client.java:1449)
> >       at org.apache.hadoop.ipc.Client.call(Client.java:1396)
> >       at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.
> invoke(ProtobufRpcEngine.java:233)
> >       at com.sun.proxy.$Proxy80.registerApplicationMaster(Unknown
> Source)
> >       at org.apache.hadoop.yarn.api.impl.pb.client.
> ApplicationMasterProtocolPBClientImpl.registerApplicationMaster(
> ApplicationMasterProtocolPBClientImpl.java:106)
> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >       at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:62)
> >       at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> >       at java.lang.reflect.Method.invoke(Method.java:497)
> >       at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(
> RetryInvocationHandler.java:278)
> >       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
> RetryInvocationHandler.java:194)
> >       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
> RetryInvocationHandler.java:176)
> >       at com.sun.proxy.$Proxy81.registerApplicationMaster(Unknown
> Source)
> >       at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.
> register(RMCommunicator.java:160)
> >       at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.
> serviceStart(RMCommunicator.java:121)
> >       at org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator.
> serviceStart(RMContainerAllocator.java:250)
> >       at org.apache.hadoop.service.AbstractService.start(
> AbstractService.java:193)
> >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$
> ContainerAllocatorRouter.serviceStart(MRAppMaster.java:881)
> >       at org.apache.hadoop.service.AbstractService.start(
> AbstractService.java:193)
> >       at org.apache.hadoop.service.CompositeService.serviceStart(
> CompositeService.java:120)
> >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.
> serviceStart(MRAppMaster.java:1151)
> >       at org.apache.hadoop.service.AbstractService.start(
> AbstractService.java:193)
> >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$5.run(
> MRAppMaster.java:1557)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:422)
> >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> UserGroupInformation.java:1724)
> >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.
> initAndStartAppMaster(MRAppMaster.java:1553)
> >       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(
> MRAppMaster.java:1486)
> > 2016-11-25 15:15:40,089 INFO [main] org.apache.hadoop.yarn.client.
> ConfiguredRMFailoverProxyProvider: Failing over to rm2
> > 2016-11-25 15:15:40,183 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator:
> maxContainerCapability: <memory:28672, vCores:3>
> > 2016-11-25 15:15:40,183 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator:
> queue: default
> > 2016-11-25 15:15:40,186 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.launcher.ContainerLauncherImpl: Upper limit on the thread pool
> size is 500
> > 2016-11-25 15:15:40,186 INFO [main] org.apache.hadoop.mapreduce.
> v2.app.launcher.ContainerLauncherImpl: The thread pool initial size is 10
> > 2016-11-25 15:15:40,189 INFO [main] org.apache.hadoop.yarn.client.
> api.impl.ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies
> : 0
> > 2016-11-25 15:15:40,202 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> job_1479580915733_0167Job Transitioned from INITED to SETUP
> > 2016-11-25 15:15:40,212 INFO [CommitterEvent Processor #0]
> org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> Processing the event EventType: JOB_SETUP
> > 2016-11-25 15:15:40,226 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> job_1479580915733_0167Job Transitioned from SETUP to RUNNING
> > 2016-11-25 15:15:40,291 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:15:40,328 INFO [eventHandlingThread]
> org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Event
> Writer setup for JobId: job_1479580915733_0167, File:
> hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_
> 0167/job_1479580915733_0167_1.jhist
> > 2016-11-25 15:15:40,351 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:15:40,357 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> task_1479580915733_0167_m_000000 Task Transitioned from NEW to SCHEDULED
> > 2016-11-25 15:15:40,358 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> task_1479580915733_0167_r_000000 Task Transitioned from NEW to SCHEDULED
> > 2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from NEW
> to UNASSIGNED
> > 2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from NEW
> to UNASSIGNED
> > 2016-11-25 15:15:40,401 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerAllocator: mapResourceRequest:<memory:3072, vCores:1>
> > 2016-11-25 15:15:40,416 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerAllocator: reduceResourceRequest:<memory:4096,
> vCores:1>
> > 2016-11-25 15:15:41,191 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:0 ContRel:0
> HostLocal:0 RackLocal:0
> > 2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> getResources() for application_1479580915733_0167: ask=4 release= 0
> newContainers=0 finishedContainers=0 resourcelimit=<memory:38912, vCores:1>
> knownNMs=2
> > 2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating
> schedule, headroom=<memory:38912, vCores:1>
> > 2016-11-25 15:15:41,226 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> start threshold not met. completedMapsForReduceSlowstart 1
> > 2016-11-25 15:15:42,235 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated
> containers 1
> > 2016-11-25 15:15:42,237 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> container container_e125_1479580915733_0167_01_000002 to
> attempt_1479580915733_0167_m_000000_0
> > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating
> schedule, headroom=<memory:34816, vCores:0>
> > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> start threshold not met. completedMapsForReduceSlowstart 1
> > 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:15:42,286 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:15:42,311 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The job-jar
> file on the remote FS is hdfs://SLICHDP/user/hdfs/.
> staging/job_1479580915733_0167/job.jar
> > 2016-11-25 15:15:42,315 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The job-conf
> file on the remote FS is /user/hdfs/.staging/job_
> 1479580915733_0167/job.xml
> > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Adding #0
> tokens and #1 secret keys for NM use for launching container
> > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Size of
> containertokens_dob is 1
> > 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Putting
> shuffle token in serviceData
> > 2016-11-25 15:15:42,441 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> UNASSIGNED to ASSIGNED
> > 2016-11-25 15:15:42,455 INFO [ContainerLauncher #0]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> container_e125_1479580915733_0167_01_000002 taskAttempt
> attempt_1479580915733_0167_m_000000_0
> > 2016-11-25 15:15:42,457 INFO [ContainerLauncher #0]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Launching attempt_1479580915733_0167_m_000000_0
> > 2016-11-25 15:15:42,457 INFO [ContainerLauncher #0]
> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
> > 2016-11-25 15:15:42,531 INFO [ContainerLauncher #0]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_0
> : 13562
> > 2016-11-25 15:15:42,533 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt:
> [attempt_1479580915733_0167_m_000000_0] using containerId:
> [container_e125_1479580915733_0167_01_000002 on NM: [
> hadoopclusterslic73.ad.infosys.com:45454]
> > 2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> ASSIGNED to RUNNING
> > 2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> task_1479580915733_0167_m_000000 Task Transitioned from SCHEDULED to
> RUNNING
> > 2016-11-25 15:15:43,241 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> getResources() for application_1479580915733_0167: ask=4 release= 0
> newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0>
> knownNMs=2
> > 2016-11-25 15:15:44,790 INFO [Socket Reader #1 for port 57220]
> SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> job_1479580915733_0167 (auth:SIMPLE)
> > 2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> jvm_1479580915733_0167_m_137438953472002 asked for a task
> > 2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> jvm_1479580915733_0167_m_137438953472002 given task:
> attempt_1479580915733_0167_m_000000_0
> > 2016-11-25 15:15:51,923 INFO [IPC Server handler 12 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt
> attempt_1479580915733_0167_m_000000_0 is : 0.667
> > 2016-11-25 15:15:52,099 INFO [IPC Server handler 5 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt
> attempt_1479580915733_0167_m_000000_0 is : 0.667
> > 2016-11-25 15:15:52,137 ERROR [IPC Server handler 12 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> attempt_1479580915733_0167_m_000000_0 - exited : java.io.IOException:
> Failed to build cube in mapper 0
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> cleanup(InMemCuboidMapper.java:145)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:422)
> >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> UserGroupInformation.java:1724)
> >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > Caused by: java.util.concurrent.ExecutionException:
> java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> java.lang.IllegalArgumentException: Value not exists!
> >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> cleanup(InMemCuboidMapper.java:143)
> >       ... 8 more
> > Caused by: java.lang.RuntimeException: java.io.IOException:
> java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> run(AbstractInMemCubeBuilder.java:82)
> >       at java.util.concurrent.Executors$RunnableAdapter.
> call(Executors.java:511)
> >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> ThreadPoolExecutor.java:1142)
> >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> ThreadPoolExecutor.java:617)
> >       at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException:
> Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.build(DoggedCubeBuilder.java:126)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> build(DoggedCubeBuilder.java:73)
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> run(AbstractInMemCubeBuilder.java:80)
> >       ... 5 more
> > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.abort(DoggedCubeBuilder.java:194)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.checkException(DoggedCubeBuilder.java:167)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.build(DoggedCubeBuilder.java:114)
> >       ... 7 more
> > Caused by: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> Dictionary.java:162)
> >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> TrieDictionary.java:167)
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> Dictionary.java:98)
> >       at org.apache.kylin.dimension.DictionaryDimEnc$
> DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> encodeColumnValue(CubeCodeSystem.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> encodeColumnValue(CubeCodeSystem.java:110)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> >       at org.apache.kylin.cube.inmemcubing.
> InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> .java:74)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> InputConverter$1.next(InMemCubeBuilder.java:542)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> InputConverter$1.next(InMemCubeBuilder.java:523)
> >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> GTAggregateScanner.java:139)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> createBaseCuboid(InMemCubeBuilder.java:339)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> build(InMemCubeBuilder.java:166)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> build(InMemCubeBuilder.java:135)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> SplitThread.run(DoggedCubeBuilder.java:282)
> >
> > 2016-11-25 15:15:52,138 INFO [IPC Server handler 12 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from
> attempt_1479580915733_0167_m_000000_0: Error: java.io.IOException: Failed
> to build cube in mapper 0
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:422)
> >       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
> >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
> >       ... 8 more
> > Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
> >       at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >       at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
> >       ... 5 more
> > Caused by: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
> >       ... 7 more
> > Caused by: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
> >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
> >       at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
> >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)
> >
> > 2016-11-25 15:15:52,141 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> report from attempt_1479580915733_0167_m_000000_0: Error:
> java.io.IOException: Failed to build cube in mapper 0
> >       [same stack trace as above: Caused by: java.lang.IllegalArgumentException: Value not exists!]
> >
> > 2016-11-25 15:15:52,142 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> RUNNING to FAIL_CONTAINER_CLEANUP
> > 2016-11-25 15:15:52,155 INFO [ContainerLauncher #1]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> container_e125_1479580915733_0167_01_000002 taskAttempt
> attempt_1479580915733_0167_m_000000_0
> > 2016-11-25 15:15:52,156 INFO [ContainerLauncher #1]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> KILLING attempt_1479580915733_0167_m_000000_0
> > 2016-11-25 15:15:52,156 INFO [ContainerLauncher #1]
> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
> > 2016-11-25 15:15:52,195 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > 2016-11-25 15:15:52,204 INFO [CommitterEvent Processor #1]
> org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> Processing the event EventType: TASK_ABORT
> > 2016-11-25 15:15:52,215 WARN [CommitterEvent Processor #1]
> org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-
> 4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_
> 1479580915733_0167_m_000000_0
> > 2016-11-25 15:15:52,218 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
> FAIL_TASK_CLEANUP to FAILED
> > 2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:15:52,226 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from NEW
> to UNASSIGNED
> > 2016-11-25 15:15:52,226 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerRequestor: 1 failures on node hadoopclusterslic73.ad.
> infosys.com
> > 2016-11-25 15:15:52,230 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_000000_1
> to list of failed maps
> > 2016-11-25 15:15:52,291 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> getResources() for application_1479580915733_0167: ask=1 release= 0
> newContainers=0 finishedContainers=1 resourcelimit=<memory:38912, vCores:1>
> knownNMs=2
> > 2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> completed container container_e125_1479580915733_0167_01_000002
> > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating
> schedule, headroom=<memory:38912, vCores:1>
> > 2016-11-25 15:15:52,300 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> report from attempt_1479580915733_0167_m_000000_0: Container killed by
> the ApplicationMaster.
> > Container killed on request. Exit code is 143
> > Container exited with a non-zero exit code 143
> >
> > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> start threshold not met. completedMapsForReduceSlowstart 1
> > 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated
> containers 1
> > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> container Container: [ContainerId: container_e125_1479580915733_0167_01_000003,
> NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> 10.122.97.72:45454 }, ] to fast fail map
> > 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from
> earlierFailedMaps
> > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> container container_e125_1479580915733_0167_01_000003 to
> attempt_1479580915733_0167_m_000000_1
> > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating
> schedule, headroom=<memory:34816, vCores:0>
> > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> start threshold not met. completedMapsForReduceSlowstart 1
> > 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> UNASSIGNED to ASSIGNED
> > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> container_e125_1479580915733_0167_01_000003 taskAttempt
> attempt_1479580915733_0167_m_000000_1
> > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Launching attempt_1479580915733_0167_m_000000_1
> > 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > 2016-11-25 15:15:53,318 INFO [ContainerLauncher #2]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_1
> : 13562
> > 2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt:
> [attempt_1479580915733_0167_m_000000_1] using containerId:
> [container_e125_1479580915733_0167_01_000003 on NM: [
> hadoopclusterslic72.ad.infosys.com:45454]
> > 2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> ASSIGNED to RUNNING
> > 2016-11-25 15:15:54,309 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> getResources() for application_1479580915733_0167: ask=1 release= 0
> newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0>
> knownNMs=2
> > 2016-11-25 15:15:55,797 INFO [Socket Reader #1 for port 57220]
> SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> job_1479580915733_0167 (auth:SIMPLE)
> > 2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> jvm_1479580915733_0167_m_137438953472003 asked for a task
> > 2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> jvm_1479580915733_0167_m_137438953472003 given task:
> attempt_1479580915733_0167_m_000000_1
> > 2016-11-25 15:16:02,857 INFO [IPC Server handler 8 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt
> attempt_1479580915733_0167_m_000000_1 is : 0.667
> > 2016-11-25 15:16:03,332 INFO [IPC Server handler 13 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt
> attempt_1479580915733_0167_m_000000_1 is : 0.667
> > 2016-11-25 15:16:03,347 ERROR [IPC Server handler 8 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> attempt_1479580915733_0167_m_000000_1 - exited : java.io.IOException:
> Failed to build cube in mapper 0
> >       [same stack trace as above: Caused by: java.lang.IllegalArgumentException: Value not exists!]
> >
> > 2016-11-25 15:16:03,347 INFO [IPC Server handler 8 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from
> attempt_1479580915733_0167_m_000000_1: Error: java.io.IOException: Failed
> to build cube in mapper 0
> >       [same stack trace as above: Caused by: java.lang.IllegalArgumentException: Value not exists!]
> >
> > 2016-11-25 15:16:03,349 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> report from attempt_1479580915733_0167_m_000000_1: Error:
> java.io.IOException: Failed to build cube in mapper 0
> >       [same stack trace as above: Caused by: java.lang.IllegalArgumentException: Value not exists!]
> >
> > 2016-11-25 15:16:03,350 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> RUNNING to FAIL_CONTAINER_CLEANUP
> > 2016-11-25 15:16:03,351 INFO [ContainerLauncher #3]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> container_e125_1479580915733_0167_01_000003 taskAttempt
> attempt_1479580915733_0167_m_000000_1
> > 2016-11-25 15:16:03,355 INFO [ContainerLauncher #3]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> KILLING attempt_1479580915733_0167_m_000000_1
> > 2016-11-25 15:16:03,355 INFO [ContainerLauncher #3]
> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > 2016-11-25 15:16:03,369 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > 2016-11-25 15:16:03,369 INFO [CommitterEvent Processor #2]
> org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> Processing the event EventType: TASK_ABORT
> > 2016-11-25 15:16:03,375 WARN [CommitterEvent Processor #2]
> org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-
> 4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_
> 1479580915733_0167_m_000000_1
> > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
> FAIL_TASK_CLEANUP to FAILED
> > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:16:03,376 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerRequestor: 1 failures on node hadoopclusterslic72.ad.
> infosys.com
> > 2016-11-25 15:16:03,376 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from NEW
> to UNASSIGNED
> > 2016-11-25 15:16:03,380 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_000000_2
> to list of failed maps
> > 2016-11-25 15:16:04,341 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> getResources() for application_1479580915733_0167: ask=1 release= 0
> newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0>
> knownNMs=2
> > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating
> schedule, headroom=<memory:34816, vCores:0>
> > 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> start threshold not met. completedMapsForReduceSlowstart 1
> > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> completed container container_e125_1479580915733_0167_01_000003
> > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated
> containers 1
> > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> container Container: [ContainerId: container_e125_1479580915733_0167_01_000004,
> NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> 10.122.97.72:45454 }, ] to fast fail map
> > 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from
> earlierFailedMaps
> > 2016-11-25 15:16:05,352 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> report from attempt_1479580915733_0167_m_000000_1: Container killed by
> the ApplicationMaster.
> > Container killed on request. Exit code is 143
> > Container exited with a non-zero exit code 143
> >
> > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> container container_e125_1479580915733_0167_01_000004 to
> attempt_1479580915733_0167_m_000000_2
> > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating
> schedule, headroom=<memory:34816, vCores:0>
> > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> start threshold not met. completedMapsForReduceSlowstart 1
> > 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:16:05,353 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:16:05,354 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> UNASSIGNED to ASSIGNED
> > 2016-11-25 15:16:05,356 INFO [ContainerLauncher #4]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> container_e125_1479580915733_0167_01_000004 taskAttempt
> attempt_1479580915733_0167_m_000000_2
> > 2016-11-25 15:16:05,356 INFO [ContainerLauncher #4]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Launching attempt_1479580915733_0167_m_000000_2
> > 2016-11-25 15:16:05,357 INFO [ContainerLauncher #4]
> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > 2016-11-25 15:16:05,371 INFO [ContainerLauncher #4]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_2
> : 13562
> > 2016-11-25 15:16:05,371 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt:
> [attempt_1479580915733_0167_m_000000_2] using containerId:
> [container_e125_1479580915733_0167_01_000004 on NM: [
> hadoopclusterslic72.ad.infosys.com:45454]
> > 2016-11-25 15:16:05,372 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> ASSIGNED to RUNNING
> > 2016-11-25 15:16:06,362 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> getResources() for application_1479580915733_0167: ask=1 release= 0
> newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0>
> knownNMs=2
> > 2016-11-25 15:16:07,537 INFO [Socket Reader #1 for port 57220]
> SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> job_1479580915733_0167 (auth:SIMPLE)
> > 2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> jvm_1479580915733_0167_m_137438953472004 asked for a task
> > 2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> jvm_1479580915733_0167_m_137438953472004 given task:
> attempt_1479580915733_0167_m_000000_2
> > 2016-11-25 15:16:14,753 INFO [IPC Server handler 6 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt
> attempt_1479580915733_0167_m_000000_2 is : 0.667
> > 2016-11-25 15:16:15,241 INFO [IPC Server handler 13 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt
> attempt_1479580915733_0167_m_000000_2 is : 0.667
> > 2016-11-25 15:16:15,258 ERROR [IPC Server handler 8 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> attempt_1479580915733_0167_m_000000_2 - exited : java.io.IOException:
> Failed to build cube in mapper 0
> >       [same stack trace as above: Caused by: java.lang.IllegalArgumentException: Value not exists!]
> >
> > 2016-11-25 15:16:15,258 INFO [IPC Server handler 8 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from
> attempt_1479580915733_0167_m_000000_2: Error: java.io.IOException: Failed
> to build cube in mapper 0
> >       [same stack trace as above: Caused by: java.lang.IllegalArgumentException: Value not exists!]
> >
> > 2016-11-25 15:16:15,261 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> report from attempt_1479580915733_0167_m_000000_2: Error:
> java.io.IOException: Failed to build cube in mapper 0
> >       [same stack trace as above: Caused by: java.lang.IllegalArgumentException: Value not exists!]
> >
> > 2016-11-25 15:16:15,273 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> RUNNING to FAIL_CONTAINER_CLEANUP
> > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> container_e125_1479580915733_0167_01_000004 taskAttempt
> attempt_1479580915733_0167_m_000000_2
> > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> KILLING attempt_1479580915733_0167_m_000000_2
> > 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > 2016-11-25 15:16:15,289 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > 2016-11-25 15:16:15,292 INFO [CommitterEvent Processor #3]
> org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> Processing the event EventType: TASK_ABORT
> > 2016-11-25 15:16:15,300 WARN [CommitterEvent Processor #3]
> org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-
> 4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_
> 1479580915733_0167_m_000000_2
> > 2016-11-25 15:16:15,300 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
> FAIL_TASK_CLEANUP to FAILED
> > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from NEW
> to UNASSIGNED
> > 2016-11-25 15:16:15,301 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerRequestor: 2 failures on node hadoopclusterslic72.ad.
> infosys.com
> > 2016-11-25 15:16:15,307 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_000000_3
> to list of failed maps
> > 2016-11-25 15:16:15,412 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
> Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> getResources() for application_1479580915733_0167: ask=1 release= 0
> newContainers=0 finishedContainers=1 resourcelimit=<memory:38912, vCores:1>
> knownNMs=2
> > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
> completed container container_e125_1479580915733_0167_01_000004
> > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating
> schedule, headroom=<memory:38912, vCores:1>
> > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> start threshold not met. completedMapsForReduceSlowstart 1
> > 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:16:15,421 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> report from attempt_1479580915733_0167_m_000000_2: Container killed by
> the ApplicationMaster.
> > Container killed on request. Exit code is 143
> > Container exited with a non-zero exit code 143
> >
> > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated
> containers 1
> > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
> container Container: [ContainerId: container_e125_1479580915733_0167_01_000005,
> NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
> hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
> vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
> 10.122.97.72:45454 }, ] to fast fail map
> > 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from
> earlierFailedMaps
> > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
> container container_e125_1479580915733_0167_01_000005 to
> attempt_1479580915733_0167_m_000000_3
> > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating
> schedule, headroom=<memory:34816, vCores:0>
> > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow
> start threshold not met. completedMapsForReduceSlowstart 1
> > 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
> Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0
> HostLocal:1 RackLocal:0
> > 2016-11-25 15:16:16,433 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.
> infosys.com to /default-rack
> > 2016-11-25 15:16:16,434 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> UNASSIGNED to ASSIGNED
> > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
> container_e125_1479580915733_0167_01_000005 taskAttempt
> attempt_1479580915733_0167_m_000000_3
> > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Launching attempt_1479580915733_0167_m_000000_3
> > 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > 2016-11-25 15:16:16,516 INFO [ContainerLauncher #6]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_3
> : 13562
> > 2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt:
> [attempt_1479580915733_0167_m_000000_3] using containerId:
> [container_e125_1479580915733_0167_01_000005 on NM: [
> hadoopclusterslic72.ad.infosys.com:45454]
> > 2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> ASSIGNED to RUNNING
> > 2016-11-25 15:16:17,436 INFO [RMCommunicator Allocator]
> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
> getResources() for application_1479580915733_0167: ask=1 release= 0
> newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0>
> knownNMs=2
> > 2016-11-25 15:16:19,664 INFO [Socket Reader #1 for port 57220]
> SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
> job_1479580915733_0167 (auth:SIMPLE)
> > 2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
> jvm_1479580915733_0167_m_137438953472005 asked for a task
> > 2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
> jvm_1479580915733_0167_m_137438953472005 given task:
> attempt_1479580915733_0167_m_000000_3
> > 2016-11-25 15:16:27,222 INFO [IPC Server handler 13 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt
> attempt_1479580915733_0167_m_000000_3 is : 0.667
> > 2016-11-25 15:16:27,952 INFO [IPC Server handler 7 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt
> attempt_1479580915733_0167_m_000000_3 is : 0.667
> > 2016-11-25 15:16:27,971 ERROR [IPC Server handler 11 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
> attempt_1479580915733_0167_m_000000_3 - exited : java.io.IOException:
> Failed to build cube in mapper 0
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> cleanup(InMemCuboidMapper.java:145)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:422)
> >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> UserGroupInformation.java:1724)
> >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > Caused by: java.util.concurrent.ExecutionException:
> java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> java.lang.IllegalArgumentException: Value not exists!
> >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> cleanup(InMemCuboidMapper.java:143)
> >       ... 8 more
> > Caused by: java.lang.RuntimeException: java.io.IOException:
> java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> run(AbstractInMemCubeBuilder.java:82)
> >       at java.util.concurrent.Executors$RunnableAdapter.
> call(Executors.java:511)
> >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> ThreadPoolExecutor.java:1142)
> >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> ThreadPoolExecutor.java:617)
> >       at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException:
> Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.build(DoggedCubeBuilder.java:126)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> build(DoggedCubeBuilder.java:73)
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> run(AbstractInMemCubeBuilder.java:80)
> >       ... 5 more
> > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.abort(DoggedCubeBuilder.java:194)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.checkException(DoggedCubeBuilder.java:167)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.build(DoggedCubeBuilder.java:114)
> >       ... 7 more
> > Caused by: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> Dictionary.java:162)
> >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> TrieDictionary.java:167)
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> Dictionary.java:98)
> >       at org.apache.kylin.dimension.DictionaryDimEnc$
> DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> encodeColumnValue(CubeCodeSystem.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> encodeColumnValue(CubeCodeSystem.java:110)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> >       at org.apache.kylin.cube.inmemcubing.
> InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> .java:74)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> InputConverter$1.next(InMemCubeBuilder.java:542)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> InputConverter$1.next(InMemCubeBuilder.java:523)
> >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> GTAggregateScanner.java:139)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> createBaseCuboid(InMemCubeBuilder.java:339)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> build(InMemCubeBuilder.java:166)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> build(InMemCubeBuilder.java:135)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> SplitThread.run(DoggedCubeBuilder.java:282)
> >
> > 2016-11-25 15:16:27,971 INFO [IPC Server handler 11 on 57220]
> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from
> attempt_1479580915733_0167_m_000000_3: Error: java.io.IOException: Failed
> to build cube in mapper 0
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> cleanup(InMemCuboidMapper.java:145)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:422)
> >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> UserGroupInformation.java:1724)
> >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > Caused by: java.util.concurrent.ExecutionException:
> java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> java.lang.IllegalArgumentException: Value not exists!
> >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> cleanup(InMemCuboidMapper.java:143)
> >       ... 8 more
> > Caused by: java.lang.RuntimeException: java.io.IOException:
> java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> run(AbstractInMemCubeBuilder.java:82)
> >       at java.util.concurrent.Executors$RunnableAdapter.
> call(Executors.java:511)
> >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> ThreadPoolExecutor.java:1142)
> >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> ThreadPoolExecutor.java:617)
> >       at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException:
> Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.build(DoggedCubeBuilder.java:126)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> build(DoggedCubeBuilder.java:73)
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> run(AbstractInMemCubeBuilder.java:80)
> >       ... 5 more
> > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.abort(DoggedCubeBuilder.java:194)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.checkException(DoggedCubeBuilder.java:167)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.build(DoggedCubeBuilder.java:114)
> >       ... 7 more
> > Caused by: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> Dictionary.java:162)
> >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> TrieDictionary.java:167)
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> Dictionary.java:98)
> >       at org.apache.kylin.dimension.DictionaryDimEnc$
> DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> encodeColumnValue(CubeCodeSystem.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> encodeColumnValue(CubeCodeSystem.java:110)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> >       at org.apache.kylin.cube.inmemcubing.
> InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> .java:74)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> InputConverter$1.next(InMemCubeBuilder.java:542)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> InputConverter$1.next(InMemCubeBuilder.java:523)
> >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> GTAggregateScanner.java:139)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> createBaseCuboid(InMemCubeBuilder.java:339)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> build(InMemCubeBuilder.java:166)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> build(InMemCubeBuilder.java:135)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> SplitThread.run(DoggedCubeBuilder.java:282)
> >
> > 2016-11-25 15:16:27,974 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics
> report from attempt_1479580915733_0167_m_000000_3: Error:
> java.io.IOException: Failed to build cube in mapper 0
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> cleanup(InMemCuboidMapper.java:145)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:422)
> >       at org.apache.hadoop.security.UserGroupInformation.doAs(
> UserGroupInformation.java:1724)
> >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > Caused by: java.util.concurrent.ExecutionException:
> java.lang.RuntimeException: java.io.IOException: java.io.IOException:
> java.lang.IllegalArgumentException: Value not exists!
> >       at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> >       at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> >       at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.
> cleanup(InMemCuboidMapper.java:143)
> >       ... 8 more
> > Caused by: java.lang.RuntimeException: java.io.IOException:
> java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> run(AbstractInMemCubeBuilder.java:82)
> >       at java.util.concurrent.Executors$RunnableAdapter.
> call(Executors.java:511)
> >       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >       at java.util.concurrent.ThreadPoolExecutor.runWorker(
> ThreadPoolExecutor.java:1142)
> >       at java.util.concurrent.ThreadPoolExecutor$Worker.run(
> ThreadPoolExecutor.java:617)
> >       at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException:
> Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.build(DoggedCubeBuilder.java:126)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.
> build(DoggedCubeBuilder.java:73)
> >       at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.
> run(AbstractInMemCubeBuilder.java:80)
> >       ... 5 more
> > Caused by: java.io.IOException: java.lang.IllegalArgumentException:
> Value not exists!
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.abort(DoggedCubeBuilder.java:194)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.checkException(DoggedCubeBuilder.java:167)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> BuildOnce.build(DoggedCubeBuilder.java:114)
> >       ... 7 more
> > Caused by: java.lang.IllegalArgumentException: Value not exists!
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(
> Dictionary.java:162)
> >       at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(
> TrieDictionary.java:167)
> >       at org.apache.kylin.common.util.Dictionary.getIdFromValue(
> Dictionary.java:98)
> >       at org.apache.kylin.dimension.DictionaryDimEnc$
> DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> encodeColumnValue(CubeCodeSystem.java:121)
> >       at org.apache.kylin.cube.gridtable.CubeCodeSystem.
> encodeColumnValue(CubeCodeSystem.java:110)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> >       at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> >       at org.apache.kylin.cube.inmemcubing.
> InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter
> .java:74)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> InputConverter$1.next(InMemCubeBuilder.java:542)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$
> InputConverter$1.next(InMemCubeBuilder.java:523)
> >       at org.apache.kylin.gridtable.GTAggregateScanner.iterator(
> GTAggregateScanner.java:139)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> createBaseCuboid(InMemCubeBuilder.java:339)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> build(InMemCubeBuilder.java:166)
> >       at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.
> build(InMemCubeBuilder.java:135)
> >       at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$
> SplitThread.run(DoggedCubeBuilder.java:282)
> >
> > 2016-11-25 15:16:27,975 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> RUNNING to FAIL_CONTAINER_CLEANUP
> > 2016-11-25 15:16:27,976 INFO [ContainerLauncher #7]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
> container_e125_1479580915733_0167_01_000005 taskAttempt
> attempt_1479580915733_0167_m_000000_3
> > 2016-11-25 15:16:27,997 INFO [ContainerLauncher #7]
> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
> KILLING attempt_1479580915733_0167_m_000000_3
> > 2016-11-25 15:16:27,997 INFO [ContainerLauncher #7]
> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
> Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> > 2016-11-25 15:16:28,009 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> > 2016-11-25 15:16:28,011 INFO [CommitterEvent Processor #4]
> org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> Processing the event EventType: TASK_ABORT
> > 2016-11-25 15:16:28,013 WARN [CommitterEvent Processor #4]
> org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
> delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-
> 4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_
> 1479580915733_0167_m_000000_3
> > 2016-11-25 15:16:28,014 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
> FAIL_TASK_CLEANUP to FAILED
> > 2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> task_1479580915733_0167_m_000000 Task Transitioned from RUNNING to FAILED
> > 2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Num completed Tasks:
> 1
> > 2016-11-25 15:16:28,027 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Job failed as tasks
> failed. failedMaps:1 failedReduces:0
> > 2016-11-25 15:16:28,027 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerRequestor: 3 failures on node hadoopclusterslic72.ad.
> infosys.com
> > 2016-11-25 15:16:28,027 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerRequestor: Blacklisted host hadoopclusterslic72.ad.
> infosys.com
> > 2016-11-25 15:16:28,032 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> job_1479580915733_0167Job Transitioned from RUNNING to FAIL_WAIT
> > 2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> task_1479580915733_0167_r_000000 Task Transitioned from SCHEDULED to
> KILL_WAIT
> > 2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
> attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from
> UNASSIGNED to KILLED
> > 2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
> task_1479580915733_0167_r_000000 Task Transitioned from KILL_WAIT to
> KILLED
> > 2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> job_1479580915733_0167Job Transitioned from FAIL_WAIT to FAIL_ABORT
> > 2016-11-25 15:16:28,037 INFO [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerAllocator: Processing the event EventType:
> CONTAINER_DEALLOCATE
> > 2016-11-25 15:16:28,037 ERROR [Thread-53] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerAllocator: Could not deallocate container for task
> attemptId attempt_1479580915733_0167_r_000000_0
> > 2016-11-25 15:16:28,043 INFO [CommitterEvent Processor #0]
> org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
> Processing the event EventType: JOB_ABORT
> > 2016-11-25 15:16:28,058 INFO [AsyncDispatcher event handler]
> org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
> job_1479580915733_0167Job Transitioned from FAIL_ABORT to FAILED
> > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> We are finishing cleanly so this is the last retry
> > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> Notify RMCommunicator isAMLastRetry: true
> > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator:
> RMCommunicator notified that shouldUnregistered is: true
> > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> Notify JHEH isAMLastRetry: true
> > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: JobHistoryEventHandler notified that
> forceJobCompletion is true
> > 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> Calling stop for all the services
> > 2016-11-25 15:16:28,093 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Stopping JobHistoryEventHandler. Size
> of the outstanding queue size is 2
> > 2016-11-25 15:16:28,097 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: In stop, writing event TASK_FAILED
> > 2016-11-25 15:16:28,099 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: In stop, writing event JOB_FAILED
> > 2016-11-25 15:16:28,177 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Copying hdfs://SLICHDP:8020/user/hdfs/
> .staging/job_1479580915733_0167/job_1479580915733_0167_1.jhist to
> hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
> > 2016-11-25 15:16:28,248 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Copied to done location:
> hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
> > 2016-11-25 15:16:28,253 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Copying hdfs://SLICHDP:8020/user/hdfs/
> .staging/job_1479580915733_0167/job_1479580915733_0167_1_conf.xml to
> hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_
> tmp
> > 2016-11-25 15:16:28,320 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Copied to done location:
> hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_
> tmp
> > 2016-11-25 15:16:28,338 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Moved tmp to done:
> hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167.summary_tmp
> to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167.summary
> > 2016-11-25 15:16:28,350 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Moved tmp to done:
> hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_tmp
> to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml
> > 2016-11-25 15:16:28,353 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Moved tmp to done:
> hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> 1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp to
> hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-
> 1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-
> 1480067188027-0-0-FAILED-default-1480067140199.jhist
> > 2016-11-25 15:16:28,353 INFO [Thread-74] org.apache.hadoop.mapreduce.
> jobhistory.JobHistoryEventHandler: Stopped JobHistoryEventHandler.
> super.stop()
> > 2016-11-25 15:16:28,357 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator:
> Setting job diagnostics to Task failed task_1479580915733_0167_m_000000
> > Job failed as tasks failed. failedMaps:1 failedReduces:0
> >
> > 2016-11-25 15:16:28,357 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator:
> History url is http://hadoopclusterslic73.ad.infosys.com:19888/jobhistory/
> job/job_1479580915733_0167
> > 2016-11-25 15:16:28,373 INFO
> [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: Waiting
> for application to be successfully unregistered.
> > 2016-11-25 15:16:29,375 INFO [Thread-74] org.apache.hadoop.mapreduce.
> v2.app.rm.RMContainerAllocator: Final Stats: PendingReds:1
> ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0
> CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:1
> RackLocal:0
> > 2016-11-25 15:16:29,377 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster:
> Deleting staging directory hdfs://SLICHDP /user/hdfs/.staging/job_
> 1479580915733_0167
> > 2016-11-25 15:16:29,380 INFO [Thread-74] org.apache.hadoop.ipc.Server:
> Stopping server on 57220
> > 2016-11-25 15:16:29,387 INFO [TaskHeartbeatHandler PingChecker]
> org.apache.hadoop.mapreduce.v2.app.TaskHeartbeatHandler:
> TaskHeartbeatHandler thread interrupted
> > 2016-11-25 15:16:29,387 INFO [IPC Server listener on 57220]
> org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 57220
> >
> >
> > On Fri, Nov 25, 2016 at 7:36 PM, ShaoFeng Shi <sh...@apache.org>
> > wrote:
> >
> >> Didn't hear of that. A Hive table's file format is transparent to Kylin;
> >> even if the table is a view, Kylin can build from it.
> >>
> >> What's the detailed error you got when using the ORC table? If you can
> >> provide the detailed information, that would be better.
> >>
> >> 2016-11-25 18:22 GMT+08:00 suresh m <su...@gmail.com>:
> >>
> >> > Hi Facing an issue where i can able to build cube with text format but
> >> > unable to building cube with ORC tables.
> >> >
> >> > Let me know kylin having any issues with ORC format.?
> >> >
> >> >  Hive having limitation that Text format tables not having possibility
> >> to
> >> > enabling ACID properties since text format not supporting ACID. But
> for
> >> me
> >> > ACID properties is important to handle my data, this i can do with ORC
> >> but
> >> > kylin throwing errors with ORC format.
> >> >
> >> >
> >> > Regards,
> >> > Suresh
> >> >
> >>
> >>
> >>
> >> --
> >> Best regards,
> >>
> >> Shaofeng Shi 史少锋
> >>
> >
> >
>
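
The quoted point above, that the file format is transparent and that Kylin
can even build from a view, can be sketched in HiveQL. This is only an
illustrative, hedged sketch: the names orc_alert_c, alert_c_view and
alert_c_snapshot are placeholders, not objects from this thread.

    -- Hypothetical names; adjust to the real schema before trying this.
    -- Option A: a plain view over the ACID ORC table; Kylin reads it through Hive.
    CREATE VIEW alert_c_view AS
    SELECT * FROM orc_alert_c;

    -- Option B: a non-transactional snapshot used as the cube source, so the
    -- data Kylin sees cannot change between build steps.
    DROP TABLE IF EXISTS alert_c_snapshot;
    CREATE TABLE alert_c_snapshot
    STORED AS ORC
    TBLPROPERTIES ('transactional'='false')
    AS SELECT * FROM orc_alert_c;

The snapshot variant trades freshness for a stable input, which matters if
the ACID table keeps receiving writes or compactions while a build is running.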



-- 
Best regards,

Shaofeng Shi 史少锋

Re: Is kylin not support Hive ORC tables with ACID properties.

Posted by suresh m <su...@gmail.com>.
Can someone look at the log and help me pin down the exact issue I am
facing with ORC formatted tables? Why am I unable to build the cube
successfully with ORC formatted tables?
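
The deepest cause in the log is Dictionary.getIdFromValueBytes throwing
"Value not exists!", which typically means the in-memory cube build hit a
dimension value that was not in the dictionary built in an earlier step. One
hedged way to check whether the ACID table's contents drift between those
steps is to snapshot the distinct values of the suspect dimension column
before the build and diff them afterwards; the names orc_alert_c and
alert_type below are placeholders, not taken from the thread.

    -- Hypothetical names; pick the dictionary-encoded dimension column.
    -- 1) Before triggering the cube build, snapshot the distinct values.
    DROP TABLE IF EXISTS alert_dim_before;
    CREATE TABLE alert_dim_before STORED AS ORC AS
    SELECT DISTINCT alert_type FROM orc_alert_c;

    -- 2) After the build fails, list values that were not in the snapshot.
    SELECT t.alert_type
    FROM orc_alert_c t
    LEFT JOIN alert_dim_before b ON t.alert_type = b.alert_type
    WHERE b.alert_type IS NULL
    GROUP BY t.alert_type;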

On Mon, Nov 28, 2016 at 10:47 AM, suresh m <su...@gmail.com> wrote:

> Please find detail as requested,
>
> Log Type: syslog
>
> Log Upload Time: Fri Nov 25 15:16:35 +0530 2016
>
> Log Length: 107891
>
> 2016-11-25 15:15:35,185 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for application appattempt_1479580915733_0167_000001
> 2016-11-25 15:15:35,592 WARN [main] org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2016-11-25 15:15:35,630 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
> 2016-11-25 15:15:35,956 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id { id: 167 cluster_timestamp: 1479580915733 } attemptId: 1 } keyId: 2128280969)
> 2016-11-25 15:15:35,974 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred newApiCommitter.
> 2016-11-25 15:15:35,976 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in config null
> 2016-11-25 15:15:36,029 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
> 2016-11-25 15:15:36,029 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
> 2016-11-25 15:15:36,692 WARN [main] org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
> 2016-11-25 15:15:36,702 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
> 2016-11-25 15:15:36,891 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.jobhistory.EventType for class org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler
> 2016-11-25 15:15:36,891 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.job.event.JobEventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobEventDispatcher
> 2016-11-25 15:15:36,892 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.job.event.TaskEventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$TaskEventDispatcher
> 2016-11-25 15:15:36,893 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.job.event.TaskAttemptEventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$TaskAttemptEventDispatcher
> 2016-11-25 15:15:36,893 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventType for class org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler
> 2016-11-25 15:15:36,894 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.speculate.Speculator$EventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$SpeculatorEventDispatcher
> 2016-11-25 15:15:36,894 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.rm.ContainerAllocator$EventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerAllocatorRouter
> 2016-11-25 15:15:36,895 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncher$EventType for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerLauncherRouter
> 2016-11-25 15:15:36,923 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system is set solely by core-default.xml therefore -  ignoring
> 2016-11-25 15:15:36,945 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system is set solely by core-default.xml therefore -  ignoring
> 2016-11-25 15:15:36,967 INFO [main] org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system is set solely by core-default.xml therefore -  ignoring
> 2016-11-25 15:15:37,029 INFO [main] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Emitting job history data to the timeline server is not enabled
> 2016-11-25 15:15:37,064 INFO [main] org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class org.apache.hadoop.mapreduce.v2.app.job.event.JobFinishEvent$Type for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobFinishEventHandler
> 2016-11-25 15:15:37,204 WARN [main] org.apache.hadoop.metrics2.impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-mrappmaster.properties,hadoop-metrics2.properties
> 2016-11-25 15:15:37,300 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
> 2016-11-25 15:15:37,300 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MRAppMaster metrics system started
> 2016-11-25 15:15:37,308 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Adding job token for job_1479580915733_0167 to jobTokenSecretManager
> 2016-11-25 15:15:37,452 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Not uberizing job_1479580915733_0167 because: not enabled; too much RAM;
> 2016-11-25 15:15:37,468 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Input size for job job_1479580915733_0167 = 223589. Number of splits = 1
> 2016-11-25 15:15:37,469 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Number of reduces for job job_1479580915733_0167 = 1
> 2016-11-25 15:15:37,469 INFO [main] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1479580915733_0167Job Transitioned from NEW to INITED
> 2016-11-25 15:15:37,470 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: MRAppMaster launching normal, non-uberized, multi-container job job_1479580915733_0167.
> 2016-11-25 15:15:37,493 INFO [main] org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
> 2016-11-25 15:15:37,506 INFO [Socket Reader #1 for port 60945] org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 60945
> 2016-11-25 15:15:37,525 INFO [main] org.apache.hadoop.yarn.factories.impl.pb.RpcServerFactoryPBImpl: Adding protocol org.apache.hadoop.mapreduce.v2.api.MRClientProtocolPB to the server
> 2016-11-25 15:15:37,527 INFO [IPC Server Responder] org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> 2016-11-25 15:15:37,529 INFO [IPC Server listener on 60945] org.apache.hadoop.ipc.Server: IPC Server listener on 60945: starting
> 2016-11-25 15:15:37,529 INFO [main] org.apache.hadoop.mapreduce.v2.app.client.MRClientService: Instantiated MRClientService at hadoopclusterslic73.ad.infosys.com/10.122.97.73:60945
> 2016-11-25 15:15:37,614 INFO [main] org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
> 2016-11-25 15:15:37,623 INFO [main] org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
> 2016-11-25 15:15:37,628 WARN [main] org.apache.hadoop.http.HttpRequestLog: Jetty request log can only be enabled using Log4j
> 2016-11-25 15:15:37,636 INFO [main] org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
> 2016-11-25 15:15:37,688 INFO [main] org.apache.hadoop.http.HttpServer2: Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter) to context mapreduce
> 2016-11-25 15:15:37,688 INFO [main] org.apache.hadoop.http.HttpServer2: Added filter AM_PROXY_FILTER (class=org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter) to context static
> 2016-11-25 15:15:37,691 INFO [main] org.apache.hadoop.http.HttpServer2: adding path spec: /mapreduce/*
> 2016-11-25 15:15:37,692 INFO [main] org.apache.hadoop.http.HttpServer2: adding path spec: /ws/*
> 2016-11-25 15:15:38,181 INFO [main] org.apache.hadoop.yarn.webapp.WebApps: Registered webapp guice modules
> 2016-11-25 15:15:38,183 INFO [main] org.apache.hadoop.http.HttpServer2: Jetty bound to port 34311
> 2016-11-25 15:15:38,183 INFO [main] org.mortbay.log: jetty-6.1.26.hwx
> 2016-11-25 15:15:38,263 INFO [main] org.mortbay.log: Extract jar:file:/hadoop/yarn/local/filecache/16/mapreduce.tar.gz/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.2.5.0.0-1245.jar!/webapps/mapreduce to /hadoop/yarn/local/usercache/hdfs/appcache/application_1479580915733_0167/container_e125_1479580915733_0167_01_000001/tmp/Jetty_0_0_0_0_34311_mapreduce____2ncvaf/webapp
> 2016-11-25 15:15:39,882 INFO [main] org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:34311
> 2016-11-25 15:15:39,882 INFO [main] org.apache.hadoop.yarn.webapp.WebApps: Web app mapreduce started at 34311
> 2016-11-25 15:15:39,933 INFO [main] org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
> 2016-11-25 15:15:39,936 INFO [Socket Reader #1 for port 57220] org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 57220
> 2016-11-25 15:15:39,943 INFO [IPC Server listener on 57220] org.apache.hadoop.ipc.Server: IPC Server listener on 57220: starting
> 2016-11-25 15:15:39,953 INFO [IPC Server Responder] org.apache.hadoop.ipc.Server: IPC Server Responder: starting
> 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: nodeBlacklistingEnabled:true
> 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: maxTaskFailuresPerNode is 3
> 2016-11-25 15:15:39,978 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: blacklistDisablePercent is 33
> 2016-11-25 15:15:40,082 WARN [main] org.apache.hadoop.ipc.Client: Failed to connect to server: hadoopclusterslic71.ad.infosys.com/10.122.97.71:8030: retries get failed due to exceeded maximum allowed retries number: 0
> java.net.ConnectException: Connection refused
> 	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> 	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
> 	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
> 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
> 	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:650)
> 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:745)
> 	at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
> 	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1449)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1396)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
> 	at com.sun.proxy.$Proxy80.registerApplicationMaster(Unknown Source)
> 	at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationMasterProtocolPBClientImpl.registerApplicationMaster(ApplicationMasterProtocolPBClientImpl.java:106)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
> 	at com.sun.proxy.$Proxy81.registerApplicationMaster(Unknown Source)
> 	at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.register(RMCommunicator.java:160)
> 	at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.serviceStart(RMCommunicator.java:121)
> 	at org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator.serviceStart(RMContainerAllocator.java:250)
> 	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
> 	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerAllocatorRouter.serviceStart(MRAppMaster.java:881)
> 	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
> 	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
> 	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceStart(MRAppMaster.java:1151)
> 	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
> 	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$5.run(MRAppMaster.java:1557)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
> 	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1553)
> 	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1486)
> 2016-11-25 15:15:40,089 INFO [main] org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider: Failing over to rm2
> 2016-11-25 15:15:40,183 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: maxContainerCapability: <memory:28672, vCores:3>
> 2016-11-25 15:15:40,183 INFO [main] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: queue: default
> 2016-11-25 15:15:40,186 INFO [main] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Upper limit on the thread pool size is 500
> 2016-11-25 15:15:40,186 INFO [main] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: The thread pool initial size is 10
> 2016-11-25 15:15:40,189 INFO [main] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
> 2016-11-25 15:15:40,202 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1479580915733_0167Job Transitioned from INITED to SETUP
> 2016-11-25 15:15:40,212 INFO [CommitterEvent Processor #0] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: JOB_SETUP
> 2016-11-25 15:15:40,226 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1479580915733_0167Job Transitioned from SETUP to RUNNING
> 2016-11-25 15:15:40,291 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> 2016-11-25 15:15:40,328 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Event Writer setup for JobId: job_1479580915733_0167, File: hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_0167/job_1479580915733_0167_1.jhist
> 2016-11-25 15:15:40,351 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.infosys.com to /default-rack
> 2016-11-25 15:15:40,357 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1479580915733_0167_m_000000 Task Transitioned from NEW to SCHEDULED
> 2016-11-25 15:15:40,358 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1479580915733_0167_r_000000 Task Transitioned from NEW to SCHEDULED
> 2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from NEW to UNASSIGNED
> 2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from NEW to UNASSIGNED
> 2016-11-25 15:15:40,401 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: mapResourceRequest:<memory:3072, vCores:1>
> 2016-11-25 15:15:40,416 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: reduceResourceRequest:<memory:4096, vCores:1>
> 2016-11-25 15:15:41,191 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:0 ContRel:0 HostLocal:0 RackLocal:0
> 2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1479580915733_0167: ask=4 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:38912, vCores:1> knownNMs=2
> 2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:38912, vCores:1>
> 2016-11-25 15:15:41,226 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
> 2016-11-25 15:15:42,235 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
> 2016-11-25 15:15:42,237 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_e125_1479580915733_0167_01_000002 to attempt_1479580915733_0167_m_000000_0
> 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:34816, vCores:0>
> 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
> 2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:15:42,286 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.infosys.com to /default-rack
> 2016-11-25 15:15:42,311 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The job-jar file on the remote FS is hdfs://SLICHDP/user/hdfs/.staging/job_1479580915733_0167/job.jar
> 2016-11-25 15:15:42,315 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The job-conf file on the remote FS is /user/hdfs/.staging/job_1479580915733_0167/job.xml
> 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Adding #0 tokens and #1 secret keys for NM use for launching container
> 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Size of containertokens_dob is 1
> 2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Putting shuffle token in serviceData
> 2016-11-25 15:15:42,441 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
> 2016-11-25 15:15:42,455 INFO [ContainerLauncher #0] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_e125_1479580915733_0167_01_000002 taskAttempt attempt_1479580915733_0167_m_000000_0
> 2016-11-25 15:15:42,457 INFO [ContainerLauncher #0] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1479580915733_0167_m_000000_0
> 2016-11-25 15:15:42,457 INFO [ContainerLauncher #0] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
> 2016-11-25 15:15:42,531 INFO [ContainerLauncher #0] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_0 : 13562
> 2016-11-25 15:15:42,533 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1479580915733_0167_m_000000_0] using containerId: [container_e125_1479580915733_0167_01_000002 on NM: [hadoopclusterslic73.ad.infosys.com:45454]
> 2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from ASSIGNED to RUNNING
> 2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1479580915733_0167_m_000000 Task Transitioned from SCHEDULED to RUNNING
> 2016-11-25 15:15:43,241 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1479580915733_0167: ask=4 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0> knownNMs=2
> 2016-11-25 15:15:44,790 INFO [Socket Reader #1 for port 57220] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1479580915733_0167 (auth:SIMPLE)
> 2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1479580915733_0167_m_137438953472002 asked for a task
> 2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1479580915733_0167_m_137438953472002 given task: attempt_1479580915733_0167_m_000000_0
> 2016-11-25 15:15:51,923 INFO [IPC Server handler 12 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1479580915733_0167_m_000000_0 is : 0.667
> 2016-11-25 15:15:52,099 INFO [IPC Server handler 5 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1479580915733_0167_m_000000_0 is : 0.667
> 2016-11-25 15:15:52,137 ERROR [IPC Server handler 12 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1479580915733_0167_m_000000_0 - exited : java.io.IOException: Failed to build cube in mapper 0
> 	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
> 	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> 	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
> 	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> 	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> 	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
> 	... 8 more
> Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
> 	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
> 	... 5 more
> Caused by: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
> 	... 7 more
> Caused by: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
> 	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
> 	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
> 	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> 	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
> 	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
> 	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> 	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
> 	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)
>
> 2016-11-25 15:15:52,138 INFO [IPC Server handler 12 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_0: Error: java.io.IOException: Failed to build cube in mapper 0
> 	[stack trace identical to the one above, omitted]
>
> 2016-11-25 15:15:52,141 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_0: Error: java.io.IOException: Failed to build cube in mapper 0
> 	[stack trace identical to the one above, omitted]
>
> 2016-11-25 15:15:52,142 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from RUNNING to FAIL_CONTAINER_CLEANUP
> 2016-11-25 15:15:52,155 INFO [ContainerLauncher #1] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container container_e125_1479580915733_0167_01_000002 taskAttempt attempt_1479580915733_0167_m_000000_0
> 2016-11-25 15:15:52,156 INFO [ContainerLauncher #1] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: KILLING attempt_1479580915733_0167_m_000000_0
> 2016-11-25 15:15:52,156 INFO [ContainerLauncher #1] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
> 2016-11-25 15:15:52,195 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> 2016-11-25 15:15:52,204 INFO [CommitterEvent Processor #1] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: TASK_ABORT
> 2016-11-25 15:15:52,215 WARN [CommitterEvent Processor #1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_0
> 2016-11-25 15:15:52,218 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from FAIL_TASK_CLEANUP to FAILED
> 2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> 2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.infosys.com to /default-rack
> 2016-11-25 15:15:52,226 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from NEW to UNASSIGNED
> 2016-11-25 15:15:52,226 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 1 failures on node hadoopclusterslic73.ad.infosys.com
> 2016-11-25 15:15:52,230 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_000000_1 to list of failed maps
> 2016-11-25 15:15:52,291 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1479580915733_0167: ask=1 release= 0 newContainers=0 finishedContainers=1 resourcelimit=<memory:38912, vCores:1> knownNMs=2
> 2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received completed container container_e125_1479580915733_0167_01_000002
> 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:38912, vCores:1>
> 2016-11-25 15:15:52,300 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_0: Container killed by the ApplicationMaster.
> Container killed on request. Exit code is 143
> Container exited with a non-zero exit code 143
>
> 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
> 2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
> 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning container Container: [ContainerId: container_e125_1479580915733_0167_01_000003, NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress: hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096, vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service: 10.122.97.72:45454 }, ] to fast fail map
> 2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from earlierFailedMaps
> 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_e125_1479580915733_0167_01_000003 to attempt_1479580915733_0167_m_000000_1
> 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:34816, vCores:0>
> 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
> 2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> 2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
> 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_e125_1479580915733_0167_01_000003 taskAttempt attempt_1479580915733_0167_m_000000_1
> 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1479580915733_0167_m_000000_1
> 2016-11-25 15:15:53,305 INFO [ContainerLauncher #2] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> 2016-11-25 15:15:53,318 INFO [ContainerLauncher #2] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_1 : 13562
> 2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1479580915733_0167_m_000000_1] using containerId: [container_e125_1479580915733_0167_01_000003 on NM: [hadoopclusterslic72.ad.infosys.com:45454]
> 2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from ASSIGNED to RUNNING
> 2016-11-25 15:15:54,309 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1479580915733_0167: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0> knownNMs=2
> 2016-11-25 15:15:55,797 INFO [Socket Reader #1 for port 57220] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1479580915733_0167 (auth:SIMPLE)
> 2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1479580915733_0167_m_137438953472003 asked for a task
> 2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1479580915733_0167_m_137438953472003 given task: attempt_1479580915733_0167_m_000000_1
> 2016-11-25 15:16:02,857 INFO [IPC Server handler 8 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1479580915733_0167_m_000000_1 is : 0.667
> 2016-11-25 15:16:03,332 INFO [IPC Server handler 13 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1479580915733_0167_m_000000_1 is : 0.667
> 2016-11-25 15:16:03,347 ERROR [IPC Server handler 8 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1479580915733_0167_m_000000_1 - exited : java.io.IOException: Failed to build cube in mapper 0
> 	[stack trace identical to the one reported for attempt_1479580915733_0167_m_000000_0 above, omitted]
>
> 2016-11-25 15:16:03,347 INFO [IPC Server handler 8 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_1: Error: java.io.IOException: Failed to build cube in mapper 0
> 	[stack trace identical to the one above, omitted]
>
> 2016-11-25 15:16:03,349 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_1: Error: java.io.IOException: Failed to build cube in mapper 0
> 	[stack trace identical to the one above, omitted]
>
> 2016-11-25 15:16:03,350 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from RUNNING to FAIL_CONTAINER_CLEANUP
> 2016-11-25 15:16:03,351 INFO [ContainerLauncher #3] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container container_e125_1479580915733_0167_01_000003 taskAttempt attempt_1479580915733_0167_m_000000_1
> 2016-11-25 15:16:03,355 INFO [ContainerLauncher #3] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: KILLING attempt_1479580915733_0167_m_000000_1
> 2016-11-25 15:16:03,355 INFO [ContainerLauncher #3] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> 2016-11-25 15:16:03,369 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> 2016-11-25 15:16:03,369 INFO [CommitterEvent Processor #2] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: TASK_ABORT
> 2016-11-25 15:16:03,375 WARN [CommitterEvent Processor #2] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_1
> 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from FAIL_TASK_CLEANUP to FAILED
> 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> 2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.infosys.com to /default-rack
> 2016-11-25 15:16:03,376 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 1 failures on node hadoopclusterslic72.ad.infosys.com
> 2016-11-25 15:16:03,376 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from NEW to UNASSIGNED
> 2016-11-25 15:16:03,380 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_000000_2 to list of failed maps
> 2016-11-25 15:16:04,341 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:2 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1479580915733_0167: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0> knownNMs=2
> 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:34816, vCores:0>
> 2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
> 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received completed container container_e125_1479580915733_0167_01_000003
> 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
> 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning container Container: [ContainerId: container_e125_1479580915733_0167_01_000004, NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress: hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096, vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service: 10.122.97.72:45454 }, ] to fast fail map
> 2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from earlierFailedMaps
> 2016-11-25 15:16:05,352 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_1: Container killed by the ApplicationMaster.
> Container killed on request. Exit code is 143
> Container exited with a non-zero exit code 143
>
> 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_e125_1479580915733_0167_01_000004 to attempt_1479580915733_0167_m_000000_2
> 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:34816, vCores:0>
> 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
> 2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:16:05,353 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> 2016-11-25 15:16:05,354 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
> 2016-11-25 15:16:05,356 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_e125_1479580915733_0167_01_000004 taskAttempt attempt_1479580915733_0167_m_000000_2
> 2016-11-25 15:16:05,356 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1479580915733_0167_m_000000_2
> 2016-11-25 15:16:05,357 INFO [ContainerLauncher #4] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> 2016-11-25 15:16:05,371 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_2 : 13562
> 2016-11-25 15:16:05,371 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1479580915733_0167_m_000000_2] using containerId: [container_e125_1479580915733_0167_01_000004 on NM: [hadoopclusterslic72.ad.infosys.com:45454]
> 2016-11-25 15:16:05,372 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from ASSIGNED to RUNNING
> 2016-11-25 15:16:06,362 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1479580915733_0167: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0> knownNMs=2
> 2016-11-25 15:16:07,537 INFO [Socket Reader #1 for port 57220] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1479580915733_0167 (auth:SIMPLE)
> 2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1479580915733_0167_m_137438953472004 asked for a task
> 2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1479580915733_0167_m_137438953472004 given task: attempt_1479580915733_0167_m_000000_2
> 2016-11-25 15:16:14,753 INFO [IPC Server handler 6 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1479580915733_0167_m_000000_2 is : 0.667
> 2016-11-25 15:16:15,241 INFO [IPC Server handler 13 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1479580915733_0167_m_000000_2 is : 0.667
> 2016-11-25 15:16:15,258 ERROR [IPC Server handler 8 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1479580915733_0167_m_000000_2 - exited : java.io.IOException: Failed to build cube in mapper 0
> 	[stack trace identical to the one reported for attempt_1479580915733_0167_m_000000_0 above, omitted]
>
> 2016-11-25 15:16:15,258 INFO [IPC Server handler 8 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_2: Error: java.io.IOException: Failed to build cube in mapper 0
> 	[stack trace identical to the one above, omitted]
>
> 2016-11-25 15:16:15,261 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_2: Error: java.io.IOException: Failed to build cube in mapper 0
> 	[stack trace identical to the one above, omitted]
>
> 2016-11-25 15:16:15,273 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from RUNNING to FAIL_CONTAINER_CLEANUP
> 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container container_e125_1479580915733_0167_01_000004 taskAttempt attempt_1479580915733_0167_m_000000_2
> 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: KILLING attempt_1479580915733_0167_m_000000_2
> 2016-11-25 15:16:15,274 INFO [ContainerLauncher #5] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> 2016-11-25 15:16:15,289 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> 2016-11-25 15:16:15,292 INFO [CommitterEvent Processor #3] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: TASK_ABORT
> 2016-11-25 15:16:15,300 WARN [CommitterEvent Processor #3] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_2
> 2016-11-25 15:16:15,300 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from FAIL_TASK_CLEANUP to FAILED
> 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic73.ad.infosys.com to /default-rack
> 2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from NEW to UNASSIGNED
> 2016-11-25 15:16:15,301 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 2 failures on node hadoopclusterslic72.ad.infosys.com
> 2016-11-25 15:16:15,307 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added attempt_1479580915733_0167_m_000000_3 to list of failed maps
> 2016-11-25 15:16:15,412 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1479580915733_0167: ask=1 release= 0 newContainers=0 finishedContainers=1 resourcelimit=<memory:38912, vCores:1> knownNMs=2
> 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received completed container container_e125_1479580915733_0167_01_000004
> 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:38912, vCores:1>
> 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
> 2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:0 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:16:15,421 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_2: Container killed by the ApplicationMaster.
> Container killed on request. Exit code is 143
> Container exited with a non-zero exit code 143
>
> 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
> 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning container Container: [ContainerId: container_e125_1479580915733_0167_01_000005, NodeId: hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress: hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096, vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service: 10.122.97.72:45454 }, ] to fast fail map
> 2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from earlierFailedMaps
> 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_e125_1479580915733_0167_01_000005 to attempt_1479580915733_0167_m_000000_3
> 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:34816, vCores:0>
> 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
> 2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:16:16,433 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved hadoopclusterslic72.ad.infosys.com to /default-rack
> 2016-11-25 15:16:16,434 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
> 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_e125_1479580915733_0167_01_000005 taskAttempt attempt_1479580915733_0167_m_000000_3
> 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1479580915733_0167_m_000000_3
> 2016-11-25 15:16:16,436 INFO [ContainerLauncher #6] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> 2016-11-25 15:16:16,516 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1479580915733_0167_m_000000_3 : 13562
> 2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1479580915733_0167_m_000000_3] using containerId: [container_e125_1479580915733_0167_01_000005 on NM: [hadoopclusterslic72.ad.infosys.com:45454]
> 2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from ASSIGNED to RUNNING
> 2016-11-25 15:16:17,436 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1479580915733_0167: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:34816, vCores:0> knownNMs=2
> 2016-11-25 15:16:19,664 INFO [Socket Reader #1 for port 57220] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1479580915733_0167 (auth:SIMPLE)
> 2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1479580915733_0167_m_137438953472005 asked for a task
> 2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1479580915733_0167_m_137438953472005 given task: attempt_1479580915733_0167_m_000000_3
> 2016-11-25 15:16:27,222 INFO [IPC Server handler 13 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1479580915733_0167_m_000000_3 is : 0.667
> 2016-11-25 15:16:27,952 INFO [IPC Server handler 7 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1479580915733_0167_m_000000_3 is : 0.667
> 2016-11-25 15:16:27,971 ERROR [IPC Server handler 11 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1479580915733_0167_m_000000_3 - exited : java.io.IOException: Failed to build cube in mapper 0
> 	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
> 	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> 	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
> 	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> 	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> 	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
> 	... 8 more
> Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
> 	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
> 	... 5 more
> Caused by: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
> 	... 7 more
> Caused by: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
> 	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
> 	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
> 	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> 	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
> 	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
> 	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> 	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
> 	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)
>
> 2016-11-25 15:16:27,971 INFO [IPC Server handler 11 on 57220] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_3: Error: java.io.IOException: Failed to build cube in mapper 0
> 	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
> 	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> 	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
> 	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> 	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> 	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
> 	... 8 more
> Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
> 	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
> 	... 5 more
> Caused by: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
> 	... 7 more
> Caused by: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
> 	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
> 	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
> 	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> 	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
> 	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
> 	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> 	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
> 	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)
>
> 2016-11-25 15:16:27,974 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1479580915733_0167_m_000000_3: Error: java.io.IOException: Failed to build cube in mapper 0
> 	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
> 	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> 	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
> 	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> 	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> 	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
> 	... 8 more
> Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
> 	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
> 	... 5 more
> Caused by: java.io.IOException: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
> 	... 7 more
> Caused by: java.lang.IllegalArgumentException: Value not exists!
> 	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
> 	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
> 	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
> 	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
> 	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
> 	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
> 	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
> 	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
> 	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
> 	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
> 	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)
>
> 2016-11-25 15:16:27,975 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from RUNNING to FAIL_CONTAINER_CLEANUP
> 2016-11-25 15:16:27,976 INFO [ContainerLauncher #7] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container container_e125_1479580915733_0167_01_000005 taskAttempt attempt_1479580915733_0167_m_000000_3
> 2016-11-25 15:16:27,997 INFO [ContainerLauncher #7] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: KILLING attempt_1479580915733_0167_m_000000_3
> 2016-11-25 15:16:27,997 INFO [ContainerLauncher #7] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
> 2016-11-25 15:16:28,009 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
> 2016-11-25 15:16:28,011 INFO [CommitterEvent Processor #4] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: TASK_ABORT
> 2016-11-25 15:16:28,013 WARN [CommitterEvent Processor #4] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_3
> 2016-11-25 15:16:28,014 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from FAIL_TASK_CLEANUP to FAILED
> 2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1479580915733_0167_m_000000 Task Transitioned from RUNNING to FAILED
> 2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Num completed Tasks: 1
> 2016-11-25 15:16:28,027 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Job failed as tasks failed. failedMaps:1 failedReduces:0
> 2016-11-25 15:16:28,027 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 3 failures on node hadoopclusterslic72.ad.infosys.com
> 2016-11-25 15:16:28,027 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: Blacklisted host hadoopclusterslic72.ad.infosys.com
> 2016-11-25 15:16:28,032 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1479580915733_0167Job Transitioned from RUNNING to FAIL_WAIT
> 2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1479580915733_0167_r_000000 Task Transitioned from SCHEDULED to KILL_WAIT
> 2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from UNASSIGNED to KILLED
> 2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl: task_1479580915733_0167_r_000000 Task Transitioned from KILL_WAIT to KILLED
> 2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1479580915733_0167Job Transitioned from FAIL_WAIT to FAIL_ABORT
> 2016-11-25 15:16:28,037 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Processing the event EventType: CONTAINER_DEALLOCATE
> 2016-11-25 15:16:28,037 ERROR [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Could not deallocate container for task attemptId attempt_1479580915733_0167_r_000000_0
> 2016-11-25 15:16:28,043 INFO [CommitterEvent Processor #0] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: JOB_ABORT
> 2016-11-25 15:16:28,058 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1479580915733_0167Job Transitioned from FAIL_ABORT to FAILED
> 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: We are finishing cleanly so this is the last retry
> 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Notify RMCommunicator isAMLastRetry: true
> 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: RMCommunicator notified that shouldUnregistered is: true
> 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Notify JHEH isAMLastRetry: true
> 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: JobHistoryEventHandler notified that forceJobCompletion is true
> 2016-11-25 15:16:28,092 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Calling stop for all the services
> 2016-11-25 15:16:28,093 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopping JobHistoryEventHandler. Size of the outstanding queue size is 2
> 2016-11-25 15:16:28,097 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: In stop, writing event TASK_FAILED
> 2016-11-25 15:16:28,099 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: In stop, writing event JOB_FAILED
> 2016-11-25 15:16:28,177 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_0167/job_1479580915733_0167_1.jhist to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
> 2016-11-25 15:16:28,248 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copied to done location: hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
> 2016-11-25 15:16:28,253 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_0167/job_1479580915733_0167_1_conf.xml to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_tmp
> 2016-11-25 15:16:28,320 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copied to done location: hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_tmp
> 2016-11-25 15:16:28,338 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167.summary_tmp to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167.summary
> 2016-11-25 15:16:28,350 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_tmp to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml
> 2016-11-25 15:16:28,353 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-1480067188027-0-0-FAILED-default-1480067140199.jhist
> 2016-11-25 15:16:28,353 INFO [Thread-74] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopped JobHistoryEventHandler. super.stop()
> 2016-11-25 15:16:28,357 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: Setting job diagnostics to Task failed task_1479580915733_0167_m_000000
> Job failed as tasks failed. failedMaps:1 failedReduces:0
>
> 2016-11-25 15:16:28,357 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: History url is http://hadoopclusterslic73.ad.infosys.com:19888/jobhistory/job/job_1479580915733_0167
> 2016-11-25 15:16:28,373 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: Waiting for application to be successfully unregistered.
> 2016-11-25 15:16:29,375 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Final Stats: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:1 RackLocal:0
> 2016-11-25 15:16:29,377 INFO [Thread-74] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Deleting staging directory hdfs://SLICHDP /user/hdfs/.staging/job_1479580915733_0167
> 2016-11-25 15:16:29,380 INFO [Thread-74] org.apache.hadoop.ipc.Server: Stopping server on 57220
> 2016-11-25 15:16:29,387 INFO [TaskHeartbeatHandler PingChecker] org.apache.hadoop.mapreduce.v2.app.TaskHeartbeatHandler: TaskHeartbeatHandler thread interrupted
> 2016-11-25 15:16:29,387 INFO [IPC Server listener on 57220] org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 57220
>
>
> On Fri, Nov 25, 2016 at 7:36 PM, ShaoFeng Shi <sh...@apache.org>
> wrote:
>
>> I haven't heard of that. A Hive table's file format is transparent to Kylin;
>> even if the table is a view, Kylin can build from it.
>>
>> What is the detailed error you got when using the ORC table? If you can
>> provide the detailed information, that would be better.
>>
>> 2016-11-25 18:22 GMT+08:00 suresh m <su...@gmail.com>:
>>
>> > Hi Facing an issue where i can able to build cube with text format but
>> > unable to building cube with ORC tables.
>> >
>> > Let me know kylin having any issues with ORC format.?
>> >
>> >  Hive having limitation that Text format tables not having possibility
>> to
>> > enabling ACID properties since text format not supporting ACID. But for
>> me
>> > ACID properties is important to handle my data, this i can do with ORC
>> but
>> > kylin throwing errors with ORC format.
>> >
>> >
>> > Regards,
>> > Suresh
>> >
>>
>>
>>
>> --
>> Best regards,
>>
>> Shaofeng Shi 史少锋
>>
>
>
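For readers facing the same question, the two points above -- that Hive ACID
needs ORC, and that Kylin can build from a view -- can be sketched roughly as
follows in the hive CLI. The table, view, and column names are illustrative
placeholders only (not objects from this thread), and the DDL assumes a Hive
1.x/2.x setup where an ACID table must be ORC, bucketed, and marked
transactional:

-- hypothetical transactional source table: in Hive 1.x/2.x, ACID requires
-- ORC storage, bucketing, and transactional=true
CREATE TABLE alert_acid (
  alert_id   BIGINT,
  alert_type STRING,
  alert_dt   STRING
)
CLUSTERED BY (alert_id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');

-- plain view over the ACID table; Kylin can sync the view as a source
-- table and build from it, since the underlying file format stays behind
-- the Hive query layer
CREATE VIEW alert_acid_v AS
SELECT alert_id, alert_type, alert_dt FROM alert_acid;

This only illustrates the point about views; it is not a confirmed fix for
the "Value not exists!" failure shown in the logs in this thread.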

Re: Is kylin not support Hive ORC tables with ACID properties.

Posted by suresh m <su...@gmail.com>.
Please find the details as requested:

Log Type: syslog

Log Upload Time: Fri Nov 25 15:16:35 +0530 2016

Log Length: 107891

2016-11-25 15:15:35,185 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster
for application appattempt_1479580915733_0167_000001
2016-11-25 15:15:35,592 WARN [main]
org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where
applicable
2016-11-25 15:15:35,630 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
2016-11-25 15:15:35,956 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind:
YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id {
id: 167 cluster_timestamp: 1479580915733 } attemptId: 1 } keyId:
2128280969)
2016-11-25 15:15:35,974 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred
newApiCommitter.
2016-11-25 15:15:35,976 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in
config null
2016-11-25 15:15:36,029 INFO [main]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File
Output Committer Algorithm version is 1
2016-11-25 15:15:36,029 INFO [main]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter:
FileOutputCommitter skip cleanup _temporary folders under output
directory:false, ignore cleanup failures: false
2016-11-25 15:15:36,692 WARN [main]
org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory: The
short-circuit local reads feature cannot be used because libhadoop
cannot be loaded.
2016-11-25 15:15:36,702 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter is
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2016-11-25 15:15:36,891 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.jobhistory.EventType for class
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler
2016-11-25 15:15:36,891 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.v2.app.job.event.JobEventType for class
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobEventDispatcher
2016-11-25 15:15:36,892 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.v2.app.job.event.TaskEventType for class
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$TaskEventDispatcher
2016-11-25 15:15:36,893 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.v2.app.job.event.TaskAttemptEventType for
class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$TaskAttemptEventDispatcher
2016-11-25 15:15:36,893 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventType for class
org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler
2016-11-25 15:15:36,894 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.v2.app.speculate.Speculator$EventType for
class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$SpeculatorEventDispatcher
2016-11-25 15:15:36,894 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.v2.app.rm.ContainerAllocator$EventType for
class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerAllocatorRouter
2016-11-25 15:15:36,895 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncher$EventType
for class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerLauncherRouter
2016-11-25 15:15:36,923 INFO [main]
org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default
file system is set solely by core-default.xml therefore -  ignoring
2016-11-25 15:15:36,945 INFO [main]
org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default
file system is set solely by core-default.xml therefore -  ignoring
2016-11-25 15:15:36,967 INFO [main]
org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default
file system is set solely by core-default.xml therefore -  ignoring
2016-11-25 15:15:37,029 INFO [main]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler:
Emitting job history data to the timeline server is not enabled
2016-11-25 15:15:37,064 INFO [main]
org.apache.hadoop.yarn.event.AsyncDispatcher: Registering class
org.apache.hadoop.mapreduce.v2.app.job.event.JobFinishEvent$Type for
class org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobFinishEventHandler
2016-11-25 15:15:37,204 WARN [main]
org.apache.hadoop.metrics2.impl.MetricsConfig: Cannot locate
configuration: tried
hadoop-metrics2-mrappmaster.properties,hadoop-metrics2.properties
2016-11-25 15:15:37,300 INFO [main]
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2016-11-25 15:15:37,300 INFO [main]
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MRAppMaster metrics
system started
2016-11-25 15:15:37,308 INFO [main]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Adding job token
for job_1479580915733_0167 to jobTokenSecretManager
2016-11-25 15:15:37,452 INFO [main]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Not uberizing
job_1479580915733_0167 because: not enabled; too much RAM;
2016-11-25 15:15:37,468 INFO [main]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Input size for
job job_1479580915733_0167 = 223589. Number of splits = 1
2016-11-25 15:15:37,469 INFO [main]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Number of reduces
for job job_1479580915733_0167 = 1
2016-11-25 15:15:37,469 INFO [main]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
job_1479580915733_0167Job Transitioned from NEW to INITED
2016-11-25 15:15:37,470 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: MRAppMaster launching
normal, non-uberized, multi-container job job_1479580915733_0167.
2016-11-25 15:15:37,493 INFO [main]
org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class
java.util.concurrent.LinkedBlockingQueue scheduler: class
org.apache.hadoop.ipc.DefaultRpcScheduler
2016-11-25 15:15:37,506 INFO [Socket Reader #1 for port 60945]
org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 60945
2016-11-25 15:15:37,525 INFO [main]
org.apache.hadoop.yarn.factories.impl.pb.RpcServerFactoryPBImpl:
Adding protocol org.apache.hadoop.mapreduce.v2.api.MRClientProtocolPB
to the server
2016-11-25 15:15:37,527 INFO [IPC Server Responder]
org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2016-11-25 15:15:37,529 INFO [IPC Server listener on 60945]
org.apache.hadoop.ipc.Server: IPC Server listener on 60945: starting
2016-11-25 15:15:37,529 INFO [main]
org.apache.hadoop.mapreduce.v2.app.client.MRClientService:
Instantiated MRClientService at
hadoopclusterslic73.ad.infosys.com/10.122.97.73:60945
2016-11-25 15:15:37,614 INFO [main] org.mortbay.log: Logging to
org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
org.mortbay.log.Slf4jLog
2016-11-25 15:15:37,623 INFO [main]
org.apache.hadoop.security.authentication.server.AuthenticationFilter:
Unable to initialize FileSignerSecretProvider, falling back to use
random secrets.
2016-11-25 15:15:37,628 WARN [main]
org.apache.hadoop.http.HttpRequestLog: Jetty request log can only be
enabled using Log4j
2016-11-25 15:15:37,636 INFO [main]
org.apache.hadoop.http.HttpServer2: Added global filter 'safety'
(class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2016-11-25 15:15:37,688 INFO [main]
org.apache.hadoop.http.HttpServer2: Added filter AM_PROXY_FILTER
(class=org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter) to
context mapreduce
2016-11-25 15:15:37,688 INFO [main]
org.apache.hadoop.http.HttpServer2: Added filter AM_PROXY_FILTER
(class=org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter) to
context static
2016-11-25 15:15:37,691 INFO [main]
org.apache.hadoop.http.HttpServer2: adding path spec: /mapreduce/*
2016-11-25 15:15:37,692 INFO [main]
org.apache.hadoop.http.HttpServer2: adding path spec: /ws/*
2016-11-25 15:15:38,181 INFO [main]
org.apache.hadoop.yarn.webapp.WebApps: Registered webapp guice modules
2016-11-25 15:15:38,183 INFO [main]
org.apache.hadoop.http.HttpServer2: Jetty bound to port 34311
2016-11-25 15:15:38,183 INFO [main] org.mortbay.log: jetty-6.1.26.hwx
2016-11-25 15:15:38,263 INFO [main] org.mortbay.log: Extract
jar:file:/hadoop/yarn/local/filecache/16/mapreduce.tar.gz/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.2.5.0.0-1245.jar!/webapps/mapreduce
to /hadoop/yarn/local/usercache/hdfs/appcache/application_1479580915733_0167/container_e125_1479580915733_0167_01_000001/tmp/Jetty_0_0_0_0_34311_mapreduce____2ncvaf/webapp
2016-11-25 15:15:39,882 INFO [main] org.mortbay.log: Started
HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:34311
2016-11-25 15:15:39,882 INFO [main]
org.apache.hadoop.yarn.webapp.WebApps: Web app mapreduce started at
34311
2016-11-25 15:15:39,933 INFO [main]
org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class
java.util.concurrent.LinkedBlockingQueue scheduler: class
org.apache.hadoop.ipc.DefaultRpcScheduler
2016-11-25 15:15:39,936 INFO [Socket Reader #1 for port 57220]
org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 57220
2016-11-25 15:15:39,943 INFO [IPC Server listener on 57220]
org.apache.hadoop.ipc.Server: IPC Server listener on 57220: starting
2016-11-25 15:15:39,953 INFO [IPC Server Responder]
org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2016-11-25 15:15:39,978 INFO [main]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
nodeBlacklistingEnabled:true
2016-11-25 15:15:39,978 INFO [main]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
maxTaskFailuresPerNode is 3
2016-11-25 15:15:39,978 INFO [main]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
blacklistDisablePercent is 33
2016-11-25 15:15:40,082 WARN [main] org.apache.hadoop.ipc.Client:
Failed to connect to server:
hadoopclusterslic71.ad.infosys.com/10.122.97.71:8030: retries get
failed due to exceeded maximum allowed retries number: 0
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:650)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:745)
	at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
	at org.apache.hadoop.ipc.Client.call(Client.java:1449)
	at org.apache.hadoop.ipc.Client.call(Client.java:1396)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at com.sun.proxy.$Proxy80.registerApplicationMaster(Unknown Source)
	at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationMasterProtocolPBClientImpl.registerApplicationMaster(ApplicationMasterProtocolPBClientImpl.java:106)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
	at com.sun.proxy.$Proxy81.registerApplicationMaster(Unknown Source)
	at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.register(RMCommunicator.java:160)
	at org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator.serviceStart(RMCommunicator.java:121)
	at org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator.serviceStart(RMContainerAllocator.java:250)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$ContainerAllocatorRouter.serviceStart(MRAppMaster.java:881)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceStart(MRAppMaster.java:1151)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$5.run(MRAppMaster.java:1557)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1553)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1486)
2016-11-25 15:15:40,089 INFO [main]
org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider:
Failing over to rm2
2016-11-25 15:15:40,183 INFO [main]
org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator:
maxContainerCapability: <memory:28672, vCores:3>
2016-11-25 15:15:40,183 INFO [main]
org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: queue: default
2016-11-25 15:15:40,186 INFO [main]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Upper limit on the thread pool size is 500
2016-11-25 15:15:40,186 INFO [main]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: The
thread pool initial size is 10
2016-11-25 15:15:40,189 INFO [main]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
yarn.client.max-cached-nodemanagers-proxies : 0
2016-11-25 15:15:40,202 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
job_1479580915733_0167Job Transitioned from INITED to SETUP
2016-11-25 15:15:40,212 INFO [CommitterEvent Processor #0]
org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
Processing the event EventType: JOB_SETUP
2016-11-25 15:15:40,226 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
job_1479580915733_0167Job Transitioned from SETUP to RUNNING
2016-11-25 15:15:40,291 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic72.ad.infosys.com to /default-rack
2016-11-25 15:15:40,328 INFO [eventHandlingThread]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Event
Writer setup for JobId: job_1479580915733_0167, File:
hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_0167/job_1479580915733_0167_1.jhist
2016-11-25 15:15:40,351 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic73.ad.infosys.com to /default-rack
2016-11-25 15:15:40,357 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
task_1479580915733_0167_m_000000 Task Transitioned from NEW to
SCHEDULED
2016-11-25 15:15:40,358 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
task_1479580915733_0167_r_000000 Task Transitioned from NEW to
SCHEDULED
2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
NEW to UNASSIGNED
2016-11-25 15:15:40,359 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from
NEW to UNASSIGNED
2016-11-25 15:15:40,401 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
mapResourceRequest:<memory:3072, vCores:1>
2016-11-25 15:15:40,416 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
reduceResourceRequest:<memory:4096, vCores:1>
2016-11-25 15:15:41,191 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
AssignedMaps:0 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:0 ContRel:0 HostLocal:0 RackLocal:0
2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1479580915733_0167: ask=4 release= 0
newContainers=0 finishedContainers=0 resourcelimit=<memory:38912,
vCores:1> knownNMs=2
2016-11-25 15:15:41,225 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
Recalculating schedule, headroom=<memory:38912, vCores:1>
2016-11-25 15:15:41,226 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
slow start threshold not met. completedMapsForReduceSlowstart 1
2016-11-25 15:15:42,235 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
allocated containers 1
2016-11-25 15:15:42,237 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
container container_e125_1479580915733_0167_01_000002 to
attempt_1479580915733_0167_m_000000_0
2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
Recalculating schedule, headroom=<memory:34816, vCores:0>
2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
slow start threshold not met. completedMapsForReduceSlowstart 1
2016-11-25 15:15:42,238 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0
AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:1 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:15:42,286 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic73.ad.infosys.com to /default-rack
2016-11-25 15:15:42,311 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The
job-jar file on the remote FS is
hdfs://SLICHDP/user/hdfs/.staging/job_1479580915733_0167/job.jar
2016-11-25 15:15:42,315 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: The
job-conf file on the remote FS is
/user/hdfs/.staging/job_1479580915733_0167/job.xml
2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Adding #0
tokens and #1 secret keys for NM use for launching container
2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Size of
containertokens_dob is 1
2016-11-25 15:15:42,336 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Putting
shuffle token in serviceData
2016-11-25 15:15:42,441 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
UNASSIGNED to ASSIGNED
2016-11-25 15:15:42,455 INFO [ContainerLauncher #0]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
container_e125_1479580915733_0167_01_000002 taskAttempt
attempt_1479580915733_0167_m_000000_0
2016-11-25 15:15:42,457 INFO [ContainerLauncher #0]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Launching attempt_1479580915733_0167_m_000000_0
2016-11-25 15:15:42,457 INFO [ContainerLauncher #0]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
2016-11-25 15:15:42,531 INFO [ContainerLauncher #0]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Shuffle port returned by ContainerManager for
attempt_1479580915733_0167_m_000000_0 : 13562
2016-11-25 15:15:42,533 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
TaskAttempt: [attempt_1479580915733_0167_m_000000_0] using
containerId: [container_e125_1479580915733_0167_01_000002 on NM:
[hadoopclusterslic73.ad.infosys.com:45454]
2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
ASSIGNED to RUNNING
2016-11-25 15:15:42,536 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
task_1479580915733_0167_m_000000 Task Transitioned from SCHEDULED to
RUNNING
2016-11-25 15:15:43,241 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1479580915733_0167: ask=4 release= 0
newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
vCores:0> knownNMs=2
2016-11-25 15:15:44,790 INFO [Socket Reader #1 for port 57220]
SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
job_1479580915733_0167 (auth:SIMPLE)
2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
jvm_1479580915733_0167_m_137438953472002 asked for a task
2016-11-25 15:15:44,870 INFO [IPC Server handler 5 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
jvm_1479580915733_0167_m_137438953472002 given task:
attempt_1479580915733_0167_m_000000_0
2016-11-25 15:15:51,923 INFO [IPC Server handler 12 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1479580915733_0167_m_000000_0 is : 0.667
2016-11-25 15:15:52,099 INFO [IPC Server handler 5 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1479580915733_0167_m_000000_0 is : 0.667
2016-11-25 15:15:52,137 ERROR [IPC Server handler 12 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
attempt_1479580915733_0167_m_000000_0 - exited : java.io.IOException:
Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:15:52,138 INFO [IPC Server handler 12 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
from attempt_1479580915733_0167_m_000000_0: Error:
java.io.IOException: Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:15:52,141 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
Diagnostics report from attempt_1479580915733_0167_m_000000_0: Error:
java.io.IOException: Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:15:52,142 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
RUNNING to FAIL_CONTAINER_CLEANUP
2016-11-25 15:15:52,155 INFO [ContainerLauncher #1]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
container_e125_1479580915733_0167_01_000002 taskAttempt
attempt_1479580915733_0167_m_000000_0
2016-11-25 15:15:52,156 INFO [ContainerLauncher #1]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
KILLING attempt_1479580915733_0167_m_000000_0
2016-11-25 15:15:52,156 INFO [ContainerLauncher #1]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
Opening proxy : hadoopclusterslic73.ad.infosys.com:45454
2016-11-25 15:15:52,195 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
2016-11-25 15:15:52,204 INFO [CommitterEvent Processor #1]
org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
Processing the event EventType: TASK_ABORT
2016-11-25 15:15:52,215 WARN [CommitterEvent Processor #1]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_0
2016-11-25 15:15:52,218 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_0 TaskAttempt Transitioned from
FAIL_TASK_CLEANUP to FAILED
2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic72.ad.infosys.com to /default-rack
2016-11-25 15:15:52,224 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic73.ad.infosys.com to /default-rack
2016-11-25 15:15:52,226 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
NEW to UNASSIGNED
2016-11-25 15:15:52,226 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 1 failures
on node hadoopclusterslic73.ad.infosys.com
2016-11-25 15:15:52,230 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added
attempt_1479580915733_0167_m_000000_1 to list of failed maps
2016-11-25 15:15:52,291 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:1 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1479580915733_0167: ask=1 release= 0
newContainers=0 finishedContainers=1 resourcelimit=<memory:38912,
vCores:1> knownNMs=2
2016-11-25 15:15:52,299 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
completed container container_e125_1479580915733_0167_01_000002
2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
Recalculating schedule, headroom=<memory:38912, vCores:1>
2016-11-25 15:15:52,300 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
Diagnostics report from attempt_1479580915733_0167_m_000000_0:
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
slow start threshold not met. completedMapsForReduceSlowstart 1
2016-11-25 15:15:52,300 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
AssignedMaps:0 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:1 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
allocated containers 1
2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
container Container: [ContainerId:
container_e125_1479580915733_0167_01_000003, NodeId:
hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
10.122.97.72:45454 }, ] to fast fail map
2016-11-25 15:15:53,303 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
from earlierFailedMaps
2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
container container_e125_1479580915733_0167_01_000003 to
attempt_1479580915733_0167_m_000000_1
2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
Recalculating schedule, headroom=<memory:34816, vCores:0>
2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
slow start threshold not met. completedMapsForReduceSlowstart 1
2016-11-25 15:15:53,304 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0
AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:2 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic72.ad.infosys.com to /default-rack
2016-11-25 15:15:53,304 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
UNASSIGNED to ASSIGNED
2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
container_e125_1479580915733_0167_01_000003 taskAttempt
attempt_1479580915733_0167_m_000000_1
2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Launching attempt_1479580915733_0167_m_000000_1
2016-11-25 15:15:53,305 INFO [ContainerLauncher #2]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
2016-11-25 15:15:53,318 INFO [ContainerLauncher #2]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Shuffle port returned by ContainerManager for
attempt_1479580915733_0167_m_000000_1 : 13562
2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
TaskAttempt: [attempt_1479580915733_0167_m_000000_1] using
containerId: [container_e125_1479580915733_0167_01_000003 on NM:
[hadoopclusterslic72.ad.infosys.com:45454]
2016-11-25 15:15:53,318 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
ASSIGNED to RUNNING
2016-11-25 15:15:54,309 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1479580915733_0167: ask=1 release= 0
newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
vCores:0> knownNMs=2
2016-11-25 15:15:55,797 INFO [Socket Reader #1 for port 57220]
SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
job_1479580915733_0167 (auth:SIMPLE)
2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
jvm_1479580915733_0167_m_137438953472003 asked for a task
2016-11-25 15:15:55,819 INFO [IPC Server handler 13 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
jvm_1479580915733_0167_m_137438953472003 given task:
attempt_1479580915733_0167_m_000000_1
2016-11-25 15:16:02,857 INFO [IPC Server handler 8 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1479580915733_0167_m_000000_1 is : 0.667
2016-11-25 15:16:03,332 INFO [IPC Server handler 13 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1479580915733_0167_m_000000_1 is : 0.667
2016-11-25 15:16:03,347 ERROR [IPC Server handler 8 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
attempt_1479580915733_0167_m_000000_1 - exited : java.io.IOException:
Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:03,347 INFO [IPC Server handler 8 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
from attempt_1479580915733_0167_m_000000_1: Error:
java.io.IOException: Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:03,349 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
Diagnostics report from attempt_1479580915733_0167_m_000000_1: Error:
java.io.IOException: Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:03,350 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
RUNNING to FAIL_CONTAINER_CLEANUP
2016-11-25 15:16:03,351 INFO [ContainerLauncher #3]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
container_e125_1479580915733_0167_01_000003 taskAttempt
attempt_1479580915733_0167_m_000000_1
2016-11-25 15:16:03,355 INFO [ContainerLauncher #3]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
KILLING attempt_1479580915733_0167_m_000000_1
2016-11-25 15:16:03,355 INFO [ContainerLauncher #3]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
2016-11-25 15:16:03,369 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
2016-11-25 15:16:03,369 INFO [CommitterEvent Processor #2]
org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
Processing the event EventType: TASK_ABORT
2016-11-25 15:16:03,375 WARN [CommitterEvent Processor #2]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_1
2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_1 TaskAttempt Transitioned from
FAIL_TASK_CLEANUP to FAILED
2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic72.ad.infosys.com to /default-rack
2016-11-25 15:16:03,375 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic73.ad.infosys.com to /default-rack
2016-11-25 15:16:03,376 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 1 failures
on node hadoopclusterslic72.ad.infosys.com
2016-11-25 15:16:03,376 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
NEW to UNASSIGNED
2016-11-25 15:16:03,380 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added
attempt_1479580915733_0167_m_000000_2 to list of failed maps
2016-11-25 15:16:04,341 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:2 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1479580915733_0167: ask=1 release= 0
newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
vCores:0> knownNMs=2
2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
Recalculating schedule, headroom=<memory:34816, vCores:0>
2016-11-25 15:16:04,344 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
slow start threshold not met. completedMapsForReduceSlowstart 1
2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
completed container container_e125_1479580915733_0167_01_000003
2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
allocated containers 1
2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
container Container: [ContainerId:
container_e125_1479580915733_0167_01_000004, NodeId:
hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
10.122.97.72:45454 }, ] to fast fail map
2016-11-25 15:16:05,352 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
from earlierFailedMaps
2016-11-25 15:16:05,352 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
Diagnostics report from attempt_1479580915733_0167_m_000000_1:
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
container container_e125_1479580915733_0167_01_000004 to
attempt_1479580915733_0167_m_000000_2
2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
Recalculating schedule, headroom=<memory:34816, vCores:0>
2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
slow start threshold not met. completedMapsForReduceSlowstart 1
2016-11-25 15:16:05,353 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0
AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:3 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:16:05,353 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic72.ad.infosys.com to /default-rack
2016-11-25 15:16:05,354 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
UNASSIGNED to ASSIGNED
2016-11-25 15:16:05,356 INFO [ContainerLauncher #4]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
container_e125_1479580915733_0167_01_000004 taskAttempt
attempt_1479580915733_0167_m_000000_2
2016-11-25 15:16:05,356 INFO [ContainerLauncher #4]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Launching attempt_1479580915733_0167_m_000000_2
2016-11-25 15:16:05,357 INFO [ContainerLauncher #4]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
2016-11-25 15:16:05,371 INFO [ContainerLauncher #4]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Shuffle port returned by ContainerManager for
attempt_1479580915733_0167_m_000000_2 : 13562
2016-11-25 15:16:05,371 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
TaskAttempt: [attempt_1479580915733_0167_m_000000_2] using
containerId: [container_e125_1479580915733_0167_01_000004 on NM:
[hadoopclusterslic72.ad.infosys.com:45454]
2016-11-25 15:16:05,372 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
ASSIGNED to RUNNING
2016-11-25 15:16:06,362 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1479580915733_0167: ask=1 release= 0
newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
vCores:0> knownNMs=2
2016-11-25 15:16:07,537 INFO [Socket Reader #1 for port 57220]
SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
job_1479580915733_0167 (auth:SIMPLE)
2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
jvm_1479580915733_0167_m_137438953472004 asked for a task
2016-11-25 15:16:07,567 INFO [IPC Server handler 24 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
jvm_1479580915733_0167_m_137438953472004 given task:
attempt_1479580915733_0167_m_000000_2
2016-11-25 15:16:14,753 INFO [IPC Server handler 6 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1479580915733_0167_m_000000_2 is : 0.667
2016-11-25 15:16:15,241 INFO [IPC Server handler 13 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1479580915733_0167_m_000000_2 is : 0.667
2016-11-25 15:16:15,258 ERROR [IPC Server handler 8 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
attempt_1479580915733_0167_m_000000_2 - exited : java.io.IOException:
Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:15,258 INFO [IPC Server handler 8 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
from attempt_1479580915733_0167_m_000000_2: Error:
java.io.IOException: Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:15,261 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
Diagnostics report from attempt_1479580915733_0167_m_000000_2: Error:
java.io.IOException: Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:15,273 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
RUNNING to FAIL_CONTAINER_CLEANUP
2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
container_e125_1479580915733_0167_01_000004 taskAttempt
attempt_1479580915733_0167_m_000000_2
2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
KILLING attempt_1479580915733_0167_m_000000_2
2016-11-25 15:16:15,274 INFO [ContainerLauncher #5]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
2016-11-25 15:16:15,289 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
2016-11-25 15:16:15,292 INFO [CommitterEvent Processor #3]
org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
Processing the event EventType: TASK_ABORT
2016-11-25 15:16:15,300 WARN [CommitterEvent Processor #3]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_2
2016-11-25 15:16:15,300 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_2 TaskAttempt Transitioned from
FAIL_TASK_CLEANUP to FAILED
2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic72.ad.infosys.com to /default-rack
2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic73.ad.infosys.com to /default-rack
2016-11-25 15:16:15,301 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
NEW to UNASSIGNED
2016-11-25 15:16:15,301 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 2 failures
on node hadoopclusterslic72.ad.infosys.com
2016-11-25 15:16:15,307 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added
attempt_1479580915733_0167_m_000000_3 to list of failed maps
2016-11-25 15:16:15,412 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:3 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1479580915733_0167: ask=1 release= 0
newContainers=0 finishedContainers=1 resourcelimit=<memory:38912,
vCores:1> knownNMs=2
2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
completed container container_e125_1479580915733_0167_01_000004
2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
Recalculating schedule, headroom=<memory:38912, vCores:1>
2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
slow start threshold not met. completedMapsForReduceSlowstart 1
2016-11-25 15:16:15,420 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
AssignedMaps:0 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:3 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:16:15,421 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
Diagnostics report from attempt_1479580915733_0167_m_000000_2:
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got
allocated containers 1
2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning
container Container: [ContainerId:
container_e125_1479580915733_0167_01_000005, NodeId:
hadoopclusterslic72.ad.infosys.com:45454, NodeHttpAddress:
hadoopclusterslic72.ad.infosys.com:8042, Resource: <memory:4096,
vCores:1>, Priority: 5, Token: Token { kind: ContainerToken, service:
10.122.97.72:45454 }, ] to fast fail map
2016-11-25 15:16:16,432 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
from earlierFailedMaps
2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned
container container_e125_1479580915733_0167_01_000005 to
attempt_1479580915733_0167_m_000000_3
2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
Recalculating schedule, headroom=<memory:34816, vCores:0>
2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
slow start threshold not met. completedMapsForReduceSlowstart 1
2016-11-25 15:16:16,433 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After
Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0
AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0
ContAlloc:4 ContRel:0 HostLocal:1 RackLocal:0
2016-11-25 15:16:16,433 INFO [AsyncDispatcher event handler]
org.apache.hadoop.yarn.util.RackResolver: Resolved
hadoopclusterslic72.ad.infosys.com to /default-rack
2016-11-25 15:16:16,434 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
UNASSIGNED to ASSIGNED
2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container
container_e125_1479580915733_0167_01_000005 taskAttempt
attempt_1479580915733_0167_m_000000_3
2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Launching attempt_1479580915733_0167_m_000000_3
2016-11-25 15:16:16,436 INFO [ContainerLauncher #6]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
2016-11-25 15:16:16,516 INFO [ContainerLauncher #6]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Shuffle port returned by ContainerManager for
attempt_1479580915733_0167_m_000000_3 : 13562
2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
TaskAttempt: [attempt_1479580915733_0167_m_000000_3] using
containerId: [container_e125_1479580915733_0167_01_000005 on NM:
[hadoopclusterslic72.ad.infosys.com:45454]
2016-11-25 15:16:16,517 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
ASSIGNED to RUNNING
2016-11-25 15:16:17,436 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1479580915733_0167: ask=1 release= 0
newContainers=0 finishedContainers=0 resourcelimit=<memory:34816,
vCores:0> knownNMs=2
2016-11-25 15:16:19,664 INFO [Socket Reader #1 for port 57220]
SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
job_1479580915733_0167 (auth:SIMPLE)
2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
jvm_1479580915733_0167_m_137438953472005 asked for a task
2016-11-25 15:16:19,692 INFO [IPC Server handler 6 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
jvm_1479580915733_0167_m_137438953472005 given task:
attempt_1479580915733_0167_m_000000_3
2016-11-25 15:16:27,222 INFO [IPC Server handler 13 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1479580915733_0167_m_000000_3 is : 0.667
2016-11-25 15:16:27,952 INFO [IPC Server handler 7 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1479580915733_0167_m_000000_3 is : 0.667
2016-11-25 15:16:27,971 ERROR [IPC Server handler 11 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
attempt_1479580915733_0167_m_000000_3 - exited : java.io.IOException:
Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:27,971 INFO [IPC Server handler 11 on 57220]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
from attempt_1479580915733_0167_m_000000_3: Error:
java.io.IOException: Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:27,974 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
Diagnostics report from attempt_1479580915733_0167_m_000000_3: Error:
java.io.IOException: Failed to build cube in mapper 0
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:145)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:149)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.util.concurrent.ExecutionException:
java.lang.RuntimeException: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.kylin.engine.mr.steps.InMemCuboidMapper.cleanup(InMemCuboidMapper.java:143)
	... 8 more
Caused by: java.lang.RuntimeException: java.io.IOException:
java.io.IOException: java.lang.IllegalArgumentException: Value not
exists!
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:82)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException:
java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:126)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder.build(DoggedCubeBuilder.java:73)
	at org.apache.kylin.cube.inmemcubing.AbstractInMemCubeBuilder$1.run(AbstractInMemCubeBuilder.java:80)
	... 5 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException:
Value not exists!
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.abort(DoggedCubeBuilder.java:194)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.checkException(DoggedCubeBuilder.java:167)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$BuildOnce.build(DoggedCubeBuilder.java:114)
	... 7 more
Caused by: java.lang.IllegalArgumentException: Value not exists!
	at org.apache.kylin.common.util.Dictionary.getIdFromValueBytes(Dictionary.java:162)
	at org.apache.kylin.dict.TrieDictionary.getIdFromValueImpl(TrieDictionary.java:167)
	at org.apache.kylin.common.util.Dictionary.getIdFromValue(Dictionary.java:98)
	at org.apache.kylin.dimension.DictionaryDimEnc$DictionarySerializer.serialize(DictionaryDimEnc.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:121)
	at org.apache.kylin.cube.gridtable.CubeCodeSystem.encodeColumnValue(CubeCodeSystem.java:110)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:93)
	at org.apache.kylin.gridtable.GTRecord.setValues(GTRecord.java:81)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilderInputConverter.convert(InMemCubeBuilderInputConverter.java:74)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:542)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder$InputConverter$1.next(InMemCubeBuilder.java:523)
	at org.apache.kylin.gridtable.GTAggregateScanner.iterator(GTAggregateScanner.java:139)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.createBaseCuboid(InMemCubeBuilder.java:339)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:166)
	at org.apache.kylin.cube.inmemcubing.InMemCubeBuilder.build(InMemCubeBuilder.java:135)
	at org.apache.kylin.cube.inmemcubing.DoggedCubeBuilder$SplitThread.run(DoggedCubeBuilder.java:282)

2016-11-25 15:16:27,975 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
RUNNING to FAIL_CONTAINER_CLEANUP
2016-11-25 15:16:27,976 INFO [ContainerLauncher #7]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
container_e125_1479580915733_0167_01_000005 taskAttempt
attempt_1479580915733_0167_m_000000_3
2016-11-25 15:16:27,997 INFO [ContainerLauncher #7]
org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
KILLING attempt_1479580915733_0167_m_000000_3
2016-11-25 15:16:27,997 INFO [ContainerLauncher #7]
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
Opening proxy : hadoopclusterslic72.ad.infosys.com:45454
2016-11-25 15:16:28,009 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
2016-11-25 15:16:28,011 INFO [CommitterEvent Processor #4]
org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
Processing the event EventType: TASK_ABORT
2016-11-25 15:16:28,013 WARN [CommitterEvent Processor #4]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
delete hdfs://SLICHDP/kylin/kylin_metadata/kylin-b2f43555-7105-4912-b0bf-c40a0b405a05/ORC_ALERT_C/cuboid/_temporary/1/_temporary/attempt_1479580915733_0167_m_000000_3
2016-11-25 15:16:28,014 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_m_000000_3 TaskAttempt Transitioned from
FAIL_TASK_CLEANUP to FAILED
2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
task_1479580915733_0167_m_000000 Task Transitioned from RUNNING to
FAILED
2016-11-25 15:16:28,026 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Num completed
Tasks: 1
2016-11-25 15:16:28,027 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Job failed as
tasks failed. failedMaps:1 failedReduces:0
2016-11-25 15:16:28,027 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 3 failures
on node hadoopclusterslic72.ad.infosys.com
2016-11-25 15:16:28,027 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
Blacklisted host hadoopclusterslic72.ad.infosys.com
2016-11-25 15:16:28,032 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
job_1479580915733_0167Job Transitioned from RUNNING to FAIL_WAIT
2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
task_1479580915733_0167_r_000000 Task Transitioned from SCHEDULED to
KILL_WAIT
2016-11-25 15:16:28,033 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1479580915733_0167_r_000000_0 TaskAttempt Transitioned from
UNASSIGNED to KILLED
2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
task_1479580915733_0167_r_000000 Task Transitioned from KILL_WAIT to
KILLED
2016-11-25 15:16:28,034 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
job_1479580915733_0167Job Transitioned from FAIL_WAIT to FAIL_ABORT
2016-11-25 15:16:28,037 INFO [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Processing
the event EventType: CONTAINER_DEALLOCATE
2016-11-25 15:16:28,037 ERROR [Thread-53]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Could not
deallocate container for task attemptId
attempt_1479580915733_0167_r_000000_0
2016-11-25 15:16:28,043 INFO [CommitterEvent Processor #0]
org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
Processing the event EventType: JOB_ABORT
2016-11-25 15:16:28,058 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl:
job_1479580915733_0167Job Transitioned from FAIL_ABORT to FAILED
2016-11-25 15:16:28,092 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: We are finishing
cleanly so this is the last retry
2016-11-25 15:16:28,092 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Notify RMCommunicator
isAMLastRetry: true
2016-11-25 15:16:28,092 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: RMCommunicator
notified that shouldUnregistered is: true
2016-11-25 15:16:28,092 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Notify JHEH
isAMLastRetry: true
2016-11-25 15:16:28,092 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler:
JobHistoryEventHandler notified that forceJobCompletion is true
2016-11-25 15:16:28,092 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Calling stop for all
the services
2016-11-25 15:16:28,093 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler:
Stopping JobHistoryEventHandler. Size of the outstanding queue size is
2
2016-11-25 15:16:28,097 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: In
stop, writing event TASK_FAILED
2016-11-25 15:16:28,099 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: In
stop, writing event JOB_FAILED
2016-11-25 15:16:28,177 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying
hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_0167/job_1479580915733_0167_1.jhist
to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
2016-11-25 15:16:28,248 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copied
to done location:
hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
2016-11-25 15:16:28,253 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying
hdfs://SLICHDP:8020/user/hdfs/.staging/job_1479580915733_0167/job_1479580915733_0167_1_conf.xml
to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_tmp
2016-11-25 15:16:28,320 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copied
to done location:
hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_tmp
2016-11-25 15:16:28,338 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved
tmp to done: hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167.summary_tmp
to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167.summary
2016-11-25 15:16:28,350 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved
tmp to done: hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml_tmp
to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167_conf.xml
2016-11-25 15:16:28,353 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved
tmp to done: hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-1480067188027-0-0-FAILED-default-1480067140199.jhist_tmp
to hdfs://SLICHDP:8020/mr-history/tmp/hdfs/job_1479580915733_0167-1480067133013-hdfs-Kylin_Cube_Builder_ORC_ALERT_C-1480067188027-0-0-FAILED-default-1480067140199.jhist
2016-11-25 15:16:28,353 INFO [Thread-74]
org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopped
JobHistoryEventHandler. super.stop()
2016-11-25 15:16:28,357 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: Setting job
diagnostics to Task failed task_1479580915733_0167_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

2016-11-25 15:16:28,357 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: History url is
http://hadoopclusterslic73.ad.infosys.com:19888/jobhistory/job/job_1479580915733_0167
2016-11-25 15:16:28,373 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: Waiting for
application to be successfully unregistered.
2016-11-25 15:16:29,375 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Final
Stats: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1
AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0
HostLocal:1 RackLocal:0
2016-11-25 15:16:29,377 INFO [Thread-74]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Deleting staging
directory hdfs://SLICHDP /user/hdfs/.staging/job_1479580915733_0167
2016-11-25 15:16:29,380 INFO [Thread-74] org.apache.hadoop.ipc.Server:
Stopping server on 57220
2016-11-25 15:16:29,387 INFO [TaskHeartbeatHandler PingChecker]
org.apache.hadoop.mapreduce.v2.app.TaskHeartbeatHandler:
TaskHeartbeatHandler thread interrupted
2016-11-25 15:16:29,387 INFO [IPC Server listener on 57220]
org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 57220


On Fri, Nov 25, 2016 at 7:36 PM, ShaoFeng Shi <sh...@apache.org>
wrote:

> Didn't hear of that. Hive table's file format is transparent for Kylin;
> Even if the table is a view, Kylin can build from it.
>
> What's the detail error you got when using ORC table? If you can provide
> the detail information, that would be better.
>
> 2016-11-25 18:22 GMT+08:00 suresh m <su...@gmail.com>:
>
> > Hi Facing an issue where i can able to build cube with text format but
> > unable to building cube with ORC tables.
> >
> > Let me know kylin having any issues with ORC format.?
> >
> >  Hive having limitation that Text format tables not having possibility to
> > enabling ACID properties since text format not supporting ACID. But for
> me
> > ACID properties is important to handle my data, this i can do with ORC
> but
> > kylin throwing errors with ORC format.
> >
> >
> > Regards,
> > Suresh
> >
>
>
>
> --
> Best regards,
>
> Shaofeng Shi 史少锋
>

Re: Is kylin not support Hive ORC tables with ACID properties.

Posted by Xiaoyu Wang <wa...@apache.org>.
I think the problem may be that the Hive ACID properties are not configured in Kylin's $KYLIN_HOME/conf/kylin_hive_conf.xml
Hive ACID with the ORC file format needs properties such as:

hive.support.concurrency – true
hive.enforce.bucketing – true (Not required as of Hive 2.0)
hive.exec.dynamic.partition.mode – nonstrict
hive.txn.manager – org.apache.hadoop.hive.ql.lockmgr.DbTxnManager

So, try configuring the Hive ACID properties in Kylin's $KYLIN_HOME/conf/kylin_hive_conf.xml

Hive wiki: https://cwiki.apache.org/confluence/display/Hive/Hive+Transactions#HiveTransactions-Configuration
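
For reference, a minimal sketch of what those entries could look like in $KYLIN_HOME/conf/kylin_hive_conf.xml. The property names and values are the ones listed above; that this file takes the usual Hadoop-style XML <property> entries, and that these four are enough for your cluster, are assumptions you should verify against your Hive version:

    <!-- Sketch only: Hive transaction settings assumed to be needed for ACID/ORC source tables. -->
    <property>
        <name>hive.support.concurrency</name>
        <value>true</value>
    </property>
    <property>
        <name>hive.enforce.bucketing</name>
        <!-- not required as of Hive 2.0 -->
        <value>true</value>
    </property>
    <property>
        <name>hive.exec.dynamic.partition.mode</name>
        <value>nonstrict</value>
    </property>
    <property>
        <name>hive.txn.manager</name>
        <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
    </property>

As far as I know, Kylin passes the settings in this file to the Hive commands it issues during a build, so you would need to run a new build for them to take effect.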


> On Nov 25, 2016, at 22:06, ShaoFeng Shi <sh...@apache.org> wrote:
> 
> Didn't hear of that. Hive table's file format is transparent for Kylin;
> Even if the table is a view, Kylin can build from it.
> 
> What's the detail error you got when using ORC table? If you can provide
> the detail information, that would be better.
> 
> 2016-11-25 18:22 GMT+08:00 suresh m <su...@gmail.com>:
> 
>> Hi Facing an issue where i can able to build cube with text format but
>> unable to building cube with ORC tables.
>> 
>> Let me know kylin having any issues with ORC format.?
>> 
>> Hive having limitation that Text format tables not having possibility to
>> enabling ACID properties since text format not supporting ACID. But for me
>> ACID properties is important to handle my data, this i can do with ORC but
>> kylin throwing errors with ORC format.
>> 
>> 
>> Regards,
>> Suresh
>> 
> 
> 
> 
> -- 
> Best regards,
> 
> Shaofeng Shi 史少锋


Re: Is kylin not support Hive ORC tables with ACID properties.

Posted by ShaoFeng Shi <sh...@apache.org>.
Didn't hear of that. A Hive table's file format is transparent to Kylin;
even if the table is a view, Kylin can build from it.

What's the detailed error you got when using the ORC table? If you can
provide the detailed information, that would be better.

2016-11-25 18:22 GMT+08:00 suresh m <su...@gmail.com>:

> Hi Facing an issue where i can able to build cube with text format but
> unable to building cube with ORC tables.
>
> Let me know kylin having any issues with ORC format.?
>
>  Hive having limitation that Text format tables not having possibility to
> enabling ACID properties since text format not supporting ACID. But for me
> ACID properties is important to handle my data, this i can do with ORC but
> kylin throwing errors with ORC format.
>
>
> Regards,
> Suresh
>



-- 
Best regards,

Shaofeng Shi 史少锋