Posted to commits@druid.apache.org by GitBox <gi...@apache.org> on 2020/03/10 17:23:27 UTC

[GitHub] [druid] Observe-secretly opened a new issue #9497: The Druid cluster I built occasionally fails with the error "Rename cannot overwrite non empty destination directory /tmp/hadoop-root/mapred/local/1583859663629". I have not been able to solve this problem; please help

Observe-secretly opened a new issue #9497: The Druid cluster I built occasionally fails with the error "Rename cannot overwrite non empty destination directory /tmp/hadoop-root/mapred/local/1583859663629". I have not been able to solve this problem; please help
URL: https://github.com/apache/druid/issues/9497
 
 
   
   
   ### Affected Version
   
   0.17.0
   
   ### Description
   
    Our Druid cluster occasionally fails with the error "Rename cannot overwrite non empty destination directory /tmp/hadoop-root/mapred/local/1583859663629". It did not have this problem early in its operation, so it has only appeared recently. We did not clean up any files under "/tmp/hadoop-root/mapred", and we have not found any information about what is stored in this directory. The exception stack is below; I hope it helps locate the problem. Thanks.
   
   ### Exception stack
   
   ```
   
   2020-03-10 17:00:53,654 task-runner-0-priority-0 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. Ignoring java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup
   2020-03-10 17:00:53,667 task-runner-0-priority-0 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. Ignoring java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup
   2020-03-10 17:00:53,725 task-runner-0-priority-0 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. Ignoring java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup
   2020-03-10 17:00:53,726 task-runner-0-priority-0 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. Ignoring java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup
   2020-03-10T17:00:54,888 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@68df8c6{/,null,AVAILABLE}
   2020-03-10T17:00:54,965 INFO [main] org.eclipse.jetty.server.AbstractConnector - Started ServerConnector@37f41a81{HTTP/1.1,[http/1.1]}{0.0.0.0:8101}
   2020-03-10T17:00:54,965 INFO [main] org.eclipse.jetty.server.Server - Started @20088ms
   2020-03-10T17:00:54,966 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Starting lifecycle [module] stage [ANNOUNCEMENTS]
   2020-03-10T17:00:55,053 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Successfully started lifecycle [module]
   2020-03-10T17:00:55,166 INFO [task-runner-0-priority-0] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.2.5.Final
   2020-03-10T17:00:56,849 INFO [task-runner-0-priority-0] org.apache.druid.initialization.Initialization - Loading extension [druid-hdfs-storage], jars: hadoop-mapreduce-client-core-2.8.5.jar, hadoop-yarn-api-2.8.5.jar, commons-configuration-1.6.jar, apacheds-i18n-2.0.0-M15.jar, hadoop-common-2.8.5.jar, jetty-sslengine-6.1.26.jar, hadoop-client-2.8.5.jar, curator-framework-4.1.0.jar, htrace-core4-4.0.1-incubating.jar, commons-digester-1.8.jar, jcip-annotations-1.0-1.jar, xmlenc-0.52.jar, hadoop-mapreduce-client-app-2.8.5.jar, json-smart-2.3.jar, hadoop-auth-2.8.5.jar, asm-7.1.jar, jackson-core-asl-1.9.13.jar, jsp-api-2.1.jar, hadoop-yarn-client-2.8.5.jar, api-util-1.0.3.jar, commons-collections-3.2.2.jar, api-asn1-api-1.0.0-M20.jar, apacheds-kerberos-codec-2.0.0-M15.jar, hadoop-yarn-server-common-2.8.5.jar, hadoop-annotations-2.8.5.jar, hadoop-mapreduce-client-jobclient-2.8.5.jar, hadoop-hdfs-client-2.8.5.jar, curator-recipes-4.1.0.jar, accessors-smart-1.2.jar, gson-2.2.4.jar, leveldbjni-all-1.8.jar, commons-net-3.6.jar, jackson-mapper-asl-1.9.13.jar, hadoop-mapreduce-client-common-2.8.5.jar, hadoop-mapreduce-client-shuffle-2.8.5.jar, nimbus-jose-jwt-4.41.1.jar, druid-hdfs-storage-0.17.0.jar
   2020-03-10T17:00:56,864 INFO [task-runner-0-priority-0] org.apache.druid.initialization.Initialization - Loading extension [druid-kafka-indexing-service], jars: druid-kafka-indexing-service-0.17.0.jar, snappy-java-1.1.7.2.jar, zstd-jni-1.3.3-1.jar, lz4-java-1.6.0.jar, kafka-clients-2.2.1.jar
   2020-03-10T17:00:56,907 INFO [task-runner-0-priority-0] org.apache.druid.initialization.Initialization - Loading extension [druid-datasketches], jars: druid-datasketches-0.17.0.jar, commons-math3-3.6.1.jar
   2020-03-10T17:00:56,909 INFO [task-runner-0-priority-0] org.apache.druid.initialization.Initialization - Loading extension [mysql-metadata-storage], jars: mysql-metadata-storage-0.17.0.jar, mysql-connector-java.jar
   2020-03-10T17:00:58,586 WARN [task-runner-0-priority-0] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   2020-03-10T17:01:00,307 WARN [task-runner-0-priority-0] org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
   2020-03-10T17:01:01,097 INFO [task-runner-0-priority-0] org.apache.curator.utils.Compatibility - Running in ZooKeeper 3.4.x compatibility mode
   2020-03-10T17:01:01,266 INFO [task-runner-0-priority-0] org.apache.druid.server.emitter.EmitterModule - Using emitter [NoopEmitter{}] for metrics and alerts, with dimensions [{version=0.17.0}].
   2020-03-10T17:01:01,301 INFO [task-runner-0-priority-0] org.apache.druid.server.metrics.MetricsModule - Loaded 2 monitors: org.apache.druid.java.util.metrics.JvmMonitor, org.apache.druid.server.initialization.jetty.JettyServerModule$JettyMonitor
   2020-03-10T17:01:01,677 INFO [task-runner-0-priority-0] org.apache.druid.indexing.common.task.HadoopIndexTask - Starting a hadoop determine configuration job...
   2020-03-10T17:01:01,712 INFO [task-runner-0-priority-0] org.apache.druid.indexer.path.StaticPathSpec - Adding paths[/user/spark/druid/druid_pushed/sign_alert_projectid_202003110058.json]
   2020-03-10T17:01:02,328 INFO [task-runner-0-priority-0] org.apache.druid.indexer.path.StaticPathSpec - Adding paths[/user/spark/druid/druid_pushed/sign_alert_projectid_202003110058.json]
   2020-03-10T17:01:02,407 INFO [task-runner-0-priority-0] org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
   2020-03-10T17:01:02,409 INFO [task-runner-0-priority-0] org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
   2020-03-10T17:01:02,671 WARN [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobResourceUploader - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
   2020-03-10T17:01:02,683 WARN [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobResourceUploader - No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
   2020-03-10T17:01:02,908 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input files to process : 1
   2020-03-10T17:01:03,032 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
   2020-03-10T17:01:03,248 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_local916051895_0001
   2020-03-10T17:01:06,516 INFO [task-runner-0-priority-0] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: /tmp/hadoop-root/mapred/local/1583859663628/javax.el-3.0.0.jar <- /mnt/apache-druid-0.17.0/javax.el-3.0.0.jar
   2020-03-10T17:01:07,896 WARN [task-runner-0-priority-0] org.apache.hadoop.fs.FileUtil - Command 'ln -s /tmp/hadoop-root/mapred/local/1583859663628/javax.el-3.0.0.jar /mnt/apache-druid-0.17.0/javax.el-3.0.0.jar' failed 1 with: ln: failed to create symbolic link ‘/mnt/apache-druid-0.17.0/javax.el-3.0.0.jar’: File exists
   
   2020-03-10T17:01:07,896 WARN [task-runner-0-priority-0] org.apache.hadoop.mapred.LocalDistributedCacheManager - Failed to create symlink: /tmp/hadoop-root/mapred/local/1583859663628/javax.el-3.0.0.jar <- /mnt/apache-druid-0.17.0/javax.el-3.0.0.jar
   2020-03-10T17:01:07,896 INFO [task-runner-0-priority-0] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized hdfs://hdfscluster/user/root/var/druid/hadoop-tmp/classpath/javax.el-3.0.0.jar as file:/tmp/hadoop-root/mapred/local/1583859663628/javax.el-3.0.0.jar
   2020-03-10T17:01:12,390 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobSubmitter - Cleaning up the staging area file:/tmp/hadoop-root/mapred/staging/root916051895/.staging/job_local916051895_0001
   2020-03-10T17:01:12,392 ERROR [task-runner-0-priority-0] org.apache.druid.indexing.common.task.HadoopIndexTask - Got invocation target exception in run(), cause: 
   java.lang.RuntimeException: java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: Rename cannot overwrite non empty destination directory /tmp/hadoop-root/mapred/local/1583859663629
   	at org.apache.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:224) ~[druid-indexing-hadoop-0.17.0.jar:0.17.0]
   	at org.apache.druid.indexer.JobHelper.runSingleJob(JobHelper.java:397) ~[druid-indexing-hadoop-0.17.0.jar:0.17.0]
   	at org.apache.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:60) ~[druid-indexing-hadoop-0.17.0.jar:0.17.0]
   	at org.apache.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessingRunner.runTask(HadoopIndexTask.java:654) ~[druid-indexing-service-0.17.0.jar:0.17.0]
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_111]
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_111]
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_111]
   	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_111]
   	at org.apache.druid.indexing.common.task.HadoopIndexTask.runInternal(HadoopIndexTask.java:347) ~[druid-indexing-service-0.17.0.jar:0.17.0]
   	at org.apache.druid.indexing.common.task.HadoopIndexTask.runTask(HadoopIndexTask.java:281) [druid-indexing-service-0.17.0.jar:0.17.0]
   	at org.apache.druid.indexing.common.task.AbstractBatchIndexTask.run(AbstractBatchIndexTask.java:138) [druid-indexing-service-0.17.0.jar:0.17.0]
   	at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:419) [druid-indexing-service-0.17.0.jar:0.17.0]
   	at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:391) [druid-indexing-service-0.17.0.jar:0.17.0]
   	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_111]
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_111]
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_111]
   	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_111]
   Caused by: java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: Rename cannot overwrite non empty destination directory /tmp/hadoop-root/mapred/local/1583859663629
   	at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:143) ~[?:?]
   	at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:171) ~[?:?]
   	at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:758) ~[?:?]
   	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:242) ~[?:?]
   	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1341) ~[?:?]
   	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1338) ~[?:?]
   	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_111]
   	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_111]
   	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844) ~[?:?]
   	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1338) ~[?:?]
   	at org.apache.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:124) ~[druid-indexing-hadoop-0.17.0.jar:0.17.0]
   	... 16 more
   Caused by: java.util.concurrent.ExecutionException: java.io.IOException: Rename cannot overwrite non empty destination directory /tmp/hadoop-root/mapred/local/1583859663629
   	at java.util.concurrent.FutureTask.report(FutureTask.java:122) ~[?:1.8.0_111]
   	at java.util.concurrent.FutureTask.get(FutureTask.java:192) ~[?:1.8.0_111]
   	at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:139) ~[?:?]
   	at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:171) ~[?:?]
   	at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:758) ~[?:?]
   	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:242) ~[?:?]
   	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1341) ~[?:?]
   	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1338) ~[?:?]
   	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_111]
   	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_111]
   	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844) ~[?:?]
   	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1338) ~[?:?]
   	at org.apache.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:124) ~[druid-indexing-hadoop-0.17.0.jar:0.17.0]
   	... 16 more
   Caused by: java.io.IOException: Rename cannot overwrite non empty destination directory /tmp/hadoop-root/mapred/local/1583859663629
   	at org.apache.hadoop.fs.AbstractFileSystem.renameInternal(AbstractFileSystem.java:748) ~[?:?]
   	at org.apache.hadoop.fs.FilterFs.renameInternal(FilterFs.java:249) ~[?:?]
   	at org.apache.hadoop.fs.AbstractFileSystem.rename(AbstractFileSystem.java:691) ~[?:?]
   	at org.apache.hadoop.fs.FileContext.rename(FileContext.java:966) ~[?:?]
   	at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:366) ~[?:?]
   	at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) ~[?:?]
   	... 4 more
   2020-03-10T17:01:12,439 INFO [task-runner-0-priority-0] org.apache.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
     "id" : "index_hadoop_sign_alert_projectid_ojbfifhc_2020-03-10T17:00:34.680Z",
     "status" : "FAILED",
     "duration" : 21302,
     "errorMsg" : "java.lang.RuntimeException: java.io.IOException: java.util.concurrent.ExecutionException: java.io.IO...",
     "location" : {
       "host" : null,
       "port" : -1,
       "tlsPort" : -1
     }
   }
   2020-03-10T17:01:12,449 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Stopping lifecycle [module] stage [ANNOUNCEMENTS]
   2020-03-10T17:01:12,452 INFO [main] org.apache.druid.curator.announcement.Announcer - Unannouncing [/druid/listeners/lookups/__default/http:druid03:8101]
   2020-03-10T17:01:12,471 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Stopping lifecycle [module] stage [SERVER]
   2020-03-10T17:01:12,475 INFO [main] org.eclipse.jetty.server.AbstractConnector - Stopped ServerConnector@37f41a81{HTTP/1.1,[http/1.1]}{0.0.0.0:8101}
   2020-03-10T17:01:12,475 INFO [main] org.eclipse.jetty.server.session - node0 Stopped scavenging
   2020-03-10T17:01:12,477 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Stopped o.e.j.s.ServletContextHandler@68df8c6{/,null,UNAVAILABLE}
   2020-03-10T17:01:12,493 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Stopping lifecycle [module] stage [NORMAL]
   2020-03-10T17:01:12,493 INFO [main] org.apache.druid.server.listener.announcer.ListenerResourceAnnouncer - Unannouncing start time on [/druid/listeners/lookups/__default/http:druid03:8101]
   2020-03-10T17:01:12,494 INFO [main] org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner - Starting graceful shutdown of task[index_hadoop_sign_alert_projectid_ojbfifhc_2020-03-10T17:00:34.680Z].
   2020-03-10T17:01:12,500 INFO [LookupExtractorFactoryContainerProvider-MainThread] org.apache.druid.query.lookup.LookupReferencesManager - Lookup Management loop exited. Lookup notices are not handled anymore.
   2020-03-10T17:01:12,526 INFO [Curator-Framework-0] org.apache.curator.framework.imps.CuratorFrameworkImpl - backgroundOperationsLoop exiting
   2020-03-10T17:01:12,531 INFO [main] org.apache.zookeeper.ZooKeeper - Session: 0x26ecf9ca694483b closed
   2020-03-10T17:01:12,532 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Stopping lifecycle [module] stage [INIT]
   2020-03-10T17:01:12,534 INFO [main-EventThread] org.apache.zookeeper.ClientCnxn - EventThread shut down for session: 0x26ecf9ca694483b
   Finished peon task
   
   ```
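    
    ### Possible cause
    
    From the stack trace, the failing rename is issued by Hadoop's FSDownload while the LocalJobRunner localizes distributed-cache files into /tmp/hadoop-root/mapred/local, and FileContext.rename(..., Rename.OVERWRITE) refuses to replace a destination directory that is not empty. Below is a minimal standalone sketch of that behavior (my own, with made-up paths under /tmp/rename-repro, not taken from the task); it only needs hadoop-common on the classpath and appears to reproduce the same exception against the local filesystem:
    
    ```java
    import org.apache.hadoop.fs.FileContext;
    import org.apache.hadoop.fs.Options.Rename;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;
    
    public class RenameRepro
    {
      public static void main(String[] args) throws Exception
      {
        // Local filesystem FileContext, the same kind of context the local
        // distributed-cache localization uses.
        FileContext fc = FileContext.getLocalFSFileContext();
    
        // Stand-ins for the temporary download directory and the final
        // destination directory (paths are made up for illustration).
        Path srcWork = new Path("/tmp/rename-repro/1583859663629_tmp");
        Path dst = new Path("/tmp/rename-repro/1583859663629");
    
        // Create the source directory.
        fc.mkdir(srcWork, FsPermission.getDirDefault(), true);
    
        // Simulate a leftover, non-empty destination: creating this child
        // also creates dst and leaves it non-empty.
        fc.mkdir(new Path(dst, "leftover"), FsPermission.getDirDefault(), true);
    
        // Expected to throw java.io.IOException:
        //   Rename cannot overwrite non empty destination directory /tmp/rename-repro/1583859663629
        fc.rename(srcWork, dst, Rename.OVERWRITE);
      }
    }
    ```
    
    If that is what is happening, my guess is that a previously localized directory with the same name was left behind under /tmp/hadoop-root/mapred/local and the new localization attempt collides with it, but I have not confirmed this.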
   
