Posted to user@hadoop.apache.org by YouPeng Yang <yy...@gmail.com> on 2013/06/07 17:03:19 UTC

DirectoryScanner's OutOfMemoryError

Hi All

   I have found that the DirectoryScanner fails with "Error compiling
report" caused by java.lang.OutOfMemoryError: Java heap space.
  The log details are given in [1].

  What causes this error, and how can the exception be resolved?


[1]
2013-06-07 22:20:28,199 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Took 2737ms to process 1 commands from NN
2013-06-07 22:20:28,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_1870928037426403148_709040 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_1870928037426403148
2013-06-07 22:20:28,199 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_3693882010743127822_709044 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_3693882010743127822
2013-06-07 22:20:28,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_-5452984265504491579_709036 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_-5452984265504491579
2013-06-07 22:20:28,743 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_-1078215880381545528_709050 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_-1078215880381545528
2013-06-07 22:20:28,744 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_8107220088215975918_709064 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_8107220088215975918
2013-06-07 22:20:29,278 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_-3527717187851336238_709052 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_-3527717187851336238
2013-06-07 22:20:29,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_1766998327682981895_709042 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_1766998327682981895
2013-06-07 22:20:29,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_1650592414141359061_709028 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_1650592414141359061
2013-06-07 22:20:29,812 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted block BP-471453121-172.16.250.16-1369298226760 blk_-6527697040536951940_709038 at file /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_-6527697040536951940
2013-06-07 22:20:43,766 ERROR org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Error compiling report
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.getDiskReport(DirectoryScanner.java:468)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.scan(DirectoryScanner.java:349)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.reconcile(DirectoryScanner.java:330)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.run(DirectoryScanner.java:286)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:572)
at java.lang.StringBuilder.append(StringBuilder.java:203)
at java.io.UnixFileSystem.resolve(UnixFileSystem.java:93)
at java.io.File.<init>(File.java:207)
at java.io.File.listFiles(File.java:1056)
at org.apache.hadoop.fs.FileUtil.listFiles(FileUtil.java:730)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:518)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:533)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:533)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:508)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:493)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
... 3 more
2013-06-07 22:20:43,994 ERROR org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Exception during DirectoryScanner execution - will continue next cycle
java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.getDiskReport(DirectoryScanner.java:472)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.scan(DirectoryScanner.java:349)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.reconcile(DirectoryScanner.java:330)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.run(DirectoryScanner.java:286)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.getDiskReport(DirectoryScanner.java:468)
... 12 more
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:572)
at java.lang.StringBuilder.append(StringBuilder.java:203)
at java.io.UnixFileSystem.resolve(UnixFileSystem.java:93)
at java.io.File.<init>(File.java:207)
at java.io.File.listFiles(File.java:1056)
at org.apache.hadoop.fs.FileUtil.listFiles(FileUtil.java:730)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:518)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:533)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:533)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:508)
at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:493)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
... 3 more


2013-06-07 22:30:05,859 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block BP-471453121-172.16.250.16-1369298226760:blk_9041774691760522723_709080 src: /172.16.250.15:54457 dest: /172.16.250.17:50010
2013-06-07 22:30:05,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /172.16.250.15:54457, dest: /172.16.250.17:50010, bytes: 224277, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1691104063_1, offset: 0, srvID: DS-869018356-172.16.250.17-50010-1369298284382, blockid: BP-471453121-172.16.250.16-1369298226760:blk_9041774691760522723_709080, duration: 24379944
2013-06-07 22:30:05,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-471453121-172.16.250.16-1369298226760:blk_9041774691760522723_709080, type=LAST_IN_PIPELINE, downstreams=0:[] terminating
2013-06-07 22:30:05,983 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block BP-471453121-172.16.250.16-1369298226760:blk_-7095462972895018344_709084 src: /172.16.250.18:49485 dest: /172.16.250.17:50010
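
The stack trace locates the allocation failure inside DirectoryScanner$ReportCompiler.compileReport, which walks each volume with File.listFiles() and retains a File object (with its fully resolved path string, built in UnixFileSystem.resolve) for every block it finds. The sketch below only illustrates that recursion pattern; the class and method bodies are hypothetical, not Hadoop's actual implementation.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class ScanSketch {
    // Recursively walk a directory tree, retaining a File for every regular
    // file found, mirroring the compileReport recursion in the stack trace.
    // Each File holds its full resolved path string, so a volume with
    // millions of block files can exhaust a small heap before the scan ends.
    static void compileReport(File dir, List<File> report) {
        File[] children = dir.listFiles(); // allocates one File per entry
        if (children == null) {
            return; // not a directory, or unreadable
        }
        for (File f : children) {
            if (f.isDirectory()) {
                compileReport(f, report); // e.g. subdir24/subdir25/...
            } else {
                report.add(f); // held until the whole report is built
            }
        }
    }

    public static void main(String[] args) {
        List<File> report = new ArrayList<>();
        compileReport(new File(args.length > 0 ? args[0] : "."), report);
        System.out.println("files retained: " + report.size());
    }
}
```

Because the whole report is held in memory at once, the heap needed scales with the number of block files per DataNode, which is why the usual remedies are fewer blocks per node or a larger heap.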

Re: DirectoryScanner's OutOfMemoryError

Posted by Harsh J <ha...@cloudera.com>.
Please see https://issues.apache.org/jira/browse/HDFS-4461. You may
have to raise your heap for DN if you've accumulated a lot of blocks
per DN.
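
A minimal sketch of raising the DataNode heap via hadoop-env.sh; the 2048 MB figure is an illustrative assumption, not a recommendation — size it to the node's actual block count.

```shell
# Sketch: raise the DataNode JVM heap in $HADOOP_CONF_DIR/hadoop-env.sh.
# -Xmx2048m is an example value; choose one based on blocks per DataNode.
export HADOOP_DATANODE_OPTS="-Xmx2048m $HADOOP_DATANODE_OPTS"
```

Restart the DataNode after editing the file so the new JVM options take effect.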

On Fri, Jun 7, 2013 at 8:33 PM, YouPeng Yang <yy...@gmail.com> wrote:
> Hi All
>
>    I have found that the DirectoryScanner fails with "Error compiling
> report" caused by java.lang.OutOfMemoryError: Java heap space.
>   The log details are given in [1].
>
> [quoted log and stack traces snipped; see the full message above]
>



--
Harsh J


Re: DirectoryScanner's OutOfMemoryError

Posted by Harsh J <ha...@cloudera.com>.
Please see https://issues.apache.org/jira/browse/HDFS-4461. You may
have to raise your heap for DN if you've accumulated a lot of blocks
per DN.
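
For reference, one common way to raise the DataNode heap is via
HADOOP_DATANODE_OPTS in hadoop-env.sh. This is only a sketch; the -Xmx
value below is illustrative, and the right size depends on how many
blocks your DNs hold:

```shell
# In $HADOOP_CONF_DIR/hadoop-env.sh -- prepend a larger max heap for the
# DataNode JVM only (other daemons keep their own settings).
# 4g is an assumed example value, not a recommendation.
export HADOOP_DATANODE_OPTS="-Xmx4g $HADOOP_DATANODE_OPTS"
```

The DataNode must be restarted for the new heap setting to take effect.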

On Fri, Jun 7, 2013 at 8:33 PM, YouPeng Yang <yy...@gmail.com> wrote:
> Hi All
>
>    I have found that the DirectoryScanner gets error:  Error compiling
> report because of java.lang.OutOfMemoryError: Java heap space.
>   The log details are as [1]:
>
>   How does the error come out ,and how to solve this exception?
>
>
> [1]
> 2013-06-07 22:20:28,199 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: Took 2737ms to process 1
> commands from NN
> 2013-06-07 22:20:28,199 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_1870928037426403148_709040 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_1870928037426403148
> 2013-06-07 22:20:28,199 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_3693882010743127822_709044 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_3693882010743127822
> 2013-06-07 22:20:28,743 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_-5452984265504491579_709036 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_-5452984265504491579
> 2013-06-07 22:20:28,743 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_-1078215880381545528_709050 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_-1078215880381545528
> 2013-06-07 22:20:28,744 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_8107220088215975918_709064 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_8107220088215975918
> 2013-06-07 22:20:29,278 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_-3527717187851336238_709052 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_-3527717187851336238
> 2013-06-07 22:20:29,812 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_1766998327682981895_709042 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_1766998327682981895
> 2013-06-07 22:20:29,812 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_1650592414141359061_709028 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_1650592414141359061
> 2013-06-07 22:20:29,812 INFO
> org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService:
> Deleted block BP-471453121-172.16.250.16-1369298226760
> blk_-6527697040536951940_709038 at file
> /home/hadoop/datadir/current/BP-471453121-172.16.250.16-1369298226760/current/finalized/subdir24/subdir25/blk_-6527697040536951940
> 2013-06-07 22:20:43,766 ERROR
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Error compiling
> report
> java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java
> heap space
> at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
> at java.util.concurrent.FutureTask.get(FutureTask.java:83)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.getDiskReport(DirectoryScanner.java:468)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.scan(DirectoryScanner.java:349)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.reconcile(DirectoryScanner.java:330)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.run(DirectoryScanner.java:286)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
> at
> java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
> at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> at java.lang.Thread.run(Thread.java:662)
> Caused by: java.lang.OutOfMemoryError: Java heap space
> at java.util.Arrays.copyOf(Arrays.java:2882)
> at
> java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
> at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:572)
> at java.lang.StringBuilder.append(StringBuilder.java:203)
> at java.io.UnixFileSystem.resolve(UnixFileSystem.java:93)
> at java.io.File.<init>(File.java:207)
> at java.io.File.listFiles(File.java:1056)
> at org.apache.hadoop.fs.FileUtil.listFiles(FileUtil.java:730)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:518)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:533)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:533)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:508)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:493)
> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> ... 3 more
> 2013-06-07 22:20:43,994 ERROR
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Exception during
> DirectoryScanner execution - will continue next cycle
> java.lang.RuntimeException: java.util.concurrent.ExecutionException:
> java.lang.OutOfMemoryError: Java heap space
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.getDiskReport(DirectoryScanner.java:472)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.scan(DirectoryScanner.java:349)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.reconcile(DirectoryScanner.java:330)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.run(DirectoryScanner.java:286)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
> at
> java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
> at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> at java.lang.Thread.run(Thread.java:662)
> Caused by: java.util.concurrent.ExecutionException:
> java.lang.OutOfMemoryError: Java heap space
> at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
> at java.util.concurrent.FutureTask.get(FutureTask.java:83)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.getDiskReport(DirectoryScanner.java:468)
> ... 12 more
> Caused by: java.lang.OutOfMemoryError: Java heap space
> at java.util.Arrays.copyOf(Arrays.java:2882)
> at
> java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
> at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:572)
> at java.lang.StringBuilder.append(StringBuilder.java:203)
> at java.io.UnixFileSystem.resolve(UnixFileSystem.java:93)
> at java.io.File.<init>(File.java:207)
> at java.io.File.listFiles(File.java:1056)
> at org.apache.hadoop.fs.FileUtil.listFiles(FileUtil.java:730)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:518)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:533)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.compileReport(DirectoryScanner.java:533)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:508)
> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:493)
> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> ... 3 more
>
>
>
>
>
> 2013-06-07 22:30:05,859 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block
> BP-471453121-172.16.250.16-1369298226760:blk_9041774691760522723_709080 src:
> /172.16.250.15:54457 dest: /172.16.250.17:50010
> 2013-06-07 22:30:05,885 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src:
> /172.16.250.15:54457, dest: /172.16.250.17:50010, bytes: 224277, op:
> HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1691104063_1, offset: 0, srvID:
> DS-869018356-172.16.250.17-50010-1369298284382, blockid:
> BP-471453121-172.16.250.16-1369298226760:blk_9041774691760522723_709080,
> duration: 24379944
> 2013-06-07 22:30:05,886 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder:
> BP-471453121-172.16.250.16-1369298226760:blk_9041774691760522723_709080,
> type=LAST_IN_PIPELINE, downstreams=0:[] terminating
> 2013-06-07 22:30:05,983 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block
> BP-471453121-172.16.250.16-1369298226760:blk_-7095462972895018344_709084
> src: /172.16.250.18:49485 dest: /172.16.250.17:50010
>
>
>
>



--
Harsh J
