Posted to common-dev@hadoop.apache.org by "Arun C Murthy (JIRA)" <ji...@apache.org> on 2006/09/05 19:09:24 UTC
[jira] Created: (HADOOP-507) Runtime exception in org.apache.hadoop.io.WritableFactories.newInstance when trying to startup namenode/datanode
Runtime exception in org.apache.hadoop.io.WritableFactories.newInstance when trying to startup namenode/datanode
----------------------------------------------------------------------------------------------------------------
Key: HADOOP-507
URL: http://issues.apache.org/jira/browse/HADOOP-507
Project: Hadoop
Issue Type: Bug
Components: util
Affects Versions: 0.5.0
Reporter: Arun C Murthy
Assigned To: Owen O'Malley
Here are the logs:
arun@neo ~/dev/java/latest-hadoop/trunk $ cat /home/arun/dev/java/hadoop-0.4.0/build/libhdfs/tests/logs/hadoop-arun-namenode-neo.out
2006-09-05 22:18:39,756 INFO conf.Configuration (Configuration.java:loadResource(496)) - parsing file:/home/arun/dev/java/hadoop-0.4.0/src/c++/libhdfs/tests/conf/hadoop-default.xml
2006-09-05 22:18:39,804 INFO conf.Configuration (Configuration.java:loadResource(496)) - parsing file:/home/arun/dev/java/hadoop-0.4.0/src/c++/libhdfs/tests/conf/hadoop-site.xml
2006-09-05 22:18:39,918 INFO util.Credential (FileResource.java:<clinit>(60)) - Checking Resource aliases
2006-09-05 22:18:39,935 INFO http.HttpServer (HttpServer.java:doStart(729)) - Version Jetty/5.1.4
2006-09-05 22:18:40,366 INFO util.Container (Container.java:start(74)) - Started org.mortbay.jetty.servlet.WebApplicationHandler@1171b26
2006-09-05 22:18:40,478 INFO util.Container (Container.java:start(74)) - Started WebApplicationContext[/,/]
2006-09-05 22:18:40,478 INFO util.Container (Container.java:start(74)) - Started HttpContext[/logs,/logs]
2006-09-05 22:18:40,479 INFO util.Container (Container.java:start(74)) - Started HttpContext[/static,/static]
2006-09-05 22:18:40,485 INFO http.SocketListener (SocketListener.java:start(204)) - Started SocketListener on 0.0.0.0:50070
2006-09-05 22:18:40,487 INFO util.Container (Container.java:start(74)) - Started org.mortbay.jetty.Server@1b09468
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalAccessException: Class org.apache.hadoop.io.WritableFactories can not access a member of class org.apache.hadoop.dfs.Block with modifiers "public"
at org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:49)
at org.apache.hadoop.io.ArrayWritable.readFields(ArrayWritable.java:81)
at org.apache.hadoop.dfs.FSEditLog.loadFSEdits(FSEditLog.java:134)
at org.apache.hadoop.dfs.FSImage.loadFSImage(FSImage.java:157)
at org.apache.hadoop.dfs.FSDirectory.loadFSImage(FSDirectory.java:317)
at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:199)
at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:132)
at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:123)
at org.apache.hadoop.dfs.NameNode.main(NameNode.java:543)
Caused by: java.lang.IllegalAccessException: Class org.apache.hadoop.io.WritableFactories can not access a member of class org.apache.hadoop.dfs.Block with modifiers "public"
at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:65)
at java.lang.Class.newInstance0(Class.java:344)
at java.lang.Class.newInstance(Class.java:303)
at org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:45)
... 8 more
Steps to reproduce:
1. Start namenode/datanode
2. Run hdfs_test program (part of libhdfs)
3. Stop namenode/datanode
4. Repeat from step 1
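For context on the failure above: Class.newInstance() enforces Java-language accessibility of both the target class and its no-arg constructor against the calling class, which is what WritableFactories tripped over. The standalone sketch below reproduces the same IllegalAccessException using sun.misc.Unsafe (whose no-arg constructor is private) as a stand-in; it is not Hadoop code, and the real case was the reverse shape (a public constructor on a non-public org.apache.hadoop.dfs.Block), but the same access check fires.

```java
// AccessCheckDemo.java -- standalone sketch, not Hadoop code.
public class AccessCheckDemo {

    // Class.newInstance() checks Java-language accessibility of the class
    // and its no-arg constructor against the caller. Returns true when that
    // check rejects the call with IllegalAccessException -- the same failure
    // WritableFactories.newInstance hit on org.apache.hadoop.dfs.Block.
    static boolean failsAccessCheck(String className) {
        try {
            Class.forName(className).newInstance(); // deprecated API, mirrors the 2006-era call
            return false;
        } catch (IllegalAccessException e) {
            return true;
        } catch (ReflectiveOperationException e) {
            return false; // class missing, or not instantiable for another reason
        }
    }

    public static void main(String[] args) {
        // sun.misc.Unsafe has a private no-arg constructor, so the check fires.
        System.out.println(failsAccessCheck("sun.misc.Unsafe")); // prints "true"
    }
}
```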
--
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira
[jira] Updated: (HADOOP-507) Runtime exception in org.apache.hadoop.io.WritableFactories.newInstance when trying to startup namenode/datanode
Posted by "Owen O'Malley (JIRA)" <ji...@apache.org>.
[ http://issues.apache.org/jira/browse/HADOOP-507?page=all ]
Owen O'Malley updated HADOOP-507:
---------------------------------
Status: Patch Available (was: Open)
Fix Version/s: 0.6.0
[jira] Commented: (HADOOP-507) Runtime exception in org.apache.hadoop.io.WritableFactories.newInstance when trying to startup namenode/datanode
Posted by "Arun C Murthy (JIRA)" <ji...@apache.org>.
[ http://issues.apache.org/jira/browse/HADOOP-507?page=comments#action_12432635 ]
Arun C Murthy commented on HADOOP-507:
--------------------------------------
Works great for me Owen, thanks!
[jira] Updated: (HADOOP-507) Runtime exception in org.apache.hadoop.io.WritableFactories.newInstance when trying to startup namenode/datanode
Posted by "Owen O'Malley (JIRA)" <ji...@apache.org>.
[ http://issues.apache.org/jira/browse/HADOOP-507?page=all ]
Owen O'Malley updated HADOOP-507:
---------------------------------
Attachment: write-factory.patch
This is yet another instance of the class permission problems. This patch changes the non-factory case to call ReflectionUtils.newInstance, which has the class initialization code.
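The idea behind switching to ReflectionUtils.newInstance can be illustrated with a minimal, hypothetical helper (a sketch only, not the actual Hadoop implementation; the real helper also wires up Configuration on Configurable instances): instead of relying on Class.newInstance()'s caller-based access check, look up the declared no-arg constructor and force it accessible before invoking it.

```java
import java.lang.reflect.Constructor;

// FactoryFixDemo.java -- standalone sketch of the fix's idea, not Hadoop code.
public class FactoryFixDemo {

    // Fetch the declared no-arg constructor and make it accessible before
    // invoking, sidestepping the language-level access check that made
    // Class.newInstance() throw IllegalAccessException.
    static <T> T newInstance(Class<T> clazz) {
        try {
            Constructor<T> ctor = clazz.getDeclaredConstructor();
            ctor.setAccessible(true); // bypass the Java-language access check
            return ctor.newInstance();
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) throws Exception {
        // sun.misc.Unsafe's private constructor defeats Class.newInstance(),
        // but the helper above instantiates it without complaint.
        Object o = newInstance(Class.forName("sun.misc.Unsafe"));
        System.out.println(o.getClass().getName()); // prints "sun.misc.Unsafe"
    }
}
```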
[jira] Updated: (HADOOP-507) Runtime exception in org.apache.hadoop.io.WritableFactories.newInstance when trying to startup namenode/datanode
Posted by "Doug Cutting (JIRA)" <ji...@apache.org>.
[ http://issues.apache.org/jira/browse/HADOOP-507?page=all ]
Doug Cutting updated HADOOP-507:
--------------------------------
Status: Resolved (was: Patch Available)
Resolution: Fixed
I just committed this. Thanks, Owen!