Posted to user@giraph.apache.org by John Yost <so...@gmail.com> on 2014/07/01 04:23:44 UTC

Re: Couldn't instantiate

Hi Carmen,

Question--did you define only a constructor that takes arguments?  If so, I think you
are getting this because you did not also define a no-arguments constructor with
public visibility.  If that is not the case, I recommend posting your
source code and I will be happy to help.
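To illustrate: Giraph creates the input format reflectively (see the Class.newInstance frame in the stack trace quoted below), and Class.newInstance requires a public no-argument constructor. A minimal, standalone plain-Java sketch with a hypothetical stand-in class that reproduces the same InstantiationException:

```java
public class InstantiationDemo {
    // Hypothetical stand-in for a VertexInputFormat that defines only
    // a constructor taking arguments, so no nullary constructor exists.
    static class OnlyArgsCtor {
        OnlyArgsCtor(int x) { }
    }

    public static void main(String[] args) {
        try {
            // Essentially what org.apache.giraph.utils.ReflectionUtils.newInstance does.
            OnlyArgsCtor.class.newInstance();
            System.out.println("instantiated");
        } catch (InstantiationException e) {
            // Thrown because the class has no nullary constructor.
            System.out.println("InstantiationException");
        } catch (IllegalAccessException e) {
            System.out.println("IllegalAccessException");
        }
    }
}
```

Declaring a public no-argument constructor in the stand-in class (alongside, or instead of, the args-only one) makes the call succeed.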

--John


On Mon, Jun 30, 2014 at 9:38 AM, Carmen Manzulli <ca...@gmail.com>
wrote:

> Hi,
>
> I'm trying to run a selection computation with my own VertexInputFormat code, but the Giraph job starts and then fails with:
>
>
>
>
> java.lang.IllegalStateException: run: Caught an unrecoverable exception newInstance: Couldn't instantiate sisinflab.SimpleRDFVertexInputFormat
> 	at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:101)
> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
> 	at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.IllegalStateException: newInstance: Couldn't instantiate sisinflab.SimpleRDFVertexInputFormat
> 	at org.apache.giraph.utils.ReflectionUtils.newInstance(ReflectionUtils.java:105)
> 	at org.apache.giraph.conf.ImmutableClassesGiraphConfiguration.createVertexInputFormat(ImmutableClassesGiraphConfiguration.java:235)
> 	at org.apache.giraph.conf.ImmutableClassesGiraphConfiguration.createWrappedVertexInputFormat(ImmutableClassesGiraphConfiguration.java:246)
> 	at org.apache.giraph.graph.GraphTaskManager.checkInput(GraphTaskManager.java:171)
> 	at org.apache.giraph.graph.GraphTaskManager.setup(GraphTaskManager.java:207)
> 	at org.apache.giraph.graph.GraphMapper.setup(GraphMapper.java:59)
> 	at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:89)
> 	... 7 more
> Caused by: java.lang.InstantiationException
> 	at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> 	at java.lang.Class.newInstance(Class.java:374)
> 	at org.apache.giraph.utils.ReflectionUtils.newInstance(ReflectionUtils.java:103)
> 	... 13 more
>
>
> What does this mean? Where is the problem?
>
> Can anyone help me?
>
> Carmen
>
>

Re: Couldn't instantiate

Posted by Carmen Manzulli <ca...@gmail.com>.
Ok John, the problem with the ShortestPathsComputation example was the
mapred.map.max.attempts property... now it runs well.
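For reference, a hedged sketch of the mapred-site.xml change this likely refers to. The property name comes from Carmen's message, but the value shown is an assumption (limiting attempts so a failed long-running Giraph mapper is not retried), not something confirmed in this thread:

```xml
<!-- mapred-site.xml (value is an assumption, not confirmed in this thread) -->
<property>
  <name>mapred.map.max.attempts</name>
  <value>1</value>
</property>
```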



2014-07-08 11:02 GMT+02:00 Carmen Manzulli <ca...@gmail.com>:

> Hi John,
> yes, I've run the examples, but the job failed... this was my command:
>
> bin/hadoop jar
> /usr/local/giraph/giraph-core/target/giraph-1.1.0-SNAPSHOT-for-hadoop-1.2.1-jar-with-dependencies.jar
> org.apache.giraph.GiraphRunner
> org.apache.giraph.benchmark.ShortestPathsComputation -vif
> org.apache.giraph.io.formats.JsonLongDoubleFloatDoubleVertexInputFormat
> -vip /user/hduser/Documento -vof
> org.apache.giraph.io.formats.IdWithValueTextOutputFormat -op
> /user/hduser/outShortest -w 1
>
> and this was the result
>
> 14/07/08 10:46:37 INFO utils.ConfigurationUtils: No edge input format
> specified. Ensure your InputFormat does not require one.
> 14/07/08 10:46:37 INFO utils.ConfigurationUtils: No edge output format
> specified. Ensure your OutputFormat does not require one.
> Exception in thread "main" java.lang.IllegalArgumentException:
> checkClassTypes: edge value types not assignable, computation - class
> org.apache.hadoop.io.DoubleWritable, VertexInputFormat - class
> org.apache.hadoop.io.DoubleWritable
>     at
> org.apache.giraph.job.GiraphConfigurationValidator.checkAssignable(GiraphConfigurationValidator.java:381)
>     at
> org.apache.giraph.job.GiraphConfigurationValidator.verifyVertexInputFormatGenericTypes(GiraphConfigurationValidator.java:230)
>     at
> org.apache.giraph.job.GiraphConfigurationValidator.validateConfiguration(GiraphConfigurationValidator.java:141)
>     at
> org.apache.giraph.utils.ConfigurationUtils.parseArgs(ConfigurationUtils.java:214)
>     at org.apache.giraph.GiraphRunner.run(GiraphRunner.java:74)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>     at org.apache.giraph.GiraphRunner.main(GiraphRunner.java:124)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
>
> maybe I have some configuration problem, or there is something I have
> not understood yet (a lot, XD!). Could you give me any suggestions?
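The checkClassTypes error above means the edge value type declared by the Computation's generic parameters is not assignable to the one declared by the VertexInputFormat. (If memory serves, the benchmark-package ShortestPathsComputation declares DoubleWritable edges, while JsonLongDoubleFloatDoubleVertexInputFormat supplies FloatWritable ones; the examples-package SimpleShortestPathsComputation is the computation that matches that input format -- treat this pairing as a recollection, not a guarantee.) A simplified, hypothetical plain-Java sketch of the kind of generic-parameter check the validator performs:

```java
import java.lang.reflect.ParameterizedType;

public class TypeCheckDemo {
    // Hypothetical stand-ins: Base<I, E> plays the role of a generic
    // superclass declaring vertex id (I) and edge value (E) types.
    static class Base<I, E> { }
    static class Computation extends Base<Long, Double> { }
    static class InputFormat extends Base<Long, Float> { }

    // Read the edge value type argument (index 1) off the generic superclass,
    // as GiraphConfigurationValidator does via reflection.
    static Class<?> edgeTypeOf(Class<?> c) {
        ParameterizedType p = (ParameterizedType) c.getGenericSuperclass();
        return (Class<?>) p.getActualTypeArguments()[1];
    }

    public static void main(String[] args) {
        Class<?> comp = edgeTypeOf(Computation.class); // Double
        Class<?> fmt = edgeTypeOf(InputFormat.class);  // Float
        if (!comp.isAssignableFrom(fmt)) {
            System.out.println("edge value types not assignable: "
                + comp.getSimpleName() + " vs " + fmt.getSimpleName());
        }
    }
}
```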
>
>
> 2014-07-05 12:55 GMT+02:00 John Yost <so...@gmail.com>:
>
> Hi Carmen,
>>
>> I think the ChildError is being thrown by Hadoop, but the root cause is
>> this NPE within the BspServiceMaster.  Unfortunately, I have not seen this
>> exception before and am not sure why you are getting the NPE.  It may be
>> a data issue.  Quick question--have you gotten examples like
>> SimpleShortestPath to run?  Please confirm, thanks.
>>
>> --John
>>
>>
>> On Wed, Jul 2, 2014 at 9:53 AM, Carmen Manzulli <carmenmanzulli@gmail.com
>> > wrote:
>>
>>> I've read on the web that "Child Error" could mean this:
>>> possible reason: the memory allocated for the task trackers (sum of
>>> mapred.*.child.java.opts in mapred-site.xml) is more than the node's actual
>>> memory.
>>>
>>>
>>> 2014-07-02 15:52 GMT+02:00 Carmen Manzulli <ca...@gmail.com>:
>>>
>>> Ok, of course :)!
>>>>
>>>> java.lang.Throwable: Child Error
>>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
>>>> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>>>>
>>>> and from the command line:
>>>>
>>>>
>>>> /../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-3.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/
local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
>>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.library.path=/usr/local/hadoop/libexec/../lib/native/Linux-amd64-64:/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work
>>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work/tmp
>>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.name=Linux
>>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.arch=amd64
>>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.version=3.11.0-24-generic
>>>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.name=hduser
>>>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.home=/home/hduser
>>>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.dir=/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work
>>>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Initiating client connection, connectString=carmen-HP-Pavilion-Sleekbook-15:22181 sessionTimeout=60000 watcher=org.apache.giraph.master.BspServiceMaster@465962c4
>>>> 2014-07-02 15:49:17,509 INFO org.apache.zookeeper.ClientCnxn: Opening socket connection to server carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181. Will not attempt to authenticate using SASL (unknown error)
>>>> 2014-07-02 15:49:17,509 INFO org.apache.zookeeper.ClientCnxn: Socket connection established to carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181, initiating session
>>>> 2014-07-02 15:49:17,515 INFO org.apache.zookeeper.ClientCnxn: Session establishment complete on server carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181, sessionid = 0x146f756106b0001, negotiated timeout = 600000
>>>> 2014-07-02 15:49:17,516 INFO org.apache.giraph.bsp.BspService: process: Asynchronous connection complete.
>>>> 2014-07-02 15:49:17,530 INFO org.apache.giraph.graph.GraphTaskManager: map: No need to do anything when not a worker
>>>> 2014-07-02 15:49:17,530 INFO org.apache.giraph.graph.GraphTaskManager: cleanup: Starting for MASTER_ZOOKEEPER_ONLY
>>>> 2014-07-02 15:49:17,561 INFO org.apache.giraph.bsp.BspService: getJobState: Job state already exists (/_hadoopBsp/job_201407021315_0003/_masterJobState)
>>>> 2014-07-02 15:49:17,568 INFO org.apache.giraph.master.BspServiceMaster: becomeMaster: First child is '/_hadoopBsp/job_201407021315_0003/_masterElectionDir/carmen-HP-Pavilion-Sleekbook-15_00000000000' and my bid is '/_hadoopBsp/job_201407021315_0003/_masterElectionDir/carmen-HP-Pavilion-Sleekbook-15_00000000000'
>>>> 2014-07-02 15:49:17,570 INFO org.apache.giraph.bsp.BspService: getApplicationAttempt: Node /_hadoopBsp/job_201407021315_0003/_applicationAttemptsDir already exists!
>>>> 2014-07-02 15:49:17,625 INFO org.apache.giraph.comm.netty.NettyServer: NettyServer: Using execution group with 8 threads for requestFrameDecoder.
>>>> 2014-07-02 15:49:17,674 INFO org.apache.giraph.comm.netty.NettyServer: start: Started server communication server: carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:30000 with up to 16 threads on bind attempt 0 with sendBufferSize = 32768 receiveBufferSize = 524288
>>>> 2014-07-02 15:49:17,679 INFO org.apache.giraph.comm.netty.NettyClient: NettyClient: Using execution handler with 8 threads after request-encoder.
>>>> 2014-07-02 15:49:17,682 INFO org.apache.giraph.master.BspServiceMaster: becomeMaster: I am now the master!
>>>> 2014-07-02 15:49:17,684 INFO org.apache.giraph.bsp.BspService: getApplicationAttempt: Node /_hadoopBsp/job_201407021315_0003/_applicationAttemptsDir already exists!
>>>> 2014-07-02 15:49:17,717 ERROR org.apache.giraph.master.MasterThread: masterThread: Master algorithm failed with NullPointerException
>>>> java.lang.NullPointerException
>>>> 	at org.apache.giraph.master.BspServiceMaster.generateInputSplits(BspServiceMaster.java:330)
>>>> 	at org.apache.giraph.master.BspServiceMaster.createInputSplits(BspServiceMaster.java:619)
>>>> 	at org.apache.giraph.master.BspServiceMaster.createVertexInputSplits(BspServiceMaster.java:686)
>>>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:108)
>>>> 2014-07-02 15:49:17,718 FATAL org.apache.giraph.graph.GraphMapper: uncaughtException: OverrideExceptionHandler on thread org.apache.giraph.master.MasterThread, msg = java.lang.NullPointerException, exiting...
>>>> java.lang.IllegalStateException: java.lang.NullPointerException
>>>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:193)
>>>> Caused by: java.lang.NullPointerException
>>>> 	at org.apache.giraph.master.BspServiceMaster.generateInputSplits(BspServiceMaster.java:330)
>>>> 	at org.apache.giraph.master.BspServiceMaster.createInputSplits(BspServiceMaster.java:619)
>>>> 	at org.apache.giraph.master.BspServiceMaster.createVertexInputSplits(BspServiceMaster.java:686)
>>>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:108)
>>>> 2014-07-02 15:49:17,722 INFO org.apache.giraph.zk.ZooKeeperManager: run: Shutdown hook started.
>>>> 2014-07-02 15:49:17,727 WARN org.apache.giraph.zk.ZooKeeperManager: onlineZooKeeperServers: Forced a shutdown hook kill of the ZooKeeper process.
>>>> 2014-07-02 15:49:18,049 INFO org.apache.zookeeper.ClientCnxn: Unable to read additional data from server sessionid 0x146f756106b0001, likely server has closed socket, closing socket connection and attempting reconnect
>>>> 2014-07-02 15:49:18,050 INFO org.apache.giraph.zk.ZooKeeperManager: onlineZooKeeperServers: ZooKeeper process exited with 143 (note that 143 typically means killed).
>>>>
>>>>
>>>>
>>>>
>>>> 2014-07-02 13:52 GMT+02:00 John Yost <so...@gmail.com>:
>>>>
>>>> Hi Carmen,
>>>>>
>>>>> Please post more of the exception stack trace, not enough here for me
>>>>> to figure anything out. :)
>>>>>
>>>>> Thanks
>>>>>
>>>>> --John
>>>>>
>>>>>
>>>>> On Wed, Jul 2, 2014 at 7:33 AM, <so...@gmail.com> wrote:
>>>>>
>>>>>> Hi Carmen,
>>>>>>
>>>>>> Glad that one problem is fixed, and I can take a look at this one as
>>>>>> well.
>>>>>>
>>>>>> --John
>>>>>>
>>>>>> Sent from my iPhone
>>>>>>
>>>>>> On Jul 2, 2014, at 6:50 AM, Carmen Manzulli <ca...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>
>>>>>> Ok, I've done what you told me... but now I've got this problem:
>>>>>>
>>>>>> java.lang.Throwable: Child Error
>>>>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
>>>>>> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>>>>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>>>>>>
>>>>>> this is my Computation code:
>>>>>> import org.apache.giraph.GiraphRunner;
>>>>>> import org.apache.giraph.graph.BasicComputation;
>>>>>> import org.apache.giraph.graph.Vertex;
>>>>>> import org.apache.giraph.edge.Edge;
>>>>>> import org.apache.hadoop.io.Text;
>>>>>> import org.apache.hadoop.io.NullWritable;
>>>>>> import org.apache.hadoop.util.ToolRunner;
>>>>>>
>>>>>> public class SimpleSelectionComputation extends BasicComputation<Text, NullWritable, Text, NullWritable> {
>>>>>>
>>>>>>   @Override
>>>>>>   public void compute(Vertex<Text, NullWritable, Text> vertex, Iterable<NullWritable> messages) {
>>>>>>     Text source = new Text("http://dbpedia.org/resource/1040s");
>>>>>>
>>>>>>     if (getSuperstep() == 0) {
>>>>>>       // Compare Writable ids by content with equals(); == only tests object identity.
>>>>>>       if (vertex.getId().equals(source)) {
>>>>>>         System.out.println("subject " + vertex.getId() + " has the following predicates and objects:");
>>>>>>         for (Edge<Text, Text> e : vertex.getEdges()) {
>>>>>>           System.out.println(e.getValue() + "\t" + e.getTargetVertexId());
>>>>>>         }
>>>>>>       }
>>>>>>       vertex.voteToHalt();
>>>>>>     }
>>>>>>   }
>>>>>>
>>>>>>   public static void main(String[] args) throws Exception {
>>>>>>     System.exit(ToolRunner.run(new GiraphRunner(), args));
>>>>>>   }
>>>>>> }
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
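A pitfall worth flagging in the compute() method quoted above: comparing Writable ids with == tests object identity rather than content, so a check like that never fires for two distinct Text instances holding the same string; equals() is what such a comparison needs. A plain-Java sketch of the same pitfall (String used as a stand-in for Text, which is not on the classpath here):

```java
public class EqualsDemo {
    public static void main(String[] args) {
        // Two distinct objects holding the same vertex id string.
        String a = new String("http://dbpedia.org/resource/1040s");
        String b = new String("http://dbpedia.org/resource/1040s");
        System.out.println(a == b);      // false: reference comparison
        System.out.println(a.equals(b)); // true: content comparison
    }
}
```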

Re: Couldn't instantiate

Posted by Carmen Manzulli <ca...@gmail.com>.
Hi John,
yes i've gotten the examples but the job failed.... this was my command:

bin/hadoop jar
/usr/local/giraph/giraph-core/target/giraph-1.1.0-SNAPSHOT-for-hadoop-1.2.1-jar-with-dependencies.jar
org.apache.giraph.GiraphRunner
org.apache.giraph.benchmark.ShortestPathsComputation -vif
org.apache.giraph.io.formats.JsonLongDoubleFloatDoubleVertexInputFormat
-vip /user/hduser/Documento -vof
org.apache.giraph.io.formats.IdWithValueTextOutputFormat -op
/user/hduser/outShortest -w 1

and this was the result

14/07/08 10:46:37 INFO utils.ConfigurationUtils: No edge input format
specified. Ensure your InputFormat does not require one.
14/07/08 10:46:37 INFO utils.ConfigurationUtils: No edge output format
specified. Ensure your OutputFormat does not require one.
Exception in thread "main" java.lang.IllegalArgumentException:
checkClassTypes: edge value types not assignable, computation - class
org.apache.hadoop.io.DoubleWritable, VertexInputFormat - class
org.apache.hadoop.io.DoubleWritable
    at
org.apache.giraph.job.GiraphConfigurationValidator.checkAssignable(GiraphConfigurationValidator.java:381)
    at
org.apache.giraph.job.GiraphConfigurationValidator.verifyVertexInputFormatGenericTypes(GiraphConfigurationValidator.java:230)
    at
org.apache.giraph.job.GiraphConfigurationValidator.validateConfiguration(GiraphConfigurationValidator.java:141)
    at
org.apache.giraph.utils.ConfigurationUtils.parseArgs(ConfigurationUtils.java:214)
    at org.apache.giraph.GiraphRunner.run(GiraphRunner.java:74)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at org.apache.giraph.GiraphRunner.main(GiraphRunner.java:124)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)

maybe i've some problem of configuration or there is something that i've
not understood yes (a lot XD !); Could you give any suggestions?


2014-07-05 12:55 GMT+02:00 John Yost <so...@gmail.com>:

> Hi Carmen,
>
> I think the ChildError is being thrown by Hadoop but the root cause is
> this NPE within the BspServiceMaster.   Unfortunately, I have not seen this
> exception situation and am not sure why you are getting the NPE.  It may be
> a data issue.  Quick question--have you gotten the examples like
> SimpleShortestPath to run?  Please confirm, thanks.
>
> --John
>
>
> On Wed, Jul 2, 2014 at 9:53 AM, Carmen Manzulli <ca...@gmail.com>
> wrote:
>
>> i've red in the web that "error child" could mean this:
>> Possible reason: the memory allocated for the tasks trackers (sum of
>> mapred.*.child.java.opt in mapred-site.xml) is more than the nodes actual
>> memory .
>>
>>
>> 2014-07-02 15:52 GMT+02:00 Carmen Manzulli <ca...@gmail.com>:
>>
>> ok course :) !
>>>
>>> java.lang.Throwable: Child Error
>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
>>> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>>>
>>> and from the command line:
>>>
>>>
>>> /../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-3.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/l
ocal/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.library.path=/usr/local/hadoop/libexec/../lib/native/Linux-amd64-64:/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work
>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work/tmp
>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.name=Linux
>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.arch=amd64
>>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.version=3.11.0-24-generic
>>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.name=hduser
>>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.home=/home/hduser
>>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.dir=/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work
>>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Initiating client connection, connectString=carmen-HP-Pavilion-Sleekbook-15:22181 sessionTimeout=60000 watcher=org.apache.giraph.master.BspServiceMaster@465962c4
>>> 2014-07-02 15:49:17,509 INFO org.apache.zookeeper.ClientCnxn: Opening socket connection to server carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181. Will not attempt to authenticate using SASL (unknown error)
>>> 2014-07-02 15:49:17,509 INFO org.apache.zookeeper.ClientCnxn: Socket connection established to carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181, initiating session
>>> 2014-07-02 15:49:17,515 INFO org.apache.zookeeper.ClientCnxn: Session establishment complete on server carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181, sessionid = 0x146f756106b0001, negotiated timeout = 600000
>>> 2014-07-02 15:49:17,516 INFO org.apache.giraph.bsp.BspService: process: Asynchronous connection complete.
>>> 2014-07-02 15:49:17,530 INFO org.apache.giraph.graph.GraphTaskManager: map: No need to do anything when not a worker
>>> 2014-07-02 15:49:17,530 INFO org.apache.giraph.graph.GraphTaskManager: cleanup: Starting for MASTER_ZOOKEEPER_ONLY
>>> 2014-07-02 15:49:17,561 INFO org.apache.giraph.bsp.BspService: getJobState: Job state already exists (/_hadoopBsp/job_201407021315_0003/_masterJobState)
>>> 2014-07-02 15:49:17,568 INFO org.apache.giraph.master.BspServiceMaster: becomeMaster: First child is '/_hadoopBsp/job_201407021315_0003/_masterElectionDir/carmen-HP-Pavilion-Sleekbook-15_00000000000' and my bid is '/_hadoopBsp/job_201407021315_0003/_masterElectionDir/carmen-HP-Pavilion-Sleekbook-15_00000000000'
>>> 2014-07-02 15:49:17,570 INFO org.apache.giraph.bsp.BspService: getApplicationAttempt: Node /_hadoopBsp/job_201407021315_0003/_applicationAttemptsDir already exists!
>>> 2014-07-02 15:49:17,625 INFO org.apache.giraph.comm.netty.NettyServer: NettyServer: Using execution group with 8 threads for requestFrameDecoder.
>>> 2014-07-02 15:49:17,674 INFO org.apache.giraph.comm.netty.NettyServer: start: Started server communication server: carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:30000 with up to 16 threads on bind attempt 0 with sendBufferSize = 32768 receiveBufferSize = 524288
>>> 2014-07-02 15:49:17,679 INFO org.apache.giraph.comm.netty.NettyClient: NettyClient: Using execution handler with 8 threads after request-encoder.
>>> 2014-07-02 15:49:17,682 INFO org.apache.giraph.master.BspServiceMaster: becomeMaster: I am now the master!
>>> 2014-07-02 15:49:17,684 INFO org.apache.giraph.bsp.BspService: getApplicationAttempt: Node /_hadoopBsp/job_201407021315_0003/_applicationAttemptsDir already exists!
>>> 2014-07-02 15:49:17,717 ERROR org.apache.giraph.master.MasterThread: masterThread: Master algorithm failed with NullPointerException
>>> java.lang.NullPointerException
>>> 	at org.apache.giraph.master.BspServiceMaster.generateInputSplits(BspServiceMaster.java:330)
>>> 	at org.apache.giraph.master.BspServiceMaster.createInputSplits(BspServiceMaster.java:619)
>>> 	at org.apache.giraph.master.BspServiceMaster.createVertexInputSplits(BspServiceMaster.java:686)
>>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:108)
>>> 2014-07-02 15:49:17,718 FATAL org.apache.giraph.graph.GraphMapper: uncaughtException: OverrideExceptionHandler on thread org.apache.giraph.master.MasterThread, msg = java.lang.NullPointerException, exiting...
>>> java.lang.IllegalStateException: java.lang.NullPointerException
>>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:193)
>>> Caused by: java.lang.NullPointerException
>>> 	at org.apache.giraph.master.BspServiceMaster.generateInputSplits(BspServiceMaster.java:330)
>>> 	at org.apache.giraph.master.BspServiceMaster.createInputSplits(BspServiceMaster.java:619)
>>> 	at org.apache.giraph.master.BspServiceMaster.createVertexInputSplits(BspServiceMaster.java:686)
>>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:108)
>>> 2014-07-02 15:49:17,722 INFO org.apache.giraph.zk.ZooKeeperManager: run: Shutdown hook started.
>>> 2014-07-02 15:49:17,727 WARN org.apache.giraph.zk.ZooKeeperManager: onlineZooKeeperServers: Forced a shutdown hook kill of the ZooKeeper process.
>>> 2014-07-02 15:49:18,049 INFO org.apache.zookeeper.ClientCnxn: Unable to read additional data from server sessionid 0x146f756106b0001, likely server has closed socket, closing socket connection and attempting reconnect
>>> 2014-07-02 15:49:18,050 INFO org.apache.giraph.zk.ZooKeeperManager: onlineZooKeeperServers: ZooKeeper process exited with 143 (note that 143 typically means killed).
>>>
>>>
>>>
>>>
>>> 2014-07-02 13:52 GMT+02:00 John Yost <so...@gmail.com>:
>>>
>>> Hi Carmen,
>>>>
>>>> Please post more of the exception stack trace, not enough here for me
>>>> to figure anything out. :)
>>>>
>>>> Thanks
>>>>
>>>> --John
>>>>
>>>>
>>>> On Wed, Jul 2, 2014 at 7:33 AM, <so...@gmail.com> wrote:
>>>>
>>>>> Hi Carmen,
>>>>>
>>>>> Glad that one problem is fixed, and I can take a look at this one as
>>>>> well.
>>>>>
>>>>> --John
>>>>>
>>>>> Sent from my iPhone
>>>>>
>>>>> On Jul 2, 2014, at 6:50 AM, Carmen Manzulli <ca...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>
>>>>> ok; i've done what you have told me...but now i've got this problem..
>>>>>
>>>>> ava.lang.Throwable: Child Error
>>>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
>>>>> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>>>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>>>>>
>>>>> this is my Computation code:
>>>>> import org.apache.giraph.GiraphRunner;
>>>>> import org.apache.giraph.graph.BasicComputation;
>>>>> import org.apache.giraph.graph.Vertex;
>>>>> import org.apache.giraph.edge.Edge;
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> import org.apache.hadoop.io.Text;
>>>>> import org.apache.hadoop.io.NullWritable;
>>>>> import org.apache.hadoop.util.ToolRunner;
>>>>>
>>>>>
>>>>>
>>>>> public class SimpleSelectionComputation extends BasicComputation<Text,NullWritable,Text,NullWritable> {
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> 	
>>>>> @Override
>>>>> public void compute(Vertex<Text,NullWritable,Text> vertex,Iterable<NullWritable> messages){
>>>>> 	
>>>>> 	
>>>>> 	Text source = new Text("http://dbpedia.org/resource/1040s");
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> 	
>>>>> 	if (getSuperstep()==0)
>>>>> 	{
>>>>> 		if(vertex.getId()==source)
>>>>> 		{
>>>>> 			System.out.println("il soggetto "+vertex.getId()+" ha i seguenti predicati e oggetti:");
>>>>> 			for(Edge<Text,Text> e : vertex.getEdges())
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> 			{
>>>>> 				System.out.println(e.getValue()+"\t"+e.getTargetVertexId());
>>>>> 			}
>>>>> 		}
>>>>> 		vertex.voteToHalt();
>>>>> 	}
>>>>> 	
>>>>> }
>>>>>
>>>>> public static void main(String[] args) throws Exception {
>>>>>     System.exit(ToolRunner.run(new GiraphRunner(), args));
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>   }
>>>>>
>>>>> 	
>>>>> }
>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Couldn't instantiate

Posted by John Yost <so...@gmail.com>.
Hi Carmen,

I think the ChildError is being thrown by Hadoop but the root cause is this
NPE within the BspServiceMaster.   Unfortunately, I have not seen this
exception situation and am not sure why you are getting the NPE.  It may be
a data issue.  Quick question--have you gotten the examples like
SimpleShortestPath to run?  Please confirm, thanks.

--John


On Wed, Jul 2, 2014 at 9:53 AM, Carmen Manzulli <ca...@gmail.com>
wrote:

> I've read on the web that "Child Error" can mean the following:
> possible reason: the memory allocated for the task trackers (the sum of the
> mapred.*.child.java.opts settings in mapred-site.xml) exceeds the node's
> actual memory.
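If over-allocated task memory is the suspect, the relevant Hadoop 1.x setting is mapred.child.java.opts in mapred-site.xml. A fragment like the following caps the per-task JVM heap (the 512 MB value is illustrative, not a recommendation for this machine):

```xml
<!-- mapred-site.xml: illustrative per-task JVM heap cap (Hadoop 1.x).
     Size it so that (map slots + reduce slots) * heap fits in the
     node's physical RAM. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
```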
>
>
> 2014-07-02 15:52 GMT+02:00 Carmen Manzulli <ca...@gmail.com>:
>
> Of course :)!
>>
>> java.lang.Throwable: Child Error
>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
>> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>>
>> and from the command line:
>>
>>
>> /../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-3.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/lo
cal/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.library.path=/usr/local/hadoop/libexec/../lib/native/Linux-amd64-64:/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work
>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work/tmp
>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.name=Linux
>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.arch=amd64
>> 2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client environment:os.version=3.11.0-24-generic
>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.name=hduser
>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.home=/home/hduser
>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client environment:user.dir=/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_000000_0/work
>> 2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Initiating client connection, connectString=carmen-HP-Pavilion-Sleekbook-15:22181 sessionTimeout=60000 watcher=org.apache.giraph.master.BspServiceMaster@465962c4
>> 2014-07-02 15:49:17,509 INFO org.apache.zookeeper.ClientCnxn: Opening socket connection to server carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181. Will not attempt to authenticate using SASL (unknown error)
>> 2014-07-02 15:49:17,509 INFO org.apache.zookeeper.ClientCnxn: Socket connection established to carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181, initiating session
>> 2014-07-02 15:49:17,515 INFO org.apache.zookeeper.ClientCnxn: Session establishment complete on server carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181, sessionid = 0x146f756106b0001, negotiated timeout = 600000
>> 2014-07-02 15:49:17,516 INFO org.apache.giraph.bsp.BspService: process: Asynchronous connection complete.
>> 2014-07-02 15:49:17,530 INFO org.apache.giraph.graph.GraphTaskManager: map: No need to do anything when not a worker
>> 2014-07-02 15:49:17,530 INFO org.apache.giraph.graph.GraphTaskManager: cleanup: Starting for MASTER_ZOOKEEPER_ONLY
>> 2014-07-02 15:49:17,561 INFO org.apache.giraph.bsp.BspService: getJobState: Job state already exists (/_hadoopBsp/job_201407021315_0003/_masterJobState)
>> 2014-07-02 15:49:17,568 INFO org.apache.giraph.master.BspServiceMaster: becomeMaster: First child is '/_hadoopBsp/job_201407021315_0003/_masterElectionDir/carmen-HP-Pavilion-Sleekbook-15_00000000000' and my bid is '/_hadoopBsp/job_201407021315_0003/_masterElectionDir/carmen-HP-Pavilion-Sleekbook-15_00000000000'
>> 2014-07-02 15:49:17,570 INFO org.apache.giraph.bsp.BspService: getApplicationAttempt: Node /_hadoopBsp/job_201407021315_0003/_applicationAttemptsDir already exists!
>> 2014-07-02 15:49:17,625 INFO org.apache.giraph.comm.netty.NettyServer: NettyServer: Using execution group with 8 threads for requestFrameDecoder.
>> 2014-07-02 15:49:17,674 INFO org.apache.giraph.comm.netty.NettyServer: start: Started server communication server: carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:30000 with up to 16 threads on bind attempt 0 with sendBufferSize = 32768 receiveBufferSize = 524288
>> 2014-07-02 15:49:17,679 INFO org.apache.giraph.comm.netty.NettyClient: NettyClient: Using execution handler with 8 threads after request-encoder.
>> 2014-07-02 15:49:17,682 INFO org.apache.giraph.master.BspServiceMaster: becomeMaster: I am now the master!
>> 2014-07-02 15:49:17,684 INFO org.apache.giraph.bsp.BspService: getApplicationAttempt: Node /_hadoopBsp/job_201407021315_0003/_applicationAttemptsDir already exists!
>> 2014-07-02 15:49:17,717 ERROR org.apache.giraph.master.MasterThread: masterThread: Master algorithm failed with NullPointerException
>> java.lang.NullPointerException
>> 	at org.apache.giraph.master.BspServiceMaster.generateInputSplits(BspServiceMaster.java:330)
>> 	at org.apache.giraph.master.BspServiceMaster.createInputSplits(BspServiceMaster.java:619)
>> 	at org.apache.giraph.master.BspServiceMaster.createVertexInputSplits(BspServiceMaster.java:686)
>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:108)
>> 2014-07-02 15:49:17,718 FATAL org.apache.giraph.graph.GraphMapper: uncaughtException: OverrideExceptionHandler on thread org.apache.giraph.master.MasterThread, msg = java.lang.NullPointerException, exiting...
>> java.lang.IllegalStateException: java.lang.NullPointerException
>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:193)
>> Caused by: java.lang.NullPointerException
>> 	at org.apache.giraph.master.BspServiceMaster.generateInputSplits(BspServiceMaster.java:330)
>> 	at org.apache.giraph.master.BspServiceMaster.createInputSplits(BspServiceMaster.java:619)
>> 	at org.apache.giraph.master.BspServiceMaster.createVertexInputSplits(BspServiceMaster.java:686)
>> 	at org.apache.giraph.master.MasterThread.run(MasterThread.java:108)
>> 2014-07-02 15:49:17,722 INFO org.apache.giraph.zk.ZooKeeperManager: run: Shutdown hook started.
>> 2014-07-02 15:49:17,727 WARN org.apache.giraph.zk.ZooKeeperManager: onlineZooKeeperServers: Forced a shutdown hook kill of the ZooKeeper process.
>> 2014-07-02 15:49:18,049 INFO org.apache.zookeeper.ClientCnxn: Unable to read additional data from server sessionid 0x146f756106b0001, likely server has closed socket, closing socket connection and attempting reconnect
>> 2014-07-02 15:49:18,050 INFO org.apache.giraph.zk.ZooKeeperManager: onlineZooKeeperServers: ZooKeeper process exited with 143 (note that 143 typically means killed).
>>
>>
>>
>>
>> 2014-07-02 13:52 GMT+02:00 John Yost <so...@gmail.com>:
>>
>> Hi Carmen,
>>>
>>> Please post more of the exception stack trace; there's not enough here for
>>> me to figure anything out. :)
>>>
>>> Thanks
>>>
>>> --John
>>>
>>>
>>> On Wed, Jul 2, 2014 at 7:33 AM, <so...@gmail.com> wrote:
>>>
>>>> Hi Carmen,
>>>>
>>>> Glad that one problem is fixed, and I can take a look at this one as
>>>> well.
>>>>
>>>> --John
>>>>
>>>> Sent from my iPhone
>>>>
>>>> On Jul 2, 2014, at 6:50 AM, Carmen Manzulli <ca...@gmail.com>
>>>> wrote:
>>>>
>>>>
>>>> OK, I've done what you told me, but now I've got this problem:
>>>>
>>>> java.lang.Throwable: Child Error
>>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
>>>> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>>>> 	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>>>>
>>>> this is my Computation code:
>>>> import org.apache.giraph.GiraphRunner;
>>>> import org.apache.giraph.graph.BasicComputation;
>>>> import org.apache.giraph.graph.Vertex;
>>>> import org.apache.giraph.edge.Edge;
>>>> import org.apache.hadoop.io.Text;
>>>> import org.apache.hadoop.io.NullWritable;
>>>> import org.apache.hadoop.util.ToolRunner;
>>>>
>>>> public class SimpleSelectionComputation extends BasicComputation<Text,NullWritable,Text,NullWritable> {
>>>>
>>>>     @Override
>>>>     public void compute(Vertex<Text,NullWritable,Text> vertex, Iterable<NullWritable> messages) {
>>>>
>>>>         Text source = new Text("http://dbpedia.org/resource/1040s");
>>>>
>>>>         if (getSuperstep() == 0) {
>>>>             // Text must be compared with equals(); == only tests reference identity.
>>>>             if (vertex.getId().equals(source)) {
>>>>                 System.out.println("subject " + vertex.getId() + " has the following predicates and objects:");
>>>>                 for (Edge<Text, Text> e : vertex.getEdges()) {
>>>>                     System.out.println(e.getValue() + "\t" + e.getTargetVertexId());
>>>>                 }
>>>>             }
>>>>             vertex.voteToHalt();
>>>>         }
>>>>     }
>>>>
>>>>     public static void main(String[] args) throws Exception {
>>>>         System.exit(ToolRunner.run(new GiraphRunner(), args));
>>>>     }
>>>> }
>>>>
>>>>
>>>
>>
>

Re: Couldn't instantiate

Posted by John Yost <so...@gmail.com>.
Hi Carmen,

Please post more of the exception stack trace; there is not enough here for me to
figure anything out. :)

Thanks

--John



Re: Couldn't instantiate

Posted by so...@gmail.com.
Hi Carmen,

Glad that one problem is fixed, and I can take a look at this one as well.

--John 

Sent from my iPhone


Re: Couldn't instantiate

Posted by Carmen Manzulli <ca...@gmail.com>.
OK, I've done what you told me, but now I've got this problem:

java.lang.Throwable: Child Error
	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
	at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

this is my Computation code:
import org.apache.giraph.GiraphRunner;
import org.apache.giraph.graph.BasicComputation;
import org.apache.giraph.graph.Vertex;
import org.apache.giraph.edge.Edge;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.util.ToolRunner;


public class SimpleSelectionComputation extends
    BasicComputation<Text, NullWritable, Text, NullWritable> {

  @Override
  public void compute(Vertex<Text, NullWritable, Text> vertex,
      Iterable<NullWritable> messages) {

    Text source = new Text("http://dbpedia.org/resource/1040s");

    if (getSuperstep() == 0) {
      // Compare Text contents with equals(); == only compares references.
      if (vertex.getId().equals(source)) {
        System.out.println("subject " + vertex.getId()
            + " has the following predicates and objects:");
        for (Edge<Text, Text> e : vertex.getEdges()) {
          System.out.println(e.getValue() + "\t" + e.getTargetVertexId());
        }
      }
      vertex.voteToHalt();
    }
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new GiraphRunner(), args));
  }
}
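A note on the id comparison in compute(): vertex ids are Hadoop Text objects, and `==` on objects tests reference identity, so a selection written as `vertex.getId() == source` silently matches nothing when the id was deserialized from input; `equals()` compares contents. A minimal stdlib sketch, using `String` as a stand-in for `Text`:

```java
public class EqualsVsIdentity {
    // Two distinct objects holding the same vertex id, as when one Text is
    // built in compute() and the other is read from the input split.
    static boolean[] compare() {
        String built = new String("http://dbpedia.org/resource/1040s");
        String read  = new String("http://dbpedia.org/resource/1040s");
        // {reference identity, content equality}
        return new boolean[] { built == read, built.equals(read) };
    }

    public static void main(String[] args) {
        boolean[] r = compare();
        System.out.println("== : " + r[0] + ", equals: " + r[1]);
    }
}
```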

Re: Couldn't instantiate

Posted by Carmen Manzulli <ca...@gmail.com>.
OK! So this way I also have to implement the checkInputSpecs and getSplits
methods, right?
Thank you very much :)




Re: Couldn't instantiate

Posted by so...@gmail.com.
Ah, okay, I see the problem.  Abstract classes cannot be instantiated in Java.  Ensure you've implemented all of the abstract methods, remove the abstract keyword from your class definition, and you should be ready to roll.
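
The root cause in the stack trace (`Caused by: java.lang.InstantiationException` out of `Class.newInstance`) is exactly what reflection produces for an abstract class. A self-contained sketch of the failure and the fix (class names here are hypothetical stand-ins, not Giraph types):

```java
public class AbstractInstantiation {
    // Stand-in for an input format accidentally declared abstract.
    static abstract class AbstractFormat { }
    // The fix: concrete, with an accessible no-arg constructor.
    static class ConcreteFormat extends AbstractFormat { }

    // Instantiate reflectively, roughly the way a framework would,
    // and report the failure class (or "ok" on success).
    static String tryInstantiate(Class<?> clazz) {
        try {
            clazz.getDeclaredConstructor().newInstance();
            return "ok";
        } catch (ReflectiveOperationException e) {
            return e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryInstantiate(AbstractFormat.class));
        System.out.println(tryInstantiate(ConcreteFormat.class));
    }
}
```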

-John

Sent from my iPhone


Re: Couldn't instantiate

Posted by Carmen Manzulli <ca...@gmail.com>.
Hi John,
yes, I've tried to insert a no-arguments constructor, but the problem seems
to be something else. This is my code: an input format with a VertexReader
that reads RDF triples.

import java.io.IOException;
import java.util.ArrayList;
import java.lang.InterruptedException;

import org.apache.giraph.graph.Vertex;
import org.apache.giraph.edge.Edge;
import org.apache.giraph.edge.EdgeFactory;
import org.apache.giraph.io.VertexReader;
import org.apache.giraph.io.VertexInputFormat;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.lib.input.LineRecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;



public abstract class SimpleRDFVertexInputFormat
        extends VertexInputFormat<Text, NullWritable, Text> {

    public SimpleRDFVertexInputFormat() {
        super();
    }

    public VertexReader<Text, NullWritable, Text> createVertexReader(
            InputSplit split, TaskAttemptContext context) throws IOException {
        return new SimpleRDFVertexReader();
    }

    public class SimpleRDFVertexReader
            extends VertexReader<Text, NullWritable, Text> {

        private RecordReader<LongWritable, Text> lineRecordReader;
        private TaskAttemptContext context;

        @Override
        public void initialize(InputSplit inputsplit, TaskAttemptContext context)
                throws IOException, InterruptedException {
            this.setContext(context);
            lineRecordReader = new LineRecordReader();
            lineRecordReader.initialize(inputsplit, context);
        }

        @Override
        public final boolean nextVertex() throws IOException, InterruptedException {
            return lineRecordReader.nextKeyValue();
        }

        @Override
        public final Vertex<Text, NullWritable, Text> getCurrentVertex()
                throws IOException, InterruptedException {
            Text line = lineRecordReader.getCurrentValue();
            Vertex<Text, NullWritable, Text> vertex = getConf().createVertex();
            String[] elements = line.toString().split(" ");
            Text firstele = new Text(elements[0]);
            int len1 = firstele.getLength();
            Text subject = new Text(firstele.toString().substring(1, len1 - 1));
            Text secondele = new Text(elements[1]);
            int len2 = secondele.getLength();
            Text predicate = new Text(secondele.toString().substring(1, len2 - 1));
            Text object = new Text(elements[2]);
            ArrayList<Edge<Text, Text>> edge = new ArrayList<Edge<Text, Text>>();
            edge.add(EdgeFactory.create(object, predicate));
            vertex.initialize(subject, null, edge);
            return vertex;
        }

        @Override
        public void close() throws IOException {
            lineRecordReader.close();
        }

        @Override
        public float getProgress() throws IOException, InterruptedException {
            return lineRecordReader.getProgress();
        }

        public TaskAttemptContext getContext() {
            return context;
        }

        public void setContext(TaskAttemptContext context) {
            this.context = context;
        }
    }

}
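
The bracket-stripping in getCurrentVertex() can be checked in isolation. A pure-Java sketch of that same parsing step (illustrative only: a plain whitespace split breaks on literals containing spaces, so real N-Triples input wants a proper parser):

```java
public class TripleParse {
    // Mirrors the parsing in getCurrentVertex(): split a line on spaces and
    // strip the angle brackets from the subject and predicate; the object
    // element is kept as-is, as in the reader above.
    static String[] parse(String line) {
        String[] e = line.split(" ");
        String subject   = e[0].substring(1, e[0].length() - 1);
        String predicate = e[1].substring(1, e[1].length() - 1);
        return new String[] { subject, predicate, e[2] };
    }

    public static void main(String[] args) {
        String[] t = parse(
            "<http://dbpedia.org/resource/1040s> "
            + "<http://www.w3.org/2002/07/owl#sameAs> "
            + "<http://dbpedia.org/resource/1040s>");
        System.out.println(t[0] + " | " + t[1] + " | " + t[2]);
    }
}
```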


2014-07-01 4:23 GMT+02:00 John Yost <so...@gmail.com>:

> Hi Carmen,
>
> Question--did you only define an arguments constructor?  If so, I think
> you are getting this because you did not define a no-arguments constructor
> with public visibility.  If this is not the case, I recommend posting your
> source code and I will be happy to help.
>
> --John
>
>
> On Mon, Jun 30, 2014 at 9:38 AM, Carmen Manzulli <carmenmanzulli@gmail.com
> > wrote:
>
>> Hi,
>>
>> I'm trying to run a selectionComputation with my own code for VertexInputFormat but giraph' job starts to work and then fails with:
>>
>>
>>
>>
>> java.lang.IllegalStateException: run: Caught an unrecoverable exception newInstance: Couldn't instantiate sisinflab.SimpleRDFVertexInputFormat
>> 	at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:101)
>> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
>> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>> 	at java.security.AccessController.doPrivileged(Native Method)
>> 	at javax.security.auth.Subject.doAs(Subject.java:415)
>> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>> 	at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> Caused by: java.lang.IllegalStateException: newInstance: Couldn't instantiate sisinflab.SimpleRDFVertexInputFormat
>> 	at org.apache.giraph.utils.ReflectionUtils.newInstance(ReflectionUtils.java:105)
>> 	at org.apache.giraph.conf.ImmutableClassesGiraphConfiguration.createVertexInputFormat(ImmutableClassesGiraphConfiguration.java:235)
>> 	at org.apache.giraph.conf.ImmutableClassesGiraphConfiguration.createWrappedVertexInputFormat(ImmutableClassesGiraphConfiguration.java:246)
>> 	at org.apache.giraph.graph.GraphTaskManager.checkInput(GraphTaskManager.java:171)
>> 	at org.apache.giraph.graph.GraphTaskManager.setup(GraphTaskManager.java:207)
>> 	at org.apache.giraph.graph.GraphMapper.setup(GraphMapper.java:59)
>> 	at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:89)
>> 	... 7 more
>> Caused by: java.lang.InstantiationException
>> 	at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
>> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>> 	at java.lang.Class.newInstance(Class.java:374)
>> 	at org.apache.giraph.utils.ReflectionUtils.newInstance(ReflectionUtils.java:103)
>> 	... 13 more
>>
>>
>> what does it mean? where is the problem?
>>
>> Who can help me?
>>
>> Carmen
>>
>>
>