Posted to user@nutch.apache.org by Kshitij Shukla <ks...@cisinlabs.com> on 2016/01/30 07:19:49 UTC

[CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach

Rather than using the src and compiling it on the system, try using the binaries. hth
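
For example, assuming the binary hadoop-2.x tarball has been extracted to /usr/local/hadoop (an illustrative path, adjust to your layout), the .bashrc entries usually end up looking roughly like this, because the binary distribution ships bin/, sbin/ and etc/hadoop at its top level:

  # rough sketch: binary tarball (not the source tree) extracted to /usr/local/hadoop
  export JAVA_HOME=/usr/local/jdk1.8.0_71
  export HADOOP_INSTALL=/usr/local/hadoop
  export HADOOP_CONF_DIR=$HADOOP_INSTALL/etc/hadoop
  export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin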

On Friday 29 January 2016 07:58 PM, Dan.Wu@scb.se wrote:
> Hi Lewis,
> Many thanks for your answer. It is not easy for a beginner to download so much stuff.
> I am struggling with Java and Hadoop.
> Now I have downloaded jdk1.8.0_71 from Oracle; it seems OK.
> But hadoop refuses to run properly! When I run hadoop
> the terminal shows:
> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27: /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
>
>
> I have downloaded hadoop 2.7.2 and followed the instruction on https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Fully-Distributed_Operation
>
> It seems that I cannot find etc/hadoop as the instructions describe. In the .bashrc, the paths are defined as
> export JAVA_HOME=/usr/local/jdk1.8.0_71/
> export PATH=$PATH:$JAVA_HOME/bin
>
> export HADOOP_INSTALL=/usr/local/hadoop
>
> #export PATH=$PATH:$HADOOP_INSTALL/bin #no bin file are found
> #export PATH=$PATH:$HADDOP_INSTALL/sbin #not sbin file are found
> export HADOOP_MAPRED_HOME=$HADOOP_INSTALL/hadoop-mapreduce-project/bin
>
> export HADOOP_COMMON_HOME=$HADOOP_INSTALL/hadoop-common-project/hadoop-common/src/main/bin
> export HADOOP_HDFS_HOME=$HADOOP_INSTALL/hadoop-hdfs-project/hadoop-hdfs/src/main/bin
> export YARN_HOME=$HADOOP_INSTALL/hadoop-yarn-project/hadoop-yarn/bin
> export HADOOP_COMMOM_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
> export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
>   
> The problem is that there are no bin and sbin folders under hadoop; the bin folders are scattered across different subfolders.
> When I go to the $HADOOP_COMMON_HOME and run hadoop, the terminal shows
>
> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27: /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
> Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
>    CLASSNAME            run the class named CLASSNAME
>   or
>    where COMMAND is one of:
>    fs                   run a generic filesystem user client
>    version              print the version
>    jar <jar>            run a jar file
>                         note: please use "yarn jar" to launch
>                               YARN applications, not this command.
>    checknative [-a|-h]  check native hadoop and compression libraries availability
>    distcp <srcurl> <desturl> copy file or directories recursively
>    archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
>    classpath            prints the class path needed to get the
>                         Hadoop jar and the required libraries
>    credential           interact with credential providers
>    daemonlog            get/set the log level for each daemon
>    trace                view and modify Hadoop tracing settings
>
>   Sorry to turn this into a hadoop problem! Just to give you some feedback, and in case anyone has an answer... Thanks!
>
> /Dan
>
> ________________________________________
> From: Lewis John Mcgibbney [lewis.mcgibbney@gmail.com]
> Sent: 28 January 2016 00:49
> To: user@nutch.apache.org
> Subject: Re: configuration nutch with hbase and elasticserach
>
> Hi Dan,
>
> Which version of Nutch 2.X are you using? The document you've highlighted
> below states Nutch 2.3 with gora-hbase 0.5.
> Both of these are old and I would strongly advise you to use Nutch 2.3.1
> (just released last week) along with one of the following backends
>
> The recommended Gora backends for this Nutch release are
>
>     - Apache Avro 1.7.6
>     - Apache Hadoop 1.2.1 and 2.5.2
>     - Apache HBase 0.98.8-hadoop2 (although also tested with 1.X)
>     - Apache Cassandra 2.0.2
>     - Apache Solr 4.10.3
>     - MongoDB 2.6.X
>     - Apache Accumulo 1.5.1
>     - Apache Spark 1.4.1
>
> Please also note that you should upgrade your JDK to 1.7.
> Thanks
>
> On Wed, Jan 27, 2016 at 3:21 PM, <us...@nutch.apache.org> wrote:
>
>> Hi,
>> I am a beginner with nutch and everything. Can anyone help me with the
>> configuration?
>>
>> I followed the instructions in
>> https://gist.github.com/xrstf/b48a970098a8e76943b9
>>
>> It seems that my build process complains about the indexer-elastic plugin. Many
>> thanks in advance!
>>
>> compile:
>>       [echo] Compiling plugin: indexer-elastic
>>      [javac] Compiling 3 source files to
>> /home/dan/apache-nutch-2.3/build/indexer-elastic/classes
>>      [javac] warning: [options] bootstrap class path not set in conjunction
>> with -source 1.6
>>      [javac]
>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:108:
>> error: no suitable constructor found for
>> InetSocketTransportAddress(String,int)
>>      [javac]           .addTransportAddress(new
>> InetSocketTransportAddress(host, port));
>>      [javac]                                ^
>>      [javac]     constructor
>> InetSocketTransportAddress.InetSocketTransportAddress(InetSocketAddress) is
>> not applicable
>>      [javac]       (actual and formal argument lists differ in length)
>>      [javac]     constructor
>> InetSocketTransportAddress.InetSocketTransportAddress(InetAddress,int) is
>> not applicable
>>      [javac]       (actual argument String cannot be converted to
>> InetAddress by method invocation conversion)
>>      [javac]     constructor
>> InetSocketTransportAddress.InetSocketTransportAddress() is not applicable
>>      [javac]       (actual and formal argument lists differ in length)
>>      [javac]     constructor
>> InetSocketTransportAddress.InetSocketTransportAddress(StreamInput) is not
>> applicable
>>      [javac]       (actual and formal argument lists differ in length)
>>      [javac]
>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:107:
>> error: constructor TransportClient in class TransportClient cannot be
>> applied to given types;
>>      [javac]       client = new TransportClient(settings)
>>      [javac]                ^
>>      [javac]   required: Injector
>>      [javac]   found: Settings
>>      [javac]   reason: actual argument Settings cannot be converted to
>> Injector by method invocation conversion
>>      [javac] 2 errors
>>      [javac] 1 warning
>>
>> BUILD FAILED
>> /home/dan/apache-nutch-2.3/build.xml:113: The following error occurred
>> while executing this line:
>> /home/dan/apache-nutch-2.3/src/plugin/build.xml:35: The following error
>> occurred while executing this line:
>> /home/dan/apache-nutch-2.3/src/plugin/build-plugin.xml:117: Compile
>> failed; see the compiler error output for details.
>>
>>
>>


-- 

Please let me know if you have any questions, concerns or updates.
Have a great day ahead :)

Thanks and Regards,

Kshitij Shukla
Software developer


[CIS-CMMI-3] Re: SV: [CIS-CMMI-3] Re: SV: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach

Posted by Kshitij Shukla <ks...@cisinlabs.com>.
@Dan, simply copy the jars from your $HBASE_HOME/lib to the $NUTCH_ROOT/lib 
directory, then run "ant clean" and "ant runtime". This should fix the
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
error.
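
For example (a minimal sketch; $HBASE_HOME and $NUTCH_ROOT stand for wherever your HBase install and Nutch source checkout live):

  # make the HBase client classes visible to the Nutch build
  cp $HBASE_HOME/lib/*.jar $NUTCH_ROOT/lib/
  # rebuild the Nutch runtime so the jars land on its classpath
  cd $NUTCH_ROOT
  ant clean
  ant runtime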
hth

*K

On Wednesday 03 February 2016 09:18 PM, Dan.Wu@scb.se wrote:
> Thanks, it helped.
> Hadoop 2.5.2 and Hbase-0.98.8-hadoop2 are both installed in standalone mode. And I can run them separately.
> However, when I run nutch inject, I ran into the problem like
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
> I am not sure how to configure them correctly. Any suggestions?
> Regards/Dan
>
>
> -----Original message-----
> From: Kshitij Shukla [mailto:kshitij.s@cisinlabs.com]
> Sent: 2 February 2016 12:29
> To: user@nutch.apache.org
> Subject: [CIS-CMMI-3] Re: SV: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach
>
> Actually from binaries I meant the precompiled source like if we think of hadoop you will find source and binaries.. refer this link http://hadoop.apache.org/releases.html
> So hadoop 2.6.2 binary will be a good candidate to start with. After download you can simply extract with tar command in the dir you want.
> There are several things to configure. refer http://hadoop.apache.org/docs/r2.6.2/hadoop-project-dist/hadoop-common/SingleCluster.html
>
> hth = Hope That Help ;)
>
> *Kshitij
>
> On Tuesday 02 February 2016 01:34 PM, Dan.Wu@scb.se wrote:
>> Hi Kshitij
>> Could you please give me more detailed instruction? How to use binaires.hth?
>> I downloaded the package and extract it with tar -xvpf, in the extracted package, I cannot find bin and sbin folders.
>> Thanks!
>>
>> -----Original message-----
>> From: Kshitij Shukla [mailto:kshitij.s@cisinlabs.com]
>> Sent: 30 January 2016 07:20
>> To: user@nutch.apache.org
>> Subject: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach
>>
>> rather then using src and compile it on system, try using the
>> binaries. hth
>>
>> On Friday 29 January 2016 07:58 PM, Dan.Wu@scb.se wrote:
>>> Hi Lewis,
>>> Many thanks for your answer. It is not easy for a beginner to download so many staffs.
>>> I am struggling with java and hadoop
>>> Now I have downloaded jdk1.8.0_71 from Oracle, it seems ok.
>>> But hadoop refuses to run properly! When I run hadoop the terminal
>>> shows:
>>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27:
>>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
>>>
>>>
>>> I have downloaded hadoop 2.7.2 and followed the instruction on
>>> https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Fully-Distributed_Operation
>>>
>>> It seems that I cannot find etc/hadoop as the instruction described.
>>> In the .bashrc, the paths are defined as
>>> export JAVA_HOME=/usr/local/jdk1.8.0_71/
>>> export PATH=$PATH:$JAVA_HOME/bin
>>>
>>> export HADOOP_INSTALL=/usr/local/hadoop
>>>
>>> #export PATH=$PATH:$HADOOP_INSTALL/bin #no bin file are found
>>> #export PATH=$PATH:$HADDOP_INSTALL/sbin #not sbin file are found
>>> export HADOOP_MAPRED_HOME=$HADOOP_INSTALL/hadoop-mapreduce-project/bin
>>>
>>> export HADOOP_COMMON_HOME=$HADOOP_INSTALL/hadoop-common-project/hadoop-common/src/main/bin
>>> export HADOOP_HDFS_HOME=$HADOOP_INSTALL/hadoop-hdfs-project/hadoop-hdfs/src/main/bin
>>> export YARN_HOME=$HADOOP_INSTALL/hadoop-yarn-project/hadoop-yarn/bin
>>> export HADOOP_COMMOM_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
>>> export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
>>>     
>>> The problem is that there are not bin and sbin folders under hadoop, the bin folders are distributed under different folders.
>>> When I go to the $HADOOP_COMMON_HOME and run hadoop, the terminal
>>> shows
>>>
>>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27:
>>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
>>> Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
>>>      CLASSNAME            run the class named CLASSNAME
>>>     or
>>>      where COMMAND is one of:
>>>      fs                   run a generic filesystem user client
>>>      version              print the version
>>>      jar <jar>            run a jar file
>>>                           note: please use "yarn jar" to launch
>>>                                 YARN applications, not this command.
>>>      checknative [-a|-h]  check native hadoop and compression libraries availability
>>>      distcp <srcurl> <desturl> copy file or directories recursively
>>>      archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
>>>      classpath            prints the class path needed to get the
>>>      credential           interact with credential providers
>>>                           Hadoop jar and the required libraries
>>>      daemonlog            get/set the log level for each daemon
>>>      trace                view and modify Hadoop tracing settings
>>>
>>>     Sorry to turn this into a hadoop problem! Just to give you a feedback and in case anyone has an answer... Thanks!
>>>
>>> /Dan
>>>
>>> ________________________________________
>>> From: Lewis John Mcgibbney [lewis.mcgibbney@gmail.com]
>>> Sent: 28 January 2016 00:49
>>> To: user@nutch.apache.org
>>> Subject: Re: configuration nutch with hbase and elasticserach
>>>
>>> Hi Dan,
>>>
>>> Which version of Nutch 2.X are you using? The document you've
>>> highlighted below stated Nutch 2.3 with gora-hbase 0.5.
>>> Both of these are old and I would strongly advise you to use Nutch
>>> 2.3.1 (just released last week) along with one of the following
>>> backends
>>>
>>> The recommended Gora backends for this Nutch release are
>>>
>>>       - Apache Avro 1.7.6
>>>       - Apache Hadoop 1.2.1 and 2.5.2
>>>       - Apache HBase 0.98.8-hadoop2 (although also tested with 1.X)
>>>       - Apache Cassandra 2.0.2
>>>       - Apache Solr 4.10.3
>>>       - MongoDB 2.6.X
>>>       - Apache Accumlo 1.5.1
>>>       - Apache Spark 1.4.1
>>>
>>> Please also note that you shoudl upgrade your JDK to 1.7.
>>> Thanks
>>>
>>> On Wed, Jan 27, 2016 at 3:21 PM, <us...@nutch.apache.org> wrote:
>>>
>>>> Hi,
>>>> I am a beginner with nutch and everything. Can anyone help me with
>>>> the configuration?
>>>>
>>>> I follower the instruction in
>>>> https://gist.github.com/xrstf/b48a970098a8e76943b9
>>>>
>>>> It seems that my building process complain plugin: indexer-elastic?
>>>> My thanks in advance!!!
>>>>
>>>> compile:
>>>>         [echo] Compiling plugin: indexer-elastic
>>>>        [javac] Compiling 3 source files to
>>>> /home/dan/apache-nutch-2.3/build/indexer-elastic/classes
>>>>        [javac] warning: [options] bootstrap class path not set in
>>>> conjunction with -source 1.6
>>>>        [javac]
>>>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:108:
>>>> error: no suitable constructor found for
>>>> InetSocketTransportAddress(String,int)
>>>>        [javac]           .addTransportAddress(new
>>>> InetSocketTransportAddress(host, port));
>>>>        [javac]                                ^
>>>>        [javac]     constructor
>>>> InetSocketTransportAddress.InetSocketTransportAddress(InetSocketAddress) is not applicable
>>>>        [javac]       (actual and formal argument lists differ in length)
>>>>        [javac]     constructor
>>>> InetSocketTransportAddress.InetSocketTransportAddress(InetAddress,int) is not applicable
>>>>        [javac]       (actual argument String cannot be converted to
>>>> InetAddress by method invocation conversion)
>>>>        [javac]     constructor
>>>> InetSocketTransportAddress.InetSocketTransportAddress() is not applicable
>>>>        [javac]       (actual and formal argument lists differ in length)
>>>>        [javac]     constructor
>>>> InetSocketTransportAddress.InetSocketTransportAddress(StreamInput)
>>>> is not applicable
>>>>        [javac]       (actual and formal argument lists differ in length)
>>>>        [javac]
>>>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:107:
>>>> error: constructor TransportClient in class TransportClient cannot
>>>> be applied to given types;
>>>>        [javac]       client = new TransportClient(settings)
>>>>        [javac]                ^
>>>>        [javac]   required: Injector
>>>>        [javac]   found: Settings
>>>>        [javac]   reason: actual argument Settings cannot be converted to
>>>> Injector by method invocation conversion
>>>>        [javac] 2 errors
>>>>        [javac] 1 warning
>>>>
>>>> BUILD FAILED
>>>> /home/dan/apache-nutch-2.3/build.xml:113: The following error
>>>> occurred while executing this line:
>>>> /home/dan/apache-nutch-2.3/src/plugin/build.xml:35: The following
>>>> error occurred while executing this line:
>>>> /home/dan/apache-nutch-2.3/src/plugin/build-plugin.xml:117: Compile
>>>> failed; see the compiler error output for details.
>>>>
>>>>
>>>>
>



SV: [CIS-CMMI-3] Re: SV: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach

Posted by Da...@scb.se.
Thanks, it helped. 
Hadoop 2.5.2 and Hbase-0.98.8-hadoop2 are both installed in standalone mode. And I can run them separately.
However, when I run nutch inject, I run into a problem like
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
I am not sure how to configure them correctly. Any suggestions?
Regards/Dan 


-----Original message-----
From: Kshitij Shukla [mailto:kshitij.s@cisinlabs.com]
Sent: 2 February 2016 12:29
To: user@nutch.apache.org
Subject: [CIS-CMMI-3] Re: SV: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach

Actually from binaries I meant the precompiled source like if we think of hadoop you will find source and binaries.. refer this link http://hadoop.apache.org/releases.html
So hadoop 2.6.2 binary will be a good candidate to start with. After download you can simply extract with tar command in the dir you want. 
There are several things to configure. refer http://hadoop.apache.org/docs/r2.6.2/hadoop-project-dist/hadoop-common/SingleCluster.html 

hth = Hope That Help ;)

*Kshitij

On Tuesday 02 February 2016 01:34 PM, Dan.Wu@scb.se wrote:
> Hi Kshitij
> Could you please give me more detailed instruction? How to use binaires.hth?
> I downloaded the package and extract it with tar -xvpf, in the extracted package, I cannot find bin and sbin folders.
> Thanks!
>
> -----Original message-----
> From: Kshitij Shukla [mailto:kshitij.s@cisinlabs.com]
> Sent: 30 January 2016 07:20
> To: user@nutch.apache.org
> Subject: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach
>
> rather then using src and compile it on system, try using the 
> binaries. hth
>
> On Friday 29 January 2016 07:58 PM, Dan.Wu@scb.se wrote:
>> Hi Lewis,
>> Many thanks for your answer. It is not easy for a beginner to download so many staffs.
>> I am struggling with java and hadoop
>> Now I have downloaded jdk1.8.0_71 from Oracle, it seems ok.
>> But hadoop refuses to run properly! When I run hadoop the terminal
>> shows:
>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27:
>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
>>
>>
>> I have downloaded hadoop 2.7.2 and followed the instruction on 
>> https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Fully-Distributed_Operation
>>
>> It seems that I cannot find etc/hadoop as the instruction described.
>> In the .bashrc, the paths are defined as
>> export JAVA_HOME=/usr/local/jdk1.8.0_71/
>> export PATH=$PATH:$JAVA_HOME/bin
>>
>> export HADOOP_INSTALL=/usr/local/hadoop
>>
>> #export PATH=$PATH:$HADOOP_INSTALL/bin #no bin file are found
>> #export PATH=$PATH:$HADDOP_INSTALL/sbin #not sbin file are found
>> export HADOOP_MAPRED_HOME=$HADOOP_INSTALL/hadoop-mapreduce-project/bin
>>
>> export HADOOP_COMMON_HOME=$HADOOP_INSTALL/hadoop-common-project/hadoop-common/src/main/bin
>> export HADOOP_HDFS_HOME=$HADOOP_INSTALL/hadoop-hdfs-project/hadoop-hdfs/src/main/bin
>> export YARN_HOME=$HADOOP_INSTALL/hadoop-yarn-project/hadoop-yarn/bin
>> export HADOOP_COMMOM_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
>> export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
>>    
>> The problem is that there are not bin and sbin folders under hadoop, the bin folders are distributed under different folders.
>> When I go to the $HADOOP_COMMON_HOME and run hadoop, the terminal 
>> shows
>>
>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27:
>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
>> Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
>>     CLASSNAME            run the class named CLASSNAME
>>    or
>>     where COMMAND is one of:
>>     fs                   run a generic filesystem user client
>>     version              print the version
>>     jar <jar>            run a jar file
>>                          note: please use "yarn jar" to launch
>>                                YARN applications, not this command.
>>     checknative [-a|-h]  check native hadoop and compression libraries availability
>>     distcp <srcurl> <desturl> copy file or directories recursively
>>     archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
>>     classpath            prints the class path needed to get the
>>     credential           interact with credential providers
>>                          Hadoop jar and the required libraries
>>     daemonlog            get/set the log level for each daemon
>>     trace                view and modify Hadoop tracing settings
>>
>>    Sorry to turn this into a hadoop problem! Just to give you a feedback and in case anyone has an answer... Thanks!
>>
>> /Dan
>>
>> ________________________________________
>> From: Lewis John Mcgibbney [lewis.mcgibbney@gmail.com]
>> Sent: 28 January 2016 00:49
>> To: user@nutch.apache.org
>> Subject: Re: configuration nutch with hbase and elasticserach
>>
>> Hi Dan,
>>
>> Which version of Nutch 2.X are you using? The document you've 
>> highlighted below stated Nutch 2.3 with gora-hbase 0.5.
>> Both of these are old and I would strongly advise you to use Nutch
>> 2.3.1 (just released last week) along with one of the following 
>> backends
>>
>> The recommended Gora backends for this Nutch release are
>>
>>      - Apache Avro 1.7.6
>>      - Apache Hadoop 1.2.1 and 2.5.2
>>      - Apache HBase 0.98.8-hadoop2 (although also tested with 1.X)
>>      - Apache Cassandra 2.0.2
>>      - Apache Solr 4.10.3
>>      - MongoDB 2.6.X
>>      - Apache Accumlo 1.5.1
>>      - Apache Spark 1.4.1
>>
>> Please also note that you shoudl upgrade your JDK to 1.7.
>> Thanks
>>
>> On Wed, Jan 27, 2016 at 3:21 PM, <us...@nutch.apache.org> wrote:
>>
>>> Hi,
>>> I am a beginner with nutch and everything. Can anyone help me with 
>>> the configuration?
>>>
>>> I follower the instruction in
>>> https://gist.github.com/xrstf/b48a970098a8e76943b9
>>>
>>> It seems that my building process complain plugin: indexer-elastic?
>>> My thanks in advance!!!
>>>
>>> compile:
>>>        [echo] Compiling plugin: indexer-elastic
>>>       [javac] Compiling 3 source files to 
>>> /home/dan/apache-nutch-2.3/build/indexer-elastic/classes
>>>       [javac] warning: [options] bootstrap class path not set in 
>>> conjunction with -source 1.6
>>>       [javac]
>>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:108:
>>> error: no suitable constructor found for
>>> InetSocketTransportAddress(String,int)
>>>       [javac]           .addTransportAddress(new
>>> InetSocketTransportAddress(host, port));
>>>       [javac]                                ^
>>>       [javac]     constructor
>>> InetSocketTransportAddress.InetSocketTransportAddress(InetSocketAddress) is not applicable
>>>       [javac]       (actual and formal argument lists differ in length)
>>>       [javac]     constructor
>>> InetSocketTransportAddress.InetSocketTransportAddress(InetAddress,int) is not applicable
>>>       [javac]       (actual argument String cannot be converted to
>>> InetAddress by method invocation conversion)
>>>       [javac]     constructor
>>> InetSocketTransportAddress.InetSocketTransportAddress() is not applicable
>>>       [javac]       (actual and formal argument lists differ in length)
>>>       [javac]     constructor
>>> InetSocketTransportAddress.InetSocketTransportAddress(StreamInput) 
>>> is not applicable
>>>       [javac]       (actual and formal argument lists differ in length)
>>>       [javac]
>>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:107:
>>> error: constructor TransportClient in class TransportClient cannot 
>>> be applied to given types;
>>>       [javac]       client = new TransportClient(settings)
>>>       [javac]                ^
>>>       [javac]   required: Injector
>>>       [javac]   found: Settings
>>>       [javac]   reason: actual argument Settings cannot be converted to
>>> Injector by method invocation conversion
>>>       [javac] 2 errors
>>>       [javac] 1 warning
>>>
>>> BUILD FAILED
>>> /home/dan/apache-nutch-2.3/build.xml:113: The following error 
>>> occurred while executing this line:
>>> /home/dan/apache-nutch-2.3/src/plugin/build.xml:35: The following 
>>> error occurred while executing this line:
>>> /home/dan/apache-nutch-2.3/src/plugin/build-plugin.xml:117: Compile 
>>> failed; see the compiler error output for details.
>>>
>>>
>>>
>



[CIS-CMMI-3] Re: SV: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach

Posted by Kshitij Shukla <ks...@cisinlabs.com>.
Actually, by binaries I meant the precompiled packages: for Hadoop, for example, 
you will find both source and binary downloads; refer to this link 
http://hadoop.apache.org/releases.html
So the hadoop 2.6.2 binary will be a good candidate to start with. After 
downloading, you can simply extract it with the tar command in the directory you want. 
There are several things to configure; refer to 
http://hadoop.apache.org/docs/r2.6.2/hadoop-project-dist/hadoop-common/SingleCluster.html
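
For example, a rough sketch (the archive URL below is only illustrative, pick a mirror from the releases page above, and /usr/local is just one possible install location):

  # fetch and unpack the prebuilt binary tarball (not hadoop-2.6.2-src.tar.gz)
  cd /usr/local
  wget http://archive.apache.org/dist/hadoop/common/hadoop-2.6.2/hadoop-2.6.2.tar.gz
  tar -xzf hadoop-2.6.2.tar.gz
  ln -s hadoop-2.6.2 hadoop
  # the binary layout has bin/, sbin/ and etc/hadoop at the top level
  /usr/local/hadoop/bin/hadoop version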

hth = Hope That Helps ;)

*Kshitij

On Tuesday 02 February 2016 01:34 PM, Dan.Wu@scb.se wrote:
> Hi Kshitij
> Could you please give me more detailed instruction? How to use binaires.hth?
> I downloaded the package and extract it with tar -xvpf, in the extracted package, I cannot find bin and sbin folders.
> Thanks!
>
> -----Original message-----
> From: Kshitij Shukla [mailto:kshitij.s@cisinlabs.com]
> Sent: 30 January 2016 07:20
> To: user@nutch.apache.org
> Subject: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach
>
> rather then using src and compile it on system, try using the binaries. hth
>
> On Friday 29 January 2016 07:58 PM, Dan.Wu@scb.se wrote:
>> Hi Lewis,
>> Many thanks for your answer. It is not easy for a beginner to download so many staffs.
>> I am struggling with java and hadoop
>> Now I have downloaded jdk1.8.0_71 from Oracle, it seems ok.
>> But hadoop refuses to run properly! When I run hadoop the terminal
>> shows:
>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27:
>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
>>
>>
>> I have downloaded hadoop 2.7.2 and followed the instruction on
>> https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Fully-Distributed_Operation
>>
>> It seems that I cannot find etc/hadoop as the instruction described.
>> In the .bashrc, the paths are defined as
>> export JAVA_HOME=/usr/local/jdk1.8.0_71/
>> export PATH=$PATH:$JAVA_HOME/bin
>>
>> export HADOOP_INSTALL=/usr/local/hadoop
>>
>> #export PATH=$PATH:$HADOOP_INSTALL/bin #no bin file are found
>> #export PATH=$PATH:$HADDOP_INSTALL/sbin #not sbin file are found
>> export HADOOP_MAPRED_HOME=$HADOOP_INSTALL/hadoop-mapreduce-project/bin
>>
>> export HADOOP_COMMON_HOME=$HADOOP_INSTALL/hadoop-common-project/hadoop-common/src/main/bin
>> export HADOOP_HDFS_HOME=$HADOOP_INSTALL/hadoop-hdfs-project/hadoop-hdfs/src/main/bin
>> export YARN_HOME=$HADOOP_INSTALL/hadoop-yarn-project/hadoop-yarn/bin
>> export HADOOP_COMMOM_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
>> export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
>>    
>> The problem is that there are not bin and sbin folders under hadoop, the bin folders are distributed under different folders.
>> When I go to the $HADOOP_COMMON_HOME and run hadoop, the terminal
>> shows
>>
>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27:
>> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
>> Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
>>     CLASSNAME            run the class named CLASSNAME
>>    or
>>     where COMMAND is one of:
>>     fs                   run a generic filesystem user client
>>     version              print the version
>>     jar <jar>            run a jar file
>>                          note: please use "yarn jar" to launch
>>                                YARN applications, not this command.
>>     checknative [-a|-h]  check native hadoop and compression libraries availability
>>     distcp <srcurl> <desturl> copy file or directories recursively
>>     archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
>>     classpath            prints the class path needed to get the
>>     credential           interact with credential providers
>>                          Hadoop jar and the required libraries
>>     daemonlog            get/set the log level for each daemon
>>     trace                view and modify Hadoop tracing settings
>>
>>    Sorry to turn this into a hadoop problem! Just to give you a feedback and in case anyone has an answer... Thanks!
>>
>> /Dan
>>
>> ________________________________________
>> From: Lewis John Mcgibbney [lewis.mcgibbney@gmail.com]
>> Sent: 28 January 2016 00:49
>> To: user@nutch.apache.org
>> Subject: Re: configuration nutch with hbase and elasticserach
>>
>> Hi Dan,
>>
>> Which version of Nutch 2.X are you using? The document you've
>> highlighted below stated Nutch 2.3 with gora-hbase 0.5.
>> Both of these are old and I would strongly advise you to use Nutch
>> 2.3.1 (just released last week) along with one of the following
>> backends
>>
>> The recommended Gora backends for this Nutch release are
>>
>>      - Apache Avro 1.7.6
>>      - Apache Hadoop 1.2.1 and 2.5.2
>>      - Apache HBase 0.98.8-hadoop2 (although also tested with 1.X)
>>      - Apache Cassandra 2.0.2
>>      - Apache Solr 4.10.3
>>      - MongoDB 2.6.X
>>      - Apache Accumlo 1.5.1
>>      - Apache Spark 1.4.1
>>
>> Please also note that you shoudl upgrade your JDK to 1.7.
>> Thanks
>>
>> On Wed, Jan 27, 2016 at 3:21 PM, <us...@nutch.apache.org> wrote:
>>
>>> Hi,
>>> I am a beginner with nutch and everything. Can anyone help me with
>>> the configuration?
>>>
>>> I follower the instruction in
>>> https://gist.github.com/xrstf/b48a970098a8e76943b9
>>>
>>> It seems that my building process complain plugin: indexer-elastic?
>>> My thanks in advance!!!
>>>
>>> compile:
>>>        [echo] Compiling plugin: indexer-elastic
>>>       [javac] Compiling 3 source files to
>>> /home/dan/apache-nutch-2.3/build/indexer-elastic/classes
>>>       [javac] warning: [options] bootstrap class path not set in
>>> conjunction with -source 1.6
>>>       [javac]
>>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:108:
>>> error: no suitable constructor found for
>>> InetSocketTransportAddress(String,int)
>>>       [javac]           .addTransportAddress(new
>>> InetSocketTransportAddress(host, port));
>>>       [javac]                                ^
>>>       [javac]     constructor
>>> InetSocketTransportAddress.InetSocketTransportAddress(InetSocketAddress) is not applicable
>>>       [javac]       (actual and formal argument lists differ in length)
>>>       [javac]     constructor
>>> InetSocketTransportAddress.InetSocketTransportAddress(InetAddress,int) is not applicable
>>>       [javac]       (actual argument String cannot be converted to
>>> InetAddress by method invocation conversion)
>>>       [javac]     constructor
>>> InetSocketTransportAddress.InetSocketTransportAddress() is not applicable
>>>       [javac]       (actual and formal argument lists differ in length)
>>>       [javac]     constructor
>>> InetSocketTransportAddress.InetSocketTransportAddress(StreamInput) is
>>> not applicable
>>>       [javac]       (actual and formal argument lists differ in length)
>>>       [javac]
>>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:107:
>>> error: constructor TransportClient in class TransportClient cannot be
>>> applied to given types;
>>>       [javac]       client = new TransportClient(settings)
>>>       [javac]                ^
>>>       [javac]   required: Injector
>>>       [javac]   found: Settings
>>>       [javac]   reason: actual argument Settings cannot be converted to
>>> Injector by method invocation conversion
>>>       [javac] 2 errors
>>>       [javac] 1 warning
>>>
>>> BUILD FAILED
>>> /home/dan/apache-nutch-2.3/build.xml:113: The following error
>>> occurred while executing this line:
>>> /home/dan/apache-nutch-2.3/src/plugin/build.xml:35: The following
>>> error occurred while executing this line:
>>> /home/dan/apache-nutch-2.3/src/plugin/build-plugin.xml:117: Compile
>>> failed; see the compiler error output for details.
>>>
>>>
>>>
>



SV: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach

Posted by Da...@scb.se.
Hi Kshitij
Could you please give me more detailed instructions? How to use binaires.hth?
I downloaded the package and extracted it with tar -xvpf; in the extracted package, I cannot find bin and sbin folders.
Thanks!

-----Original message-----
From: Kshitij Shukla [mailto:kshitij.s@cisinlabs.com]
Sent: 30 January 2016 07:20
To: user@nutch.apache.org
Subject: [CIS-CMMI-3] Re: SV: configuration nutch with hbase and elasticserach

rather then using src and compile it on system, try using the binaries. hth

On Friday 29 January 2016 07:58 PM, Dan.Wu@scb.se wrote:
> Hi Lewis,
> Many thanks for your answer. It is not easy for a beginner to download so many staffs.
> I am struggling with java and hadoop
> Now I have downloaded jdk1.8.0_71 from Oracle, it seems ok.
> But hadoop refuses to run properly! When I run hadoop the terminal 
> shows:
> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27:
> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
>
>
> I have downloaded hadoop 2.7.2 and followed the instruction on 
> https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Fully-Distributed_Operation
>
> It seems that I cannot find etc/hadoop as the instruction described. 
> In the .bashrc, the paths are defined as
> export JAVA_HOME=/usr/local/jdk1.8.0_71/
> export PATH=$PATH:$JAVA_HOME/bin
>
> export HADOOP_INSTALL=/usr/local/hadoop
>
> #export PATH=$PATH:$HADOOP_INSTALL/bin #no bin file are found
> #export PATH=$PATH:$HADDOP_INSTALL/sbin #not sbin file are found
> export HADOOP_MAPRED_HOME=$HADOOP_INSTALL/hadoop-mapreduce-project/bin
>
> export HADOOP_COMMON_HOME=$HADOOP_INSTALL/hadoop-common-project/hadoop-common/src/main/bin
> export HADOOP_HDFS_HOME=$HADOOP_INSTALL/hadoop-hdfs-project/hadoop-hdfs/src/main/bin
> export YARN_HOME=$HADOOP_INSTALL/hadoop-yarn-project/hadoop-yarn/bin
> export HADOOP_COMMOM_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
> export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
>   
> The problem is that there are not bin and sbin folders under hadoop, the bin folders are distributed under different folders.
> When I go to the $HADOOP_COMMON_HOME and run hadoop, the terminal 
> shows
>
> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/hadoop: line 27:
> /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/bin/../libexec/hadoop-config.sh: No such file or directory
> Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
>    CLASSNAME            run the class named CLASSNAME
>   or
>    where COMMAND is one of:
>    fs                   run a generic filesystem user client
>    version              print the version
>    jar <jar>            run a jar file
>                         note: please use "yarn jar" to launch
>                               YARN applications, not this command.
>    checknative [-a|-h]  check native hadoop and compression libraries availability
>    distcp <srcurl> <desturl> copy file or directories recursively
>    archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
>    classpath            prints the class path needed to get the
>    credential           interact with credential providers
>                         Hadoop jar and the required libraries
>    daemonlog            get/set the log level for each daemon
>    trace                view and modify Hadoop tracing settings
>
>   Sorry to turn this into a hadoop problem! Just to give you a feedback and in case anyone has an answer... Thanks!
>
> /Dan
>
> ________________________________________
> From: Lewis John Mcgibbney [lewis.mcgibbney@gmail.com]
> Sent: 28 January 2016 00:49
> To: user@nutch.apache.org
> Subject: Re: configuration nutch with hbase and elasticserach
>
> Hi Dan,
>
> Which version of Nutch 2.X are you using? The document you've 
> highlighted below stated Nutch 2.3 with gora-hbase 0.5.
> Both of these are old and I would strongly advise you to use Nutch 
> 2.3.1 (just released last week) along with one of the following 
> backends
>
> The recommended Gora backends for this Nutch release are
>
>     - Apache Avro 1.7.6
>     - Apache Hadoop 1.2.1 and 2.5.2
>     - Apache HBase 0.98.8-hadoop2 (although also tested with 1.X)
>     - Apache Cassandra 2.0.2
>     - Apache Solr 4.10.3
>     - MongoDB 2.6.X
>     - Apache Accumlo 1.5.1
>     - Apache Spark 1.4.1
>
> Please also note that you shoudl upgrade your JDK to 1.7.
> Thanks
>
> On Wed, Jan 27, 2016 at 3:21 PM, <us...@nutch.apache.org> wrote:
>
>> Hi,
>> I am a beginner with nutch and everything. Can anyone help me with 
>> the configuration?
>>
>> I follower the instruction in
>> https://gist.github.com/xrstf/b48a970098a8e76943b9
>>
>> It seems that my building process complain plugin: indexer-elastic? 
>> My thanks in advance!!!
>>
>> compile:
>>       [echo] Compiling plugin: indexer-elastic
>>      [javac] Compiling 3 source files to 
>> /home/dan/apache-nutch-2.3/build/indexer-elastic/classes
>>      [javac] warning: [options] bootstrap class path not set in 
>> conjunction with -source 1.6
>>      [javac]
>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:108:
>> error: no suitable constructor found for
>> InetSocketTransportAddress(String,int)
>>      [javac]           .addTransportAddress(new
>> InetSocketTransportAddress(host, port));
>>      [javac]                                ^
>>      [javac]     constructor
>> InetSocketTransportAddress.InetSocketTransportAddress(InetSocketAddress) is not applicable
>>      [javac]       (actual and formal argument lists differ in length)
>>      [javac]     constructor
>> InetSocketTransportAddress.InetSocketTransportAddress(InetAddress,int) is not applicable
>>      [javac]       (actual argument String cannot be converted to
>> InetAddress by method invocation conversion)
>>      [javac]     constructor
>> InetSocketTransportAddress.InetSocketTransportAddress() is not applicable
>>      [javac]       (actual and formal argument lists differ in length)
>>      [javac]     constructor
>> InetSocketTransportAddress.InetSocketTransportAddress(StreamInput) is 
>> not applicable
>>      [javac]       (actual and formal argument lists differ in length)
>>      [javac]
>> /home/dan/apache-nutch-2.3/src/plugin/indexer-elastic/src/java/org/apache/nutch/indexwriter/elastic/ElasticIndexWriter.java:107:
>> error: constructor TransportClient in class TransportClient cannot be 
>> applied to given types;
>>      [javac]       client = new TransportClient(settings)
>>      [javac]                ^
>>      [javac]   required: Injector
>>      [javac]   found: Settings
>>      [javac]   reason: actual argument Settings cannot be converted to
>> Injector by method invocation conversion
>>      [javac] 2 errors
>>      [javac] 1 warning
>>
>> BUILD FAILED
>> /home/dan/apache-nutch-2.3/build.xml:113: The following error 
>> occurred while executing this line:
>> /home/dan/apache-nutch-2.3/src/plugin/build.xml:35: The following 
>> error occurred while executing this line:
>> /home/dan/apache-nutch-2.3/src/plugin/build-plugin.xml:117: Compile 
>> failed; see the compiler error output for details.
>>
>>
>>

