Posted to user@karaf.apache.org by Geoffry Roberts <ge...@gmail.com> on 2011/10/03 22:52:40 UTC

Camel and Hadoop in Karaf

All,

I'm having a problem getting things to work with Hadoop's HDFS.  This is my
first try with this, so I'm just trying to do something simple.  I want to read
a file from HDFS and write its contents to the console.

Can anyone see what I'm doing wrong?  Thanks.

Here's the error (from karaf):

org.osgi.service.blueprint.container.ComponentDefinitionException: Unable to
intialize bean camel-16

Here's my blueprint:

<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
    <camelContext xmlns="http://camel.apache.org/schema/blueprint">
        <route>
            <from
uri="hdfs://qq000:54310/user/hadoop/epistate.xmi?noop=true" />
            <to uri="stream:out" />
        </route>
    </camelContext>
</blueprint>
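
(For completeness: the route relies on the camel-hdfs and camel-stream
components, plus camel-blueprint for the blueprint camelContext itself.  As far
as I understand, the matching Karaf features get installed with something like
the following, assuming Karaf 2.x command syntax and that the Camel features
repository is already registered:

features:install camel-blueprint
features:install camel-hdfs
features:install camel-stream
)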

Here's where hdfs is bound (from core-site.xml in the hadoop config):

<configuration>
...
<property>
  <name>fs.default.name</name>
  <value>hdfs://qq000:54310</value>
</property>
</configuration>
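
(In case it matters, the file itself can be checked against the same namenode
from outside Karaf with the stock Hadoop CLI, e.g.:

hadoop fs -ls hdfs://qq000:54310/user/hadoop/epistate.xmi
)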

-- 
Geoffry Roberts

Re: Camel and Hadoop in Karaf

Posted by Geoffry Roberts <ge...@gmail.com>.
I did some further thinking on this.

Does anyone know if the camel-hdfs feature includes all necessary
dependencies, e.g. hadoop-common-xx.jar, or must these be added to the Karaf
classpath to get things to work?
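
Is inspecting the bundle list the right way to tell, i.e. something along
these lines, assuming the Karaf 2.x osgi:list and grep shell commands, to see
whether the feature pulled in any hadoop bundles?

osgi:list | grep hadoop
osgi:list | grep camel-hdfs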

While we're on the subject, does anyone know how to add a library that is
not an OSGi bundle to the Karaf classpath?  I tried dropping the
aforementioned hadoop-common-xx.jar into the lib/endorsed directory and then
restarted Karaf.  I got a ClassNotFoundException on the console, even though
the class is definitely included in the jar.  What gives?  (A couple of
guesses of my own follow the stack trace below.)

If I remove the jar and restart, the error does not occur.

 log4j:ERROR Could not instantiate class
[org.apache.hadoop.metrics.jvm.EventCounter].
java.lang.ClassNotFoundException: org.apache.hadoop.metrics.jvm.EventCounter
not found by org.ops4j.pax.logging.pax-logging-service [3]
        at
org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:787)
        at
org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)
        at
org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)
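
My guess is that lib/endorsed only puts the jar on the JVM's endorsed
classpath, so the pax-logging-service bundle still cannot see the package
through OSGi.  Two things I am wondering about, both guesses on my part:

1. Wrapping the plain jar into a bundle with the pax-url wrap: handler, e.g.
(Maven coordinates guessed):

osgi:install -s 'wrap:mvn:org.apache.hadoop/hadoop-common/0.21.0'

2. Adding the package to the framework's boot delegation list, e.g. in
etc/custom.properties (the property name is the standard OSGi one from
etc/config.properties; the "..." is only a placeholder for whatever packages
are already listed there):

org.osgi.framework.bootdelegation=org.apache.hadoop.metrics.jvm,...

Is either of those the recommended approach?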


On 4 October 2011 08:25, Geoffry Roberts <ge...@gmail.com> wrote:

> Jean,
>
> Yes, both camel-spring and camel-hdfs are showing as installed; both are
> also showing as active.
>
> The aforementioned error message is all that is given.  The level is set to
> debug. The blueprint shows as active but nothing happens.
>
> On 3 October 2011 22:01, Jean-Baptiste Onofré <jb...@nanthrax.net> wrote:
>
>> Hi Geoffry,
>>
>> did you install camel-spring and camel-hdfs feature in Karaf ?
>>
>> Could you send the log (log:display) ?
>>
>> Regards
>> JB
>>
>>
>> On 10/03/2011 10:52 PM, Geoffry Roberts wrote:
>>
>>> All,
>>>
>>> I'm having a problem getting things to work with Hadoop's HDFS.  This is
>>> my first try with this, so I'm just trying to do something simple.  I want
>>> to read a file from HDFS and write its contents to the console.
>>>
>>> Can anyone see what I'm doing wrong?  Thanks.
>>>
>>> Here's the error (from karaf):
>>>
>>> org.osgi.service.blueprint.container.ComponentDefinitionException:
>>> Unable to intialize bean camel-16
>>>
>>> Here's my blueprint:
>>>
>>> <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
>>>     <camelContext xmlns="http://camel.apache.org/schema/blueprint">
>>>         <route>
>>>             <from uri="hdfs://qq000:54310/user/hadoop/epistate.xmi?noop=true" />
>>>             <to uri="stream:out" />
>>>         </route>
>>>     </camelContext>
>>> </blueprint>
>>>
>>> Here's where hdfs is bound (from core-site.xml in the hadoop config):
>>>
>>> <configuration>
>>> ...
>>> <property>
>>> <name>fs.default.name</name>
>>> <value>hdfs://qq000:54310</value>
>>> </property>
>>> </configuration>
>>>
>>> --
>>> Geoffry Roberts
>>>
>>>
>> --
>> Jean-Baptiste Onofré
>> jbonofre@apache.org
>> http://blog.nanthrax.net
>> Talend - http://www.talend.com
>>
>
>
>
> --
> Geoffry Roberts
>
>


-- 
Geoffry Roberts

Re: Camel and Hadoop in Karaf

Posted by Geoffry Roberts <ge...@gmail.com>.
Jean,

Yes, both camel-spring and camel-hdfs are showing as installed; both are
also showing as active.

The aforementioned error message is all that is given.  The level is set to
debug. The blueprint shows as active but nothing happens.
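
For reference, this is roughly how I have been checking, assuming the Karaf
2.x command names (output omitted here):

features:list | grep camel
osgi:list | grep camel
log:set DEBUG
log:display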

On 3 October 2011 22:01, Jean-Baptiste Onofré <jb...@nanthrax.net> wrote:

> Hi Geoffry,
>
> did you install camel-spring and camel-hdfs feature in Karaf ?
>
> Could you send the log (log:display) ?
>
> Regards
> JB
>
>
> On 10/03/2011 10:52 PM, Geoffry Roberts wrote:
>
>> All,
>>
>> I'm having a problem getting things to work with Hadoop's HDFS.  This is
>> my first try with this, so I'm just trying to do something simple.  I want
>> to read a file from HDFS and write its contents to the console.
>>
>> Can anyone see what I'm doing wrong?  Thanks.
>>
>> Here's the error (from karaf):
>>
>> org.osgi.service.blueprint.container.ComponentDefinitionException:
>> Unable to intialize bean camel-16
>>
>> Here's my blueprint:
>>
>> <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
>>     <camelContext xmlns="http://camel.apache.org/schema/blueprint">
>>         <route>
>>             <from uri="hdfs://qq000:54310/user/hadoop/epistate.xmi?noop=true" />
>>             <to uri="stream:out" />
>>         </route>
>>     </camelContext>
>> </blueprint>
>>
>> Here's where hdfs is bound (from core-site.xml in the hadoop config):
>>
>> <configuration>
>> ...
>> <property>
>> <name>fs.default.name</name>
>> <value>hdfs://qq000:54310</value>
>> </property>
>> </configuration>
>>
>> --
>> Geoffry Roberts
>>
>>
> --
> Jean-Baptiste Onofré
> jbonofre@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>



-- 
Geoffry Roberts

Re: Camel and Hadoop in Karaf

Posted by Jean-Baptiste Onofré <jb...@nanthrax.net>.
Hi Geoffry,

did you install camel-spring and camel-hdfs feature in Karaf ?

Could you send the log (log:display) ?

Regards
JB

On 10/03/2011 10:52 PM, Geoffry Roberts wrote:
> All,
>
> I'm having a problem getting things to work with Hadoop's HDFS.  This is
> my first try with this, so I'm just trying to do something simple.  I want
> to read a file from HDFS and write its contents to the console.
>
> Can anyone see what I'm doing wrong?  Thanks.
>
> Here's the error (from karaf):
>
> org.osgi.service.blueprint.container.ComponentDefinitionException:
> Unable to intialize bean camel-16
>
> Here's my blueprint:
>
> <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
> <camelContext xmlns="http://camel.apache.org/schema/blueprint">
> <route>
> <from uri="hdfs://qq000:54310/user/hadoop/epistate.xmi?noop=true" />
> <to uri="stream:out" />
> </route>
> </camelContext>
> </blueprint>
>
> Here's where hdfs is bound (from core-site.xml in the hadoop config):
>
> <configuration>
> ...
> <property>
> <name>fs.default.name</name>
> <value>hdfs://qq000:54310</value>
> </property>
> </configuration>
>
> --
> Geoffry Roberts
>

-- 
Jean-Baptiste Onofré
jbonofre@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com