Posted to user@bookkeeper.apache.org by venkat <ve...@gmail.com> on 2012/03/15 06:18:22 UTC

Fwd: Integrating BookKeeper with HDFS

                     I am integrating pseudo-distributed Hadoop with the
BookKeeper logging system. I added the HDFS-234 contribution (bkjournal)
to the Hadoop source code and built it. I added the properties
(dfs.namenode.edits.dir, dfs.namenode.edits.journal-plugin.bookkeeper)
to integrate HDFS with BK. But when I start Hadoop with local bookies,
the NameNode shuts down due to some errors. Moreover, I could not find
any proper documentation for integrating BK with HDFS. I need some help
integrating it successfully.
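[Editor's note: for reference, a minimal hdfs-site.xml fragment for this setup might look like the sketch below. The journal-manager class name follows the bkjournal contribution; the ZooKeeper address and ledger path (localhost:2181, /hdfsjournal) are placeholder assumptions for a local pseudo-distributed setup, not values from this thread.]

```xml
<!-- Sketch only: host, port and ledger path are placeholders. -->
<property>
  <name>dfs.namenode.edits.journal-plugin.bookkeeper</name>
  <value>org.apache.hadoop.contrib.bkjournal.BookKeeperJournalManager</value>
</property>
<property>
  <name>dfs.namenode.edits.dir</name>
  <value>bookkeeper://localhost:2181/hdfsjournal</value>
</property>
```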

Re: Integrating BookKeeper with HDFS

Posted by Ivan Kelly <iv...@yahoo-inc.com>.
Hi Venkat,

Have a look at http://hadoop.apache.org/common/docs/r0.20.1/api/org/apache/hadoop/fs/FileSystem.html
There's a method #getFileBlockLocations

CCing bookkeeper-user list again. Could you reply all when you reply?

-Ivan

On 16 Mar 2012, at 19:38, venkat wrote:

> Hi Ivan,
>           How do I fetch the data block details from the namenode?
> Awaiting your reply
> 
> On Fri, Mar 16, 2012 at 1:00 AM, venkat <ve...@gmail.com> wrote:
>>             I am trying to efficiently manage the data across
>> clusters (inter-cluster management). Suppose I have 5 clusters with a
>> namenode each, and my data is replicated across these five clusters. So
>> when a job comes in, I need to fetch the details about the data present
>> in the datanodes, which is available in the NN.
>> 
>> On Fri, Mar 16, 2012 at 12:47 AM, Ivan Kelly <iv...@yahoo-inc.com> wrote:
>>> On 15 Mar 2012, at 20:09, venkat wrote:
>>>>            The guide helped me a lot. My Hadoop work got stagnant with
>>>> this integration problem. Now my BookKeeper is working fine with HDFS.
>>>> Is it possible to fetch the namenode details (data block information)
>>>> from the BookKeeper?
>>> No, you have to go to the namenode for that. What is it exactly you'd like to do?
>>> 
>>> -Ivan
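[Editor's note: a minimal sketch of the FileSystem#getFileBlockLocations call Ivan points to above. It assumes a running NameNode reachable via the default configuration; the file path /user/venkat/data.txt is hypothetical.]

```java
import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class BlockLocationsSketch {
    public static void main(String[] args) throws Exception {
        // fs.default.name / fs.defaultFS must point at the NameNode,
        // e.g. hdfs://localhost:9000, via core-site.xml on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path file = new Path("/user/venkat/data.txt"); // hypothetical path
        FileStatus status = fs.getFileStatus(file);

        // One BlockLocation per block of the file, covering bytes
        // [0, len); each entry lists the datanodes holding a replica.
        BlockLocation[] blocks =
            fs.getFileBlockLocations(status, 0, status.getLen());
        for (BlockLocation b : blocks) {
            System.out.println("offset=" + b.getOffset()
                + " length=" + b.getLength()
                + " hosts=" + Arrays.toString(b.getHosts()));
        }
        fs.close();
    }
}
```

This queries the NameNode's metadata only; it never reads block data from the datanodes, so it is cheap enough to run per job.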


Re: Integrating BookKeeper with HDFS

Posted by Ivan Kelly <iv...@yahoo-inc.com>.
On 15 Mar 2012, at 20:09, venkat wrote:
>            The guide helped me a lot. My Hadoop work got stagnant with
> this integration problem. Now my BookKeeper is working fine with HDFS.
> Is it possible to fetch the namenode details (data block information)
> from the BookKeeper?
No, you have to go to the namenode for that. What is it exactly you'd like to do?

-Ivan

Re: Integrating BookKeeper with HDFS

Posted by Ivan Kelly <iv...@yahoo-inc.com>.
Hi Venkat,

I've put a guide to getting the namenode running with HDFS on
https://cwiki.apache.org/confluence/display/BOOKKEEPER/JournalManager

There are a few patches pending on HDFS which make usage much easier, but it's taking time to get them committed.

Let me know if you have any questions.

Regards
Ivan
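[Editor's note: one way to bring up local bookies for testing is BookKeeper's LocalBookKeeper helper class. The class name is from the BookKeeper source tree; the classpath and bookie count below are assumptions for a local build, and this is for testing only, not production.]

```shell
# Start an embedded ZooKeeper plus 3 local bookies for testing.
# Assumes the bookkeeper-server classes and their dependencies are
# already on $CLASSPATH from a local build.
java -cp bookkeeper-server/target/classes:$CLASSPATH \
    org.apache.bookkeeper.util.LocalBookKeeper 3
```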

On 15 Mar 2012, at 06:18, venkat wrote:

>                     I am integrating pseudo-distributed Hadoop with the
> BookKeeper logging system. I added the HDFS-234 contribution (bkjournal)
> to the Hadoop source code and built it. I added the properties
> (dfs.namenode.edits.dir, dfs.namenode.edits.journal-plugin.bookkeeper)
> to integrate HDFS with BK. But when I start Hadoop with local bookies,
> the NameNode shuts down due to some errors. Moreover, I could not find
> any proper documentation for integrating BK with HDFS. I need some help
> integrating it successfully.