Posted to dev@oodt.apache.org by "Cinquini, Luca (3880)" <Lu...@jpl.nasa.gov> on 2012/02/28 16:01:29 UTC

question on push pull

Hi all,
	I have a quick question concerning the pushpull framework: is there any way to transfer full directory trees, as opposed to single files? And which of the currently implemented transfer protocols would allow that? I haven't seen any examples of that, though I might have missed it.

thanks a lot,
Luca

P.S.: Cameron, thanks for writing the push-pull user guide - it's great.


Re: question on push pull

Posted by "Mattmann, Chris A (388J)" <ch...@jpl.nasa.gov>.
Hi Luca and Cam,

Yep, no problem. And Luca, yep, the plan is to use this on SKA...

Cheers,
Chris

On Mar 1, 2012, at 5:00 AM, Cinquini, Luca (3880) wrote:

> Thanks Chris and Cameron, I'll give it a try - of course, that is if we decide to use push-pull for the SKA project, Chris.
> I appreciate both of your help,
> Luca


++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Chris Mattmann, Ph.D.
Senior Computer Scientist
NASA Jet Propulsion Laboratory Pasadena, CA 91109 USA
Office: 171-266B, Mailstop: 171-246
Email: chris.a.mattmann@nasa.gov
WWW:   http://sunset.usc.edu/~mattmann/
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Adjunct Assistant Professor, Computer Science Department
University of Southern California, Los Angeles, CA 90089 USA
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++


Re: question on push pull

Posted by "Cinquini, Luca (3880)" <Lu...@jpl.nasa.gov>.
Thanks Chris and Cameron, I'll give it a try - of course, that is if we decide to use push-pull for the SKA project, Chris.
I appreciate both of your help,
Luca

On Feb 29, 2012, at 9:03 AM, Mattmann, Chris A (388J) wrote:

> Hey Guys,
> 
> Just wanted to loop in on this. Yes, push pull does support downloading
> full directory trees. Probably the best guide to check out is Brian's
> documentation on the DirFileStructXML here:
> 
> http://s.apache.org/yz
> 
> I also wrote a static user guide here:
> 
> http://s.apache.org/10Z
> 
> One thing to note too is that some of the plugins for Push Pull that
> have been developed use non-ALv2-compatible code, so if you want
> those plugins for now (until someone writes nice shiny new ALv2-
> compatible versions, which I would LOVE), you can find them here
> at Apache Extras:
> 
> http://code.google.com/a/apache-extras.org/p/oodt-pushpull-plugins/
> 
> You can cull information from those as well. If you have any further
> specific questions, I can help.
> 
> Thanks!
> 
> Cheers,
> Chris


Re: question on push pull

Posted by "Mattmann, Chris A (388J)" <ch...@jpl.nasa.gov>.
Hey Guys,

Just wanted to loop in on this. Yes, push pull does support downloading
full directory trees. Probably the best guide to check out is Brian's
documentation on the DirFileStructXML here:

http://s.apache.org/yz

I also wrote a static user guide here:

http://s.apache.org/10Z

One thing to note too is that some of the plugins for Push Pull that
have been developed use non-ALv2-compatible code, so if you want
those plugins for now (until someone writes nice shiny new ALv2-
compatible versions, which I would LOVE), you can find them here
at Apache Extras:

http://code.google.com/a/apache-extras.org/p/oodt-pushpull-plugins/

You can cull information from those as well. If you have any further
specific questions, I can help.

Thanks!

Cheers,
Chris

On Feb 28, 2012, at 7:40 AM, Cameron Goodale wrote:

> Luca,
> 
> I haven't tried this exact use case within Crawler, but Crawler does
> support scp and I have used 'scp -r' to recursively download a folder and
> all content housed within.  I can only imagine ftp has a similar recursive
> option as well.
> 
> Maybe another, more Crawler-savvy dev can shine some light on the recursion
> use case when using Crawler.
> 
> -Cameron
> 
> P.S. When we get a final answer let's add this to the Crawler User Guide
> Wiki too as an example use case.  Glad you found the Crawler Wiki page
> useful.


++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Chris Mattmann, Ph.D.
Senior Computer Scientist
NASA Jet Propulsion Laboratory Pasadena, CA 91109 USA
Office: 171-266B, Mailstop: 171-246
Email: chris.a.mattmann@nasa.gov
WWW:   http://sunset.usc.edu/~mattmann/
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Adjunct Assistant Professor, Computer Science Department
University of Southern California, Los Angeles, CA 90089 USA
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++


Re: question on push pull

Posted by Brian Foster <ho...@me.com>.
hey Luca,

Ya, pushpull can download full directories... do you have a dir struct XML file for downloading individual files?... paste it into a message on this thread and let me know which directory you would like to download, and I can do the mods or help you through making the mods to the file... in short, if you don't specify a <file name=""> element inside a <dir name=""> element, then all files in that directory will be downloaded.
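
For illustration, here is a rough sketch of what such a dir struct file might look like. Everything beyond the <dir name=""> and <file name=""> elements described above (the root element, the paths, the file names) is an assumption made up for this example - check the DirFileStructXML documentation linked earlier in the thread for the exact schema.

    <!-- hypothetical sketch: root element, paths, and file names are assumptions -->
    <dirstruct>
        <!-- a <dir> with <file> children downloads only the named files -->
        <dir name="/pub/data/2012/059">
            <file name="granule_A.dat"/>
            <file name="granule_B.dat"/>
        </dir>
        <!-- a <dir> with no <file> children downloads every file it contains -->
        <dir name="/pub/data/2012/060"/>
    </dirstruct>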

-Brian

On Feb 29, 2012, at 10:12 PM, Cameron Goodale <si...@gmail.com> wrote:

> Luca,
> 
> Good catch.  I was typing faster than my brain was working.  I meant to say
> Push Pull instead of Crawler.
> 
> Sorry for the confusion.  Chris laid it all out really well.  Guess I was
> just a little too excited to answer you and goofed in my haste.
> 
> To answer your question, I am not sure where the code is in PUSHPULL (got
> it right this time ;) since I haven't had reason to use it yet.
> 
> I did check the etc/examples directory in PushPull and I only found file,
> ftp, and sftp examples.  If you do sort out the scp and scp -r versions it
> would be great to add them to the etc/examples area.
> 
> It could be part of the Apache Extras that Chris mentioned:
> http://code.google.com/a/apache-extras.org/p/oodt-pushpull-plugins/
> 
> Good Luck.
> 
> 
> 
> -Cameron

Re: question on push pull

Posted by Cameron Goodale <si...@gmail.com>.
Luca,

Good catch.  I was typing faster than my brain was working.  I meant to say
Push Pull instead of Crawler.

Sorry for the confusion.  Chris laid it all out really well.  Guess I was
just a little too excited to answer you and goofed in my haste.

To answer your question, I am not sure where the code is in PUSHPULL (got
it right this time ;) since I haven't had reason to use it yet.

I did check the etc/examples directory in PushPull and I only found file,
ftp, and sftp examples.  If you do sort out the scp and scp -r versions it
would be great to add them to the etc/examples area.

It could be part of the Apache Extras that Chris mentioned:
http://code.google.com/a/apache-extras.org/p/oodt-pushpull-plugins/

Good Luck.



-Cameron

On Wed, Feb 29, 2012 at 10:23 AM, Cinquini, Luca (3880) <
Luca.Cinquini@jpl.nasa.gov> wrote:

> Hi Cameron,
>        Maybe I am confused, but I was actually asking about the push-pull
> capabilities - does the crawler plug into the push-pull framework? (sorry
> about my ignorance here). If push-pull supports scp, would you know the
> name of the protocol transfer factory to use - I haven't found one.
> thanks a lot,
> Luca


-- 

Sent from a Tin Can attached to a String

Re: question on push pull

Posted by "Cinquini, Luca (3880)" <Lu...@jpl.nasa.gov>.
Hi Cameron,
	Maybe I am confused, but I was actually asking about the push-pull capabilities - does the crawler plug into the push-pull framework? (sorry about my ignorance here). If push-pull supports scp, would you know the name of the protocol transfer factory to use - I haven't found one.
thanks a lot,
Luca

On Feb 28, 2012, at 8:40 AM, Cameron Goodale wrote:

> Luca,
> 
> I haven't tried this exact use case within Crawler, but Crawler does
> support scp and I have used 'scp -r' to recursively download a folder and
> all content housed within.  I can only imagine ftp has a similar recursive
> option as well.
> 
> Maybe another, more Crawler-savvy dev can shine some light on the recursion
> use case when using Crawler.
> 
> -Cameron
> 
> P.S. When we get a final answer let's add this to the Crawler User Guide
> Wiki too as an example use case.  Glad you found the Crawler Wiki page
> useful.


Re: question on push pull

Posted by Cameron Goodale <si...@gmail.com>.
Luca,

I haven't tried this exact use case within Crawler, but Crawler does
support scp and I have used 'scp -r' to recursively download a folder and
all content housed within.  I can only imagine ftp has a similar recursive
option as well.

Maybe another, more Crawler-savvy dev can shine some light on the recursion
use case when using Crawler.

-Cameron

P.S. When we get a final answer let's add this to the Crawler User Guide
Wiki too as an example use case.  Glad you found the Crawler Wiki page
useful.

On Tue, Feb 28, 2012 at 7:01 AM, Cinquini, Luca (3880) <
Luca.Cinquini@jpl.nasa.gov> wrote:

> Hi all,
>        I have a quick question concerning the pushpull framework: is
> there any way to transfer full directory trees, as opposed to single files?
> And which of the currently implemented transfer protocols would allow
> that? I haven't seen any examples of that, though I might have missed it.
>
> thanks a lot,
> Luca
>
> P.S.: Cameron, thanks for writing the push-pull user guide - it's great.
>
>


-- 

Sent from a Tin Can attached to a String