Posted to users@camel.apache.org by S Ahmed <sa...@gmail.com> on 2016/09/01 15:30:28 UTC

RE: downloading large files in chunks

Hello,

Is there an example of how to download a large file in chunks and save it to
disk as it downloads?

The goal is to avoid holding the entire file in memory before writing it to disk.


Thanks.

Re: downloading large files in chunks

Posted by Brad Johnson <br...@mediadriver.com>.
By the way, I don't know if you said or not, but do you control both sides
of this or just the consumer side?


Re: downloading large files in chunks

Posted by Brad Johnson <br...@mediadriver.com>.
https://netty.io/4.0/api/io/netty/handler/codec/http/HttpChunkedInput.html

That's why I thought Camel Netty, with chunking enabled, would only read the
stream in one chunk of the specified size at a time.


Re: downloading large files in chunks

Posted by Quinn Stevenson <qu...@pronoia-solutions.com>.
The way I know it’s streaming is by running the route.

You’ll see the log entries (“Download Triggered” and “Writing File”) fairly close together.  Then if you watch the filesystem, you’ll see the file size on disk growing.  Also, I’m using default JVM parameters, so the heap isn’t big enough for the entire file to fit into memory and I’m not getting an OOM Exception.  I only downloaded about 500 MB, but that should’ve been enough to blow the JVM with my settings.
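
If you need something to actually fire the route, a one-shot trigger along these lines should do it - just a sketch; the timer name and options are arbitrary:

<route>
    <!-- fire a single exchange into the download route's entry point -->
    <from uri="timer://start-download?repeatCount=1" />
    <to uri="direct://trigger-download" />
</route>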

If you want to see it behave without streaming, change “disableStreamCache=true” to “disableStreamCache=false” (or just remove it from the URI - false is the default).

I’d have to think about how to write an integration test for this.



Re: downloading large files in chunks

Posted by S Ahmed <sa...@gmail.com>.
Also, is there a way for me to test whether the endpoint supports streaming?
I'm on OS X, so are there any open source tools I could use to test this?


Re: downloading large files in chunks

Posted by S Ahmed <sa...@gmail.com>.
I'm just the consumer (downloading); the file can be anywhere, like S3 or
centos.org!

Re: downloading large files in chunks

Posted by Brad Johnson <br...@mediadriver.com>.
By the way, S. Ahmed, do you have control of both ends of this (I mean
client/server), or are you just on the client/consumer side?


Re: downloading large files in chunks

Posted by Brad Johnson <br...@mediadriver.com>.
Absolutely.  Love to set up a VM for my server.  I just had a "duh" moment
when I did it.  No harm, no foul.


Re: downloading large files in chunks

Posted by Quinn Stevenson <qu...@pronoia-solutions.com>.
Sorry - I wanted to put in an example that worked, and download something big to make sure it was streaming.  Hopefully you needed a new CentOS image :-)





Re: downloading large files in chunks

Posted by Brad Johnson <br...@mediadriver.com>.
Neat.  I accidentally clicked on the link and Chrome downloaded the ISO for
me.  Are you propagating Trojan horses here?  Heh.


Re: downloading large files in chunks

Posted by Quinn Stevenson <qu...@pronoia-solutions.com>.
I think something like this might work for you

<route>
    <from uri="direct://trigger-download" />
    <log message="Download Triggered" />
    <to uri="http4://buildlogs.centos.org/rolling/7/isos/x86_64/CentOS-7-x86_64-DVD.iso?disableStreamCache=true" />
    <log message="Writing File" />
    <to uri="file://target/download" />
</route>
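
If the target URL needs to vary per message (e.g. S3 one time, centos.org another), the same idea should work with a dynamic endpoint - a rough sketch, assuming the host and path are passed in a header (downloadUrl is just a placeholder name):

<route>
    <from uri="direct://download-by-url" />
    <!-- downloadUrl holds host + path, e.g. buildlogs.centos.org/rolling/7/isos/x86_64/CentOS-7-x86_64-DVD.iso -->
    <toD uri="http4://${header.downloadUrl}?disableStreamCache=true" />
    <to uri="file://target/download" />
</route>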


Re: downloading large files in chunks

Posted by Brad Johnson <br...@mediadriver.com>.
Hmmm. That could be a problem if it doesn't actually chunk.  I thought it
read the entire chunk into memory before letting you read it.  So if the
chunk size is 10 MB it would download that whole 10 MB and then let you read,
then fetch the next 10 MB and let you read.  But that may not be the case. I
haven't worked with it much, so I can't say.  I do know it's exceptionally
fast.

The chunking almost seems pointless if it doesn't work that way.


Re: downloading large files in chunks

Posted by S Ahmed <sa...@gmail.com>.
Brad, that page says this: "Notice Netty4 HTTP reads the entire stream into
memory using io.netty.handler.codec.http.HttpObjectAggregator to build the
entire full http message. But the resulting message is still a stream based
message which is readable once."


Re: downloading large files in chunks

Posted by S Ahmed <sa...@gmail.com>.
Thanks.

Just to be clear, I don't run the server I'm downloading the file from. I
want to download files that are very large, but stream them so they are not
held entirely in memory before being written to disk.  I want to stream the
download straight to a file and not hold the entire file in memory.

Is Netty for the server portion or the client?


Re: downloading large files in chunks

Posted by Brad Johnson <br...@mediadriver.com>.
http://camel.apache.org/netty4-http.html

Look at Netty and see if that works.  It can control the chunk size, but it
is also streaming in any case, so you may not even need to be concerned
about it.
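
A minimal sketch of what that could look like as a producer (the host and path are placeholders, and whether the response is truly streamed or buffered first is exactly the open question here):

<route>
    <from uri="direct://netty-download" />
    <!-- netty4-http producer call; replace host/path with the real location -->
    <to uri="netty4-http:http://somehost.example.com/pub/big-file.iso" />
    <to uri="file://target/download" />
</route>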

Brad


Re: downloading large files in chunks

Posted by S Ahmed <sa...@gmail.com>.
Does it have to be FTP? I just need HTTP.


Re: downloading large files in chunks

Posted by Quinn Stevenson <qu...@pronoia-solutions.com>.
Check out the section on the ftp component page about “Using a Local Work Directory” (http://people.apache.org/~dkulp/camel/ftp2.html) - I think that may be what you’re after.
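
For reference, a rough sketch of an FTP consumer using that option (host, credentials, and directories are placeholders) - with localWorkDirectory the component spools the remote file to a local temp file as it downloads instead of loading it into memory:

<route>
    <!-- poll the remote directory; content is spooled to /tmp/camel-work while downloading -->
    <from uri="ftp://anonymous@ftp.example.com/pub/isos?password=secret&amp;binary=true&amp;localWorkDirectory=/tmp/camel-work" />
    <to uri="file://target/download" />
</route>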

