Posted to users@camel.apache.org by Mike D <md...@isi.edu> on 2011/09/01 09:39:28 UTC

Errors with SFTP producer and large files

Hi, I am using Camel 2.7.0 and trying to upload large files using the SFTP
producer.  However, past a certain size the transfers fail without any real
informative errors in the log.  Interestingly enough, it seems I start to
get failures as soon as the file exceeds my java -Xms heap setting.  For
example, my base heap setting is 256m, and I can send a file below that size
without errors, but over that size it fails.
Sounds suspiciously like the whole file is being read into java heap memory,
no?  I thought Camel was supposed to avoid this wherever possible, but for
some reason it does not seem to be working for me in this case.  

My route is a simple polling file consumer to sftp producer:
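Roughly, in Java DSL terms it looks like the sketch below (endpoint details are reconstructed from the trace paths further down; options such as noop and password are placeholders, not the actual route configuration):

import org.apache.camel.builder.RouteBuilder;

// Sketch: poll a local directory and upload each file via the SFTP producer.
// Host, paths and credentials below are placeholders.
public class UploadRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("file://data/etl-out?noop=true")
            .to("sftp://test@remotehost//home/test/data?password=secret");
    }
}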


Looking at the SFTP component code I can see that the argument to the JSch
library's put() method is an InputStream, and indeed Camel tries to convert
the File body of the exchange to an InputStream to pass to the JSch lib, but
the conversion fails when the file is too big. Below is some trace output
that shows the successful type conversion and the failed one:

#1 File is small enough:

[java] 2011-08-31 23:43:05,863 TRACE
org.apache.camel.component.file.remote.SftpOperations [Camel (camelContext)
thread #0 - file://data/etl-out/, doStoreFile:654]
doStoreFile(SQLEXPRWT_x64_ENU.exe.tmp)    
[java] 2011-08-31 22:38:51,957 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:152] Converting
org.apache.camel.component.file.GenericFile -> java.io.InputStream with
value: GenericFile[SQLEXPRWT_x64_ENU.exe]
[java] 2011-08-31 22:38:51,957 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:180] Using converter:
StaticMethodTypeConverter: public static java.io.InputStream
org.apache.camel.component.file.GenericFileConverter.genericFileToInputStream(org.apache.camel.component.file.GenericFile,org.apache.camel.Exchange)
throws java.io.IOException to convert [class
org.apache.camel.component.file.GenericFile=>class java.io.InputStream]
[java] 2011-08-31 22:38:51,959 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:152] Converting java.io.File
-> byte[] with value: data\etl-out\SQLEXPRWT_x64_ENU.exe
[java] 2011-08-31 22:38:51,959 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:180] Using converter:
StaticMethodTypeConverter: public static byte[]
org.apache.camel.converter.IOConverter.toByteArray(java.io.File) throws
java.io.IOException to convert [class java.io.File=>class [B]
[java] 2011-08-31 22:38:53,520 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:152] Converting byte[] ->
java.io.InputStream with value: [B@1ebf4ff
[java] 2011-08-31 22:38:53,520 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:180] Using converter:
StaticMethodTypeConverter: public static java.io.InputStream
org.apache.camel.converter.IOConverter.toInputStream(byte[]) to convert
[class [B=>class java.io.InputStream]

#2 File is too big - subsequent error:

[java] 2011-08-31 23:43:05,863 TRACE
org.apache.camel.component.file.remote.SftpOperations [Camel (camelContext)
thread #0 - file://data/etl-out/ ,doStoreFile:654]
doStoreFile(SQLServer2008SP1-KB968369-x64-ENU.exe.tmp)
[java] 2011-08-31 23:43:05,866 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:152] Converting
org.apache.camel.component.file.GenericFile -> java.io.InputStream with
value: GenericFile[SQLServer2008SP1-KB968369-x64-ENU.exe]
[java] 2011-08-31 23:43:05,866 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:180] Using converter:
StaticMethodTypeConverter: public static java.io.InputStream
org.apache.camel.component.file.GenericFileConverter.genericFileToInputStream(org.apache.camel.component.file.GenericFile,org.apache.camel.Exchange)
throws java.io.IOException to convert [class
org.apache.camel.component.file.GenericFile=>class java.io.InputStream]
[java] 2011-08-31 23:43:05,867 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:152] Converting java.io.File
-> byte[] with value: data\etl-out\SQLServer2008SP1-KB968369-x64-ENU.exe
[java] 2011-08-31 23:43:05,867 TRACE
org.apache.camel.impl.converter.DefaultTypeConverter [Camel (camelContext)
thread #0 - file://data/etl-out/, doConvertTo:180] Using converter:
StaticMethodTypeConverter: public static byte[]
org.apache.camel.converter.IOConverter.toByteArray(java.io.File) throws
java.io.IOException to convert [class java.io.File=>class [B]

...and sometime later...

     [java] 2011-08-31 23:43:07,433 DEBUG
org.apache.camel.component.file.GenericFileOnCompletion [Camel
(camelContext) thread #0 - file://data/etl-out/,log:227] Caused by:
[org.apache.camel.component.file.GenericFileOperationFailedException -
Cannot store file:
/home/test/data/SQLServer2008SP1-KB968369-x64-ENU.exe.tmp]
     [java]
org.apache.camel.component.file.GenericFileOperationFailedException: Cannot
store file: /home/test/data/SQLServer2008SP1-KB968369-x64-ENU.exe.tmp
     [java]     at
org.apache.camel.component.file.remote.SftpOperations.doStoreFile(SftpOperations.java:684)[camel-ftp-2.7.0.jar:2.7.0]
     [java]     at
org.apache.camel.component.file.remote.SftpOperations.storeFile(SftpOperations.java:641)[camel-ftp-2.7.0.jar:2.7.0]
     [java]     at
org.apache.camel.component.file.GenericFileProducer.writeFile(GenericFileProducer.java:269)[camel-core-2.7.0.jar:2.7.0]
     [java]     at
org.apache.camel.component.file.GenericFileProducer.processExchange(GenericFileProducer.java:163)[camel-core-2.7.0.jar:2.7.0]
     [java]     at
org.apache.camel.component.file.remote.RemoteFileProducer.process(RemoteFileProducer.java:50)[camel-ftp-2.7.0.jar:2.7.0]
     [java]     at
org.apache.camel.impl.converter.AsyncProcessorTypeConverter$ProcessorToAsyncProcessorBridge.process(AsyncProcessorTypeConverter.java:50)[camel-core-2.7.0.jar:2.7.0] 


My question is: why does the file get converted first to a byte array before
then being converted to an InputStream?  This causes IOConverter to invoke
toByteArray(), which reads the entire file into memory. Shouldn't you be able
to pass a BufferedInputStream wrapper of the FileInputStream directly to the
JSch put() method?  Is this the intended behavior, or have I misconfigured
something?  Ultimately I need to send a 5GB-plus file to an SFTP destination --
is it feasible to use the SFTP producer for this?
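For illustration, the kind of streamed call meant here, as a rough sketch against the JSch API (the class and helper method below are placeholders, not the component's actual code):

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.SftpException;

// Sketch: wrap the local file in a BufferedInputStream and hand it straight
// to JSch's put(), so the file contents are never buffered on the heap.
// The channel is assumed to be already connected; paths are placeholders.
public final class StreamedPut {
    public static void upload(ChannelSftp channel, String localPath, String remotePath)
            throws IOException, SftpException {
        InputStream in = new BufferedInputStream(new FileInputStream(localPath));
        try {
            channel.put(in, remotePath);
        } finally {
            in.close();
        }
    }
}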

Thanks for reading,
Mike


Re: Errors with SFTP producer and large files

Posted by ylambert <yl...@bouyguestelecom.fr>.
Hi, I had the same problem.

In order to solve it, I've written my own TypeConverter:

import java.io.BufferedInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;

import org.apache.camel.Exchange;
import org.apache.camel.TypeConverter;
import org.apache.camel.component.file.GenericFile;
import org.apache.camel.component.file.GenericFileConverter;
import org.apache.camel.component.file.remote.RemoteFile;
import org.apache.commons.io.FileUtils;

public class CdnFileConverter implements TypeConverter {

    /**
     * Converts a local file to an InputStream WITHOUT loading the file into
     * memory, unlike the default Camel converter.
     *
     * @param file the local file
     * @return the input stream
     * @throws IOException Signals that an I/O exception has occurred.
     */
    public static InputStream genericFileToInputStream(GenericFile<File> file) throws IOException {
        return new BufferedInputStream(FileUtils.openInputStream(file.getFile()));
    }

    /*
     * (non-Javadoc)
     *
     * @see org.apache.camel.TypeConverter#convertTo(java.lang.Class, java.lang.Object)
     */
    @Override
    public <T> T convertTo(Class<T> type, Object value) {
        return convertTo(type, null, value);
    }

    /*
     * (non-Javadoc)
     *
     * @see org.apache.camel.TypeConverter#convertTo(java.lang.Class,
     * org.apache.camel.Exchange, java.lang.Object)
     */
    @SuppressWarnings("unchecked")
    @Override
    public <T> T convertTo(Class<T> type, Exchange exchange, Object value) {
        // The variant with the Exchange parameter is the one Camel prefers to invoke;
        // it lets you fetch information from the exchange during conversion,
        // such as an encoding parameter or the like.
        T converted = null;

        // If it is a remote file, let the default Camel converter do its work.
        if (InputStream.class.isAssignableFrom(type) && RemoteFile.class.isInstance(value)) {
            RemoteFile<?> file = (RemoteFile<?>) value;
            try {
                converted = (T) GenericFileConverter.genericFileToInputStream(file, exchange);
            }
            //CHECKSTYLE:OFF
            catch (IOException e) {
                // converted stays null
            }
            //CHECKSTYLE:ON
        }
        // If it is a local file, convert it without loading the file into memory.
        else if (InputStream.class.isAssignableFrom(type) && GenericFile.class.isInstance(value)) {
            GenericFile<File> file = (GenericFile<File>) value;
            try {
                converted = (T) genericFileToInputStream(file);
            }
            //CHECKSTYLE:OFF
            catch (IOException e) {
                // converted stays null
            }
            //CHECKSTYLE:ON
        }
        // Otherwise converted stays null: Camel will attempt the conversion
        // with a FallbackConverter.

        return converted;
    }

    /*
     * (non-Javadoc)
     *
     * @see org.apache.camel.TypeConverter#mandatoryConvertTo(java.lang.Class, java.lang.Object)
     */
    @Override
    public <T> T mandatoryConvertTo(Class<T> type, Object value) {
        return convertTo(type, null, value);
    }

    /*
     * (non-Javadoc)
     *
     * @see org.apache.camel.TypeConverter#mandatoryConvertTo(java.lang.Class,
     * org.apache.camel.Exchange, java.lang.Object)
     */
    @Override
    public <T> T mandatoryConvertTo(Class<T> type, Exchange exchange, Object value) {
        return convertTo(type, exchange, value);
    }
}
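One way such a converter can be wired in (a sketch using the standard Camel 2.x type converter registry; whether it takes precedence over the built-in GenericFile converter may depend on the Camel version):

import java.io.InputStream;

import org.apache.camel.CamelContext;
import org.apache.camel.component.file.GenericFile;
import org.apache.camel.impl.DefaultCamelContext;

// Sketch: register CdnFileConverter for GenericFile -> InputStream conversions
// so the SFTP producer gets a streaming InputStream instead of a byte[].
public class RegisterConverter {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.getTypeConverterRegistry()
               .addTypeConverter(InputStream.class, GenericFile.class, new CdnFileConverter());
        // routes would be added here before starting the context
        context.start();
        context.stop();
    }
}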


Re: Errors with SFTP producer and large files

Posted by mamta <ma...@gmail.com>.
I am using Camel 2.11.0 and still facing a similar issue.
We are sending files via SFTP, so why is it giving us a storage exception?
The strange thing is that the files are delivered to the users successfully,
but our process still gets this exception, which prevents us from raising the
correct alert, and we are forced to check with the end users whether they
received the files or not.
There should be a way to know if the files have been successfully delivered.
Please help, as this is something we see in our running production code on a
regular basis.




Re: Errors with SFTP producer and large files

Posted by Claus Ibsen <cl...@gmail.com>.
Hi

I had a chance to look at this, and it's improved by CAMEL-3962.
https://issues.apache.org/jira/browse/CAMEL-3962

This is fixed from Camel 2.8.0 onwards.

I will backport this to the 2.7 branch as well.

On Fri, Sep 2, 2011 at 7:33 PM, Claus Ibsen <cl...@gmail.com> wrote:
> [snip: earlier reply and Mike's update quoted in full; both appear as separate messages below]



-- 
Claus Ibsen
-----------------
FuseSource
Email: cibsen@fusesource.com
Web: http://fusesource.com
Twitter: davsclaus, fusenews
Blog: http://davsclaus.blogspot.com/
Author of Camel in Action: http://www.manning.com/ibsen/

Re: Errors with SFTP producer and large files

Posted by Claus Ibsen <cl...@gmail.com>.
Thanks Mike for the detailed report. I have marked this conversation to
get back to it.
It's a bit odd, as writing a file ought to be stream based and not take
up memory.




On Thu, Sep 1, 2011 at 11:57 PM, Mike D <md...@isi.edu> wrote:
> [snip: Mike's update quoted in full; it appears as its own message below]
>



-- 
Claus Ibsen
-----------------
FuseSource
Email: cibsen@fusesource.com
Web: http://fusesource.com
Twitter: davsclaus, fusenews
Blog: http://davsclaus.blogspot.com/
Author of Camel in Action: http://www.manning.com/ibsen/

Re: Errors with SFTP producer and large files

Posted by Mike D <md...@isi.edu>.
Update:

I was able to work around this problem using ylambert's approach (thank you
sir!), however it makes me a little uneasy that the problem must be dealt
with in such a way.  It seems to me like this is a bug in the way Camel
handles transferring a local file to a remote system with SFTP/FTP(S). 

I cannot envision a case when it is appropriate to try to read the entire
contents of a (large) file into a byte buffer in order to stream data over
the network, and I think most people would agree this is common sense.  I
can only conclude that this is the product of some nuance of the
TypeConverter system that I do not understand, and that it is not really
meant to work this way - at least not in this case.

With regards to localWorkDirectory and its usefulness in the upload
scenario, I can see that this might be helpful if I was trying to send data
that was already memory bound and so writing out data to a local temporary
file would help things stay within memory limits.  However, in my scenario
the file already exists on the local disk, so I don't see how writing that
file out to *another* file is going to help anything. 

I do appreciate the quick response Claus, I'm just guessing you thought I
was dealing with the download use-case instead of the upload case.  Camel
still rocks; I know it is hard to please everybody and there are a LOT of
components to support.  Hopefully this thread helps anybody else who runs
into this issue.

Cheers,
Mike 


Re: Errors with SFTP producer and large files

Posted by Mike D <md...@isi.edu>.
Hi Claus,

Thank you for your answer. I had already investigated the localWorkDirectory
option without any success.  It seems like this option is only available when
retrieving files from a remote system and not when sending files to one.  In
fact, the documentation at http://camel.apache.org/ftp2
basically states this - though it does not state explicitly that
localWorkDirectory does *not* apply to uploading scenarios.

So, I am trying to upload large files via the SFTP component - should
localWorkDirectory be a viable option in that use case?

Regards,
Mike

P.S.  The suggestion from ylambert to create a custom TypeConverter makes
sense to me given what I have seen, but I still do not understand why the
File->byte[]->InputStream conversion is happening.  Is this the expected
conversion logic, or is it a (possible) bug with type conversion ordering?


Re: Errors with SFTP producer and large files

Posted by Claus Ibsen <cl...@gmail.com>.
Hi

The FTP component has a localWorkDirectory option you can use to configure a
temp directory, which allows the component to stream directly to a
java.io.File instead of holding the data in memory.

The option is documented on these two pages:
http://camel.apache.org/ftp2
http://camel.apache.org/file2
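For example, a consumer endpoint sketch (host, paths and credentials are placeholders):

import org.apache.camel.builder.RouteBuilder;

// Sketch: an SFTP *consumer* using localWorkDirectory, so the download is
// streamed to a temp file on local disk instead of being held in memory.
// Host, paths and credentials are placeholders.
public class DownloadRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("sftp://test@remotehost//home/test/data"
                + "?password=secret&localWorkDirectory=/tmp/work")
            .to("file://data/etl-in");
    }
}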



On Thu, Sep 1, 2011 at 9:39 AM, Mike D <md...@isi.edu> wrote:
> [snip: original message quoted in full; see the first message above]



-- 
Claus Ibsen
-----------------
FuseSource
Email: cibsen@fusesource.com
Web: http://fusesource.com
Twitter: davsclaus, fusenews
Blog: http://davsclaus.blogspot.com/
Author of Camel in Action: http://www.manning.com/ibsen/