Posted to dev@sqoop.apache.org by jia jimin <ji...@gmail.com> on 2012/11/27 09:29:15 UTC

Does Sqoop support importing non-relational data to Hadoop?

Hi there,

I am investigating Sqoop on Windows for importing data into HDFS and have
some questions:

1. Does Sqoop support importing non-relational data, such as Windows event
logs, into HDFS?

2. If our client machines change frequently (old machines are recycled and
new machines are added), can Sqoop update its configuration dynamically
so that we can upload data seamlessly?

Thanks for looking at these questions!

Regards
Benjamin

Re: Does Sqoop support importing non-relational data to Hadoop?

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Jia,
thank you very much for your questions. Sqoop is designed as a batch tool that transfers data from database and warehouse systems into the Hadoop ecosystem and vice versa. Right now, Sqoop supports only JDBC-compliant databases; this requirement will, however, fade away with Sqoop 2.
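
For illustration, a typical Sqoop 1 batch import is invoked from the command line roughly like this (the JDBC URL, credentials, table name, and target directory below are hypothetical placeholders, not a recommendation for your setup):

    # pull one relational table into HDFS using 4 parallel map tasks
    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --username sqoop_user -P \
      --table orders \
      --target-dir /user/hadoop/orders \
      --num-mappers 4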

Based on your questions, it seems to me that you are looking more for an online ingest system than a batch one. In that case, I would recommend checking out the Apache Flume project [1], which aims to address online data ingestion.

Links:
1: http://flume.apache.org/
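
As a rough sketch of that online-ingest approach, a minimal Flume agent configuration that watches a local spooling directory of log files and writes the events to HDFS could look something like the following (the agent name, directories, and HDFS URL are hypothetical placeholders, and the memory channel is chosen only for simplicity):

    # source: pick up completed log files dropped into a local directory
    agent1.sources  = logsrc
    agent1.channels = memch
    agent1.sinks    = hdfssink

    agent1.sources.logsrc.type     = spooldir
    agent1.sources.logsrc.spoolDir = /var/log/exported-events
    agent1.sources.logsrc.channels = memch

    # channel: in-memory buffer between source and sink
    agent1.channels.memch.type     = memory
    agent1.channels.memch.capacity = 10000

    # sink: write the buffered events into HDFS
    agent1.sinks.hdfssink.type      = hdfs
    agent1.sinks.hdfssink.hdfs.path = hdfs://namenode:8020/flume/events
    agent1.sinks.hdfssink.channel   = memch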
