Posted to user@hadoop.apache.org by Bhagaban Khatai <em...@gmail.com> on 2016/08/01 11:24:26 UTC

Teradata into hadoop Migration

Hi Guys-

I need some quick help: has anybody done a Teradata (TD) to Hadoop migration
project? We have a very tight deadline, and I am trying to find a tool (free
or paid) to speed up development.

Please help us here, and point me to any other way to get the development
done fast.

Bhagaban

Re: Teradata into hadoop Migration

Posted by Sandeep Khurana <sk...@gmail.com>.
I would suggest also looking at TPT (Teradata Parallel Transporter), which is
much faster than Sqoop.
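
For reference, a rough sketch of the TPT-export-then-HDFS-load path. All file and path names here are hypothetical, and the commands are only echoed (dry run), since actually running them needs a live Teradata system and a Hadoop cluster:

```shell
#!/bin/sh
# Sketch: export a table with a TPT job, then land the extract in HDFS.
# export_orders.tpt would be a TPT job script using the Export operator;
# its contents are not shown here.
EXPORT_SCRIPT=export_orders.tpt      # hypothetical TPT job script
LANDING_FILE=/staging/orders.csv     # hypothetical local landing file

# Echoed as a dry run; remove `echo` on a real system.
echo tbuild -f "$EXPORT_SCRIPT"
echo hdfs dfs -put "$LANDING_FILE" /data/raw/orders/
```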


Re: Teradata into hadoop Migration

Posted by Gmail <as...@gmail.com>.
Hi Bhagaban

I have seen a more efficient way of transferring the data: fast-export it and put the files directly into HDFS, then create the structure over that data in Hive or Pig. We developed a shell script that converts Teradata DDL to Hive DDL; this automation effort does not take much time. We faced some challenges converting the timestamp and date data types to Parquet format, but those were later resolved by a newer version of Hive. I would also suggest looking at the Teradata Connector for Hadoop, a tool developed by Teradata that can itself choose among different data transfer strategies.
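
As a rough illustration of such a conversion script (this is a minimal sketch, not the script described above; the demo table and the handful of type mappings are only examples, and real Teradata DDL needs many more rules):

```shell
#!/bin/sh
# Minimal sketch: rewrite a few common Teradata DDL constructs into Hive DDL.
# Reads DDL on stdin, writes the converted DDL on stdout. Requires GNU sed.
td2hive() {
  sed -e 's/CREATE MULTISET TABLE/CREATE TABLE/' \
      -e 's/CREATE SET TABLE/CREATE TABLE/' \
      -e 's/\bINTEGER\b/INT/g' \
      -e 's/\bBYTEINT\b/TINYINT/g' \
      -e 's/TIMESTAMP([0-9])/TIMESTAMP/g' \
      -e 's/^);$/) STORED AS PARQUET;/'
}

# Hypothetical Teradata DDL used as a demo input:
td2hive <<'EOF'
CREATE MULTISET TABLE sales.orders (
  order_id INTEGER,
  order_ts TIMESTAMP(0),
  amount DECIMAL(18,2)
);
EOF
```

This prints a Hive-style `CREATE TABLE ... STORED AS PARQUET;` statement for the sample table; a production version would also need to handle character sets, defaults, primary indexes, and the other Teradata-only clauses.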

Thanks
Asim

Sent from my iPhone


Re: Teradata into hadoop Migration

Posted by Arun Natva <ar...@gmail.com>.
Bhagaban,
The first step is to ingest the data into Hadoop using Sqoop.
Teradata also has powerful connectors for Hadoop; the connectors are installed on all data nodes, and imports are then run using FastExport and similar utilities.

The challenge will be recreating in Hadoop the workflows you had in Teradata.

Teradata is rich in features compared to Hive and Impala.

Data in Teradata is usually encrypted, so please make sure you have HDFS encryption at rest enabled.

You can use Oozie to create a chain of SQL steps to mimic your ETL jobs written in DataStage, Informatica, or TD itself.
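
As a trivial sketch of that chaining (the step file names are hypothetical; in practice each step would typically be a Hive action inside an Oozie workflow, and `hive` is only echoed here because executing it needs a live cluster):

```shell
#!/bin/sh
# Run the translated SQL steps strictly in order, stopping on the first
# failure, to mimic a sequential ETL job.
set -e
STEPS="010_stage_orders.hql 020_transform_orders.hql 030_load_mart.hql"
for step in $STEPS; do
  echo hive -f "$step"   # dry run; drop `echo` to actually execute each step
done
```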

Please note that TD may perform better than Hadoop, since it runs on efficient proprietary hardware and software; Hadoop, however, can save you money.


Sent from my iPhone


Re: Teradata into hadoop Migration

Posted by praveenesh kumar <pr...@gmail.com>.
From the TD perspective, have a look at this: https://youtu.be/NTTQdAfZMJA
They are planning to open-source it. Perhaps you can get in touch with the
team; let me know if you are interested. If you have TD contacts, ask them
about this, and they should be able to point you to the right people.

Again, this is not a sales pitch. The tool looks like what you are looking
for and will be open source soon. Let me know if you want to get in touch
with the folks who are working on this.

Regards
Prav


Re: Teradata into hadoop Migration

Posted by Wei-Chiu Chuang <we...@cloudera.com>.
Hi,

I think Cloudera Navigator Optimizer is the tool you are looking for. It lets you translate Teradata SQL queries into Impala and Hive SQL.
http://blog.cloudera.com/blog/2015/11/introducing-cloudera-navigator-optimizer-for-optimal-sql-workload-efficiency-on-apache-hadoop/
Hope this doesn’t sound like a sales pitch. If you’re a Cloudera paid customer you should reach out to the account/support team for more information.

*disclaimer: I work for Cloudera

Wei-Chiu Chuang
A very happy Clouderan


Re: Teradata into hadoop Migration

Posted by Rakesh Radhakrishnan <ra...@apache.org>.
Sorry, I don't have much insight into this beyond basic Sqoop. AFAIK, it is
mostly vendor-specific; you may need to dig further along that line.

Thanks,
Rakesh


Re: Teradata into hadoop Migration

Posted by Bhagaban Khatai <em...@gmail.com>.
Thanks, Rakesh, for the useful information. We are already using Sqoop for
the data transfer, and we are implementing all the TD logic in Hive.
But that is taking time, because we are hand-implementing the same logic
from the mapping provided by the TD team.

What I want is a tool or ready-made framework so that the development effort
would be less.

Thanks in advance for your help.

Bhagaban


Re: Teradata into hadoop Migration

Posted by "Sudhir.Kumar" <Su...@target.com>.
Hi Bhagaban,

Data migration can be achieved with Sqoop. However, if you are also looking to migrate the ETL/ELT application, then you will have to look into converting the ELT SQL into code on a MapReduce-based framework. You could build a conversion tool for that.

Thanks,

Sudhir Kumar

“Your present circumstances don’t determine where you can go; they merely determine where you start”. — Nido Qubein




Re: Teradata into hadoop Migration

Posted by Rakesh Radhakrishnan <ra...@apache.org>.
Hi Bhagaban,

Perhaps you can try Apache Sqoop to transfer the data from Teradata to
Hadoop. Sqoop provides an efficient approach for transferring bulk data
between Hadoop and structured data stores. Support for a particular data
store is added as a so-called connector, and connectors exist for various
databases, including Oracle.
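
For example, a single-table Sqoop import over generic JDBC might look roughly like the command below. The host, database, table, and paths are all hypothetical, and the command is only echoed (dry run), because really running it needs a reachable Teradata system and the Teradata JDBC driver on Sqoop's classpath:

```shell
#!/bin/sh
# Sketch of a Sqoop import from Teradata over generic JDBC.
TD_HOST=td.example.com     # hypothetical Teradata host
TD_DB=sales                # hypothetical database
CMD="sqoop import \
  --connect jdbc:teradata://${TD_HOST}/DATABASE=${TD_DB} \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user \
  --password-file /user/etl/td.password \
  --table ORDERS \
  --target-dir /data/staging/orders \
  --num-mappers 8"
echo "$CMD"   # dry run; execute the command itself on a real cluster
```

With a vendor connector (e.g. the Cloudera or Hortonworks Teradata connector from the links below), the `--driver` line is usually dropped in favor of the connector's own connection manager.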

I hope the links below will be helpful to you:
http://sqoop.apache.org/
http://blog.cloudera.com/blog/2012/01/cloudera-connector-for-teradata-1-0-0/
http://hortonworks.com/blog/round-trip-data-enrichment-teradata-hadoop/
http://dataconomy.com/wp-content/uploads/2014/06/Syncsort-A-123ApproachtoTeradataOffloadwithHadoop.pdf

Below are a few data ingestion tools you could dig into further:
https://www.datatorrent.com/product/datatorrent-ingestion/
https://www.datatorrent.com/dtingest-unified-streaming-batch-data-ingestion-hadoop/

Thanks,
Rakesh
