Posted to mapreduce-user@hadoop.apache.org by Ravi Prasad <ra...@gmail.com> on 2014/10/24 13:05:13 UTC
How to automate the Sqoop script in Production environment
Hi all,

1) Can anyone please suggest how to automate Sqoop scripts in a production
environment?

I need to import data from Oracle tables into Hadoop Hive tables using the
script below:

sqoop import --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB> \
  --username username --password password \
  --table <tablename> --columns column1,column2,column3 \
  --hive-import --hive-overwrite --hive-table default.oracreport \
  --lines-terminated-by '\n' --fields-terminated-by ',' \
  --target-dir /user/hdfs/

2) Is there any way to hide the password?
----------------------------------------------
Regards,
RAVI PRASAD. T
Re: How to automate the Sqoop script in Production environment
Posted by Ravi Prasad <ra...@gmail.com>.
Thanks a lot Karthik, Girish and Laurent
On Fri, Oct 24, 2014 at 9:30 PM, Laurent H <la...@gmail.com> wrote:
> That's right, it's better to use the Oozie scheduler for your production
> environment! (You can easily check job status & logs.) Check the link
> below: http://oozie.apache.org/docs/4.0.0/DG_SqoopActionExtension.html
>
>
>
> --
> Laurent HATIER - Consultant Big Data & Business Intelligence chez CapGemini
> fr.linkedin.com/pub/laurent-hatier/25/36b/a86/
> <http://fr.linkedin.com/pub/laurent-h/25/36b/a86/>
>
> 2014-10-24 17:38 GMT+02:00 Girish Lingappa <gl...@pivotal.io>:
>
>> Ravi
>> If you are using Oozie in your production environment, one option is to
>> plug your Sqoop job into the Oozie workflow XML using the Oozie Sqoop action.
>>
>> Thanks
>> Girish
>>
>> Sent from my iPhone
>>
>> > On Oct 24, 2014, at 4:17 AM, Dhandapani, Karthik
>> <Ka...@CVSCaremark.com> wrote:
>> >
>> > Hi,
>> >
>> > There is an option.
>> >
>> > Use --password-file <path> to set the path to a file containing the
>> authentication password.
>> >
>> > http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html
>> >
>> > All the dynamic parameter values can be passed in as Unix variables to
>> automate the Sqoop script for different tables. Copy the script below into a
>> .sh file and run it from any scheduler.
>> >
>> > Thanks,
>> > Karthik
>> > ________________________________________
>> > From: Ravi Prasad [raviprasad29@gmail.com]
>> > Sent: Friday, October 24, 2014 7:05 AM
>> > To: user@hadoop.apache.org
>> > Subject: How to automate the Sqoop script in Production environment
>> >
>> > Hi all,
>> >
>> > 1) Can anyone please suggest how to automate Sqoop scripts in
>> a production environment?
>> >
>> > I need to import data from Oracle tables into Hadoop Hive tables using
>> the script below.
>> >
>> > sqoop import --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB>
>> --username username --password password --table <tablename> --columns
>> column1,column2,column3 --hive-import --hive-overwrite --hive-table
>> default.oracreport --lines-terminated-by '\n' --fields-terminated-by ','
>> --target-dir /user/hdfs/
>> >
>> > 2) Is there any way to hide the password?
>> >
>> > ----------------------------------------------
>> > Regards,
>> > RAVI PRASAD. T
>>
>
>
--
----------------------------------------------
Regards,
RAVI PRASAD. T
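Karthik's advice above (pass the dynamic values in as Unix variables, and use --password-file) can be sketched as a small wrapper script. Everything here is illustrative: the host, database, user, table, and password-file path are hypothetical placeholders, and the script only prints the sqoop command it would run, so it is safe to dry-run anywhere.

```shell
#!/bin/sh
# Hypothetical wrapper around the sqoop import from this thread.
# All values are placeholders; override them via environment variables
# set by your scheduler (cron, Oozie shell action, etc.).
DB_HOST="${DB_HOST:-dbhost}"
DB_NAME="${DB_NAME:-ORCL}"
DB_USER="${DB_USER:-scott}"
TABLE="${TABLE:-ORAC_REPORT}"
COLUMNS="${COLUMNS:-column1,column2,column3}"
PASSWORD_FILE="${PASSWORD_FILE:-/user/hdfs/.oracle-password}"

# --password-file replaces the plain-text --password, so the secret never
# appears on the command line, in the process list, or in shell history.
CMD="sqoop import --connect jdbc:oracle:thin:@${DB_HOST}:1521/${DB_NAME} \
 --username ${DB_USER} --password-file ${PASSWORD_FILE} \
 --table ${TABLE} --columns ${COLUMNS} \
 --hive-import --hive-overwrite --hive-table default.oracreport \
 --lines-terminated-by '\n' --fields-terminated-by ',' \
 --target-dir /user/hdfs/"

# Printed instead of executed so the sketch can be inspected first;
# in a real job, invoke sqoop directly with these arguments.
printf '%s\n' "$CMD"
```

The same script then serves every table: the scheduler entry only changes the exported variables, not the script itself.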
Re: How to automate the Sqoop script in Production environment
Posted by Laurent H <la...@gmail.com>.
That's right, it's better to use the Oozie scheduler for your production
environment! (You can easily check job status & logs.) Check the link
below: http://oozie.apache.org/docs/4.0.0/DG_SqoopActionExtension.html
--
Laurent HATIER - Consultant Big Data & Business Intelligence chez CapGemini
fr.linkedin.com/pub/laurent-hatier/25/36b/a86/
<http://fr.linkedin.com/pub/laurent-h/25/36b/a86/>
2014-10-24 17:38 GMT+02:00 Girish Lingappa <gl...@pivotal.io>:
> Ravi
> If you are using Oozie in your production environment, one option is to
> plug your Sqoop job into the Oozie workflow XML using the Oozie Sqoop action.
>
> Thanks
> Girish
>
> Sent from my iPhone
>
> > On Oct 24, 2014, at 4:17 AM, Dhandapani, Karthik
> <Ka...@CVSCaremark.com> wrote:
> >
> > Hi,
> >
> > There is an option.
> >
> > Use --password-file <path> to set the path to a file containing the
> authentication password.
> >
> > http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html
> >
> > All the dynamic parameter values can be passed in as Unix variables to
> automate the Sqoop script for different tables. Copy the script below into a
> .sh file and run it from any scheduler.
> >
> > Thanks,
> > Karthik
> > ________________________________________
> > From: Ravi Prasad [raviprasad29@gmail.com]
> > Sent: Friday, October 24, 2014 7:05 AM
> > To: user@hadoop.apache.org
> > Subject: How to automate the Sqoop script in Production environment
> >
> > Hi all,
> >
> > 1) Can anyone please suggest how to automate Sqoop scripts in
> a production environment?
> >
> > I need to import data from Oracle tables into Hadoop Hive tables using
> the script below.
> >
> > sqoop import --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB>
> --username username --password password --table <tablename> --columns
> column1,column2,column3 --hive-import --hive-overwrite --hive-table
> default.oracreport --lines-terminated-by '\n' --fields-terminated-by ','
> --target-dir /user/hdfs/
> >
> > 2) Is there any way to hide the password?
> >
> > ----------------------------------------------
> > Regards,
> > RAVI PRASAD. T
>
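Laurent and Girish both point at the Oozie Sqoop action for scheduling. A minimal workflow sketch, based on the sqoop-action extension docs linked above, might look like the following. The workflow name, the `${...}` properties, and the paths are assumptions; in a real deployment they come from a job.properties file, and a Hive import typically also needs hive-site.xml shipped to the action via a <file> element.

```xml
<workflow-app xmlns="uri:oozie:workflow:0.4" name="oracle-to-hive-wf">
  <start to="sqoop-import"/>
  <action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.4">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <!-- Same import as in the thread, with --password-file instead of a
           plain-text password; ${dbHost}, ${dbName}, ${dbUser}, ${table},
           and ${passwordFile} are supplied by job.properties. -->
      <command>import --connect jdbc:oracle:thin:@${dbHost}:1521/${dbName} --username ${dbUser} --password-file ${passwordFile} --table ${table} --hive-import --hive-overwrite --hive-table default.oracreport --target-dir /user/hdfs/</command>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Sqoop import failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

With this in place, an Oozie coordinator can run the import on a schedule, and the Oozie web console gives the job status and logs Laurent mentions.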
Re: How to automate the Sqoop script in Production environment
Posted by Girish Lingappa <gl...@pivotal.io>.
Ravi
If you are using Oozie in your production environment, one option is to plug your Sqoop job into the Oozie workflow XML using the Oozie Sqoop action.
Thanks
Girish
Sent from my iPhone
> On Oct 24, 2014, at 4:17 AM, Dhandapani, Karthik <Ka...@CVSCaremark.com> wrote:
>
> Hi,
>
> There is an option.
>
> Use --password-file <path> to set the path to a file containing the authentication password.
>
> http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html
>
> All the dynamic parameter values can be passed in as Unix variables to automate the Sqoop script for different tables. Copy the script below into a .sh file and run it from any scheduler.
>
> Thanks,
> Karthik
> ________________________________________
> From: Ravi Prasad [raviprasad29@gmail.com]
> Sent: Friday, October 24, 2014 7:05 AM
> To: user@hadoop.apache.org
> Subject: How to automate the Sqoop script in Production environment
>
> Hi all,
>
> 1) Can anyone please suggest how to automate Sqoop scripts in a production environment?
>
> I need to import data from Oracle tables into Hadoop Hive tables using the script below.
>
> sqoop import --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB> --username username --password password --table <tablename> --columns column1,column2,column3 --hive-import --hive-overwrite --hive-table default.oracreport --lines-terminated-by '\n' --fields-terminated-by ',' --target-dir /user/hdfs/
>
> 2) Is there any way to hide the password?
>
> ----------------------------------------------
> Regards,
> RAVI PRASAD. T
RE: How to automate the Sqoop script in Production environment
Posted by "Dhandapani, Karthik" <Ka...@CVSCaremark.com>.
Hi,
There is an option.
Use --password-file <path> to set the path to a file containing the authentication password.
http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html
All the dynamic parameter values can be passed in as Unix variables to automate the Sqoop script for different tables. Copy the script below into a .sh file and run it from any scheduler.
Thanks,
Karthik
________________________________________
From: Ravi Prasad [raviprasad29@gmail.com]
Sent: Friday, October 24, 2014 7:05 AM
To: user@hadoop.apache.org
Subject: How to automate the Sqoop script in Production environment
Hi all,
1) Can anyone please suggest how to automate Sqoop scripts in a production environment?
I need to import data from Oracle tables into Hadoop Hive tables using the script below.
sqoop import --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB> --username username --password password --table <tablename> --columns column1,column2,column3 --hive-import --hive-overwrite --hive-table default.oracreport --lines-terminated-by '\n' --fields-terminated-by ',' --target-dir /user/hdfs/
2) Is there any way to hide the password?
----------------------------------------------
Regards,
RAVI PRASAD. T
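Karthik's --password-file tip answers question 2, and creating the file looks roughly like this. The file name and password below are made-up examples; the HDFS upload is shown only as comments because it needs a live cluster, and the Sqoop 1.4.4 user guide notes the file should contain just the password, so printf is used to avoid a trailing newline.

```shell
#!/bin/sh
# Create a local password file (example secret, example file name).
umask 077                                   # new files readable by owner only
printf '%s' 'ExamplePassword' > oracle.pwd  # printf avoids a trailing newline
chmod 400 oracle.pwd                        # read-only for the owner

# Put it on HDFS so the Sqoop mappers can read it, then reference it.
# (Commented out: these steps need a running cluster.)
#   hdfs dfs -put oracle.pwd /user/hdfs/.oracle.pwd
#   hdfs dfs -chmod 400 /user/hdfs/.oracle.pwd
#   sqoop import ... --username username --password-file /user/hdfs/.oracle.pwd ...
```

With the file locked down to mode 400, the password no longer appears in the command line, `ps` output, or shell history.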