Posted to user@oozie.apache.org by Panshul Whisper <ou...@gmail.com> on 2013/04/17 13:01:05 UTC

executing workflow having pig script stored on s3

Hello,

I am trying to create a workflow that executes a Pig script stored in an S3 bucket.

What configuration do I need for this to work? In particular, where do I provide
the AWS access key and AWS secret key?

I can successfully run the pig script from the shell using:
pig s3n://bucketname/foldername/scriptname.pig
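
For reference, the kind of workflow action I am trying to write looks roughly
like this; the fs.s3n.* credential properties and the s3n:// script path are my
own guess at where the keys would go, which is exactly the part I am unsure about:

    <workflow-app name="pig-from-s3" xmlns="uri:oozie:workflow:0.2">
        <start to="pig-node"/>
        <action name="pig-node">
            <pig>
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <configuration>
                    <!-- Hadoop s3n credential properties; values passed in
                         from job.properties instead of being hard-coded -->
                    <property>
                        <name>fs.s3n.awsAccessKeyId</name>
                        <value>${awsAccessKeyId}</value>
                    </property>
                    <property>
                        <name>fs.s3n.awsSecretAccessKey</name>
                        <value>${awsSecretAccessKey}</value>
                    </property>
                </configuration>
                <script>s3n://bucketname/foldername/scriptname.pig</script>
            </pig>
            <ok to="end"/>
            <error to="fail"/>
        </action>
        <kill name="fail">
            <message>Pig failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
        </kill>
        <end name="end"/>
    </workflow-app>

I do not know whether the <script> element accepts a full s3n:// URI like this,
or whether the script has to sit next to workflow.xml in the application directory.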

Thank you,

-- 
Regards,
Ouch Whisper
010101010101

Re: executing workflow having pig script stored on s3

Posted by Mohammad Islam <mi...@yahoo.com>.
Hi Panshul,
I have never worked with S3, but I hope someone here can help you.

A few questions:
Did you try it, and what problem did you run into?
Is it possible to try without security first?
Were you able to run anything from S3 through Oozie before, or is this your first try?
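
Something along these lines might be a starting point for that first try, though
I have not tested it with S3 myself. The host names and paths below are
placeholders, and keeping workflow.xml on HDFS sidesteps the question of whether
Oozie can read the application itself from S3:

    # job.properties (illustrative values only)
    nameNode=hdfs://namenode-host:8020
    jobTracker=jobtracker-host:8021
    queueName=default
    oozie.wf.application.path=${nameNode}/user/panshul/apps/pig-from-s3
    awsAccessKeyId=YOUR_ACCESS_KEY
    awsSecretAccessKey=YOUR_SECRET_KEY

    # submit and run the workflow
    oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run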


Regards,
Mohammad

