Posted to user@pig.apache.org by Russell Jurney <ru...@gmail.com> on 2012/09/10 23:10:24 UTC

Pig Committer Trading Cards

http://hortonworks.com/blog/twitter-analytics-presents-hadoop-and-pig-at-uc-berkeley/


I think these lectures were posted before, but I thought Pig users might
find this as amusing as I do :) You can now print and trade Pig committers
Alan Gates, Jonathan Coveney and Bill Graham. Collect them all!

-- 
Russell Jurney twitter.com/rjurney russell.jurney@gmail.com datasyndrome.com

Re: loading data to mysql using pig

Posted by Ruslan Al-Fakikh <me...@gmail.com>.
Hi,

Probably DBStorage is more convenient (I haven't tried it), but you
can also use Sqoop if you are OK with storing the data to HDFS first
and then using Sqoop to insert it into MySQL.
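
For reference, a rough sketch of the DBStorage route, assuming the
piggybank DBStorage constructor that takes a JDBC driver, URL,
credentials and an INSERT statement (the jar paths, connection string,
table and schema below are placeholders; check the piggybank build you
actually have):

  REGISTER /path/to/piggybank.jar;
  REGISTER /path/to/mysql-connector-java.jar;

  -- toy schema, for illustration only
  data = LOAD 'hdfs:///user/me/input' USING PigStorage('\t')
         AS (id:int, name:chararray);

  -- the store location is effectively a placeholder here; DBStorage
  -- writes to the database through the INSERT statement instead
  STORE data INTO 'unused' USING org.apache.pig.piggybank.storage.DBStorage(
      'com.mysql.jdbc.Driver',
      'jdbc:mysql://dbhost/mydb',
      'dbuser', 'dbpass',
      'INSERT INTO mytable (id, name) VALUES (?, ?)');

Each parallel task that runs the STORE opens its own connection to
MySQL, which is where the load on the database comes from.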

Ruslan

On Tue, Sep 11, 2012 at 2:26 AM, Ranjith <ra...@gmail.com> wrote:
> Question for you Pig experts: I'm trying to determine the best way of inserting data into MySQL using Pig. Is DBStorage the best way to do this? I have also read that one needs to be careful not to effectively mount a denial-of-service attack against the database. Can we minimize this risk by setting Hadoop parameters that limit the number of mappers?

loading data to mysql using pig

Posted by Ranjith <ra...@gmail.com>.
Question for you Pig experts: I'm trying to determine the best way of inserting data into MySQL using Pig. Is DBStorage the best way to do this? I have also read that one needs to be careful not to effectively mount a denial-of-service attack against the database. Can we minimize this risk by setting Hadoop parameters that limit the number of mappers?
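
On the mapper question: a minimal sketch, assuming the STORE runs in
the map stage, is to let Pig combine small input splits so that fewer
map tasks (and therefore fewer database connections) run at once; if
the STORE ends up in a reduce stage, capping the parallelism serves the
same purpose. The values below are placeholders to tune for your
cluster and database:

  -- Combine small input splits so fewer map tasks (and DB connections) run.
  SET pig.splitCombination true;
  SET pig.maxCombinedSplitSize 1073741824;  -- roughly 1 GB of input per map task

  -- If the STORE happens after a GROUP/JOIN (i.e. in reducers), cap those instead.
  SET default_parallel 4;

If you take the Sqoop route instead, sqoop export accepts
--num-mappers (-m) to limit the number of concurrent export tasks.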