Posted to user@hawq.apache.org by Gregory Chase <gc...@pivotal.io> on 2016/12/09 19:53:25 UTC

[INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Dear HAWQs,

I thought it would be fun to get to know some of the other people in the
community.

My name is Greg Chase, and I run community development at Pivotal for the big
data open source communities that Pivotal contributes to.

Some of you may have seen my frequent emails about virtual events I help
organize for user and contributor education.

Not so long ago, I was in charge of product marketing for an in-memory data
warehouse named after a Hawaiian town, from a three-letter-acronymed German
company. We treated Hadoop as an external table, and returning results from
those queries was both slow and brittle due to network transfer rates.

So I have a special appreciation of the innovation that has gone into
creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.

These days I'm much more of a marketer than a coder, but I still love
hearing about the kinds of projects that HAWQ users are involved in.

I know we'd all love to hear more about everyone else's projects, and how
you became a HAWQ user.  So please introduce yourselves!

-- 
Greg Chase

Global Head, Big Data Communities
http://www.pivotal.io/big-data

Pivotal Software
http://www.pivotal.io/

650-215-0477
@GregChase
Blog: http://geekmarketing.biz/

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Ruilong Huo <rh...@pivotal.io>.
Hi David,

Are you unsubscribing because the list is not relevant to your interests, or
for some other reason? It is hard for us to make the list better without your
feedback. Thanks.

To unsubscribe, please send an email to
user-unsubscribe@hawq.incubator.apache.org and then confirm by replying to
the email you receive.
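
For example, here is a minimal sketch of sending that unsubscribe message
programmatically with Python's standard smtplib, purely as an illustration
(the sender address and the localhost SMTP server are assumptions, not part
of the list's instructions):

    # Minimal sketch: send an empty message to the list's unsubscribe address.
    # Assumes an SMTP server is reachable on localhost and that the From
    # address is the one you subscribed with (both are assumptions).
    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "you@example.com"  # hypothetical: use your subscribed address
    msg["To"] = "user-unsubscribe@hawq.incubator.apache.org"
    msg["Subject"] = "unsubscribe"   # subject and body content are not significant

    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

The list manager will then send you a confirmation email; replying to it
completes the process.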

Best regards,
Ruilong Huo

On Tue, Dec 20, 2016 at 11:08 PM, David Wierz <dj...@theocillc.com> wrote:

> Good Morning –
>
>
>
> Please unsubscribe me from this listserv
>
>
>
> Thank you
>
>
>
> *David*
>
>
>
> David J Wierz
>
> Senior Principal | Market Strategy & Finance
>
>
> djwierz@theocillc.com | +1.484.432.8075
>
> 387 Hidden Farm Drive | Exton, PA  19341-1185
>
>
>
> The information contained in this e-mail is confidential and may be
> privileged. It may be read, copied and used only by the intended recipient.
> If you have received it in error, please contact the sender immediately and
> delete the material from any computer
>
>
>
> *From:* Hong [mailto:xunzhang@apache.org]
> *Sent:* Tuesday, 20 December, 2016 10:06
> *To:* user@hawq.incubator.apache.org
> *Subject:* Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache
> HAWQ community
>
>
>
> Dear all,
>
>
>
> My name is HongWu (xunzhang is my ID on the Internet). I began contributing
> to HAWQ this year and have now become a HAWQ committer. I have written
> nearly 100 commits
> <https://github.com/apache/incubator-hawq/commits?author=xunzhang> to
> make HAWQ better! I am a believer in open source, since I have benefited a
> lot from open source projects and know its true meaning. I think open source
> is more than just publishing the code. It is delivering comprehensive
> documents and tutorials, collecting user and developer feedback,
> establishing discussion and comparison, resolving confusion and issues,
> reproducing good or bad results, making a clear roadmap, keeping up
> long-term development, and treating the project as our own child. Moreover,
> the Apache Software Foundation is an elite community in the open source
> world, with lots of high-quality projects, developers, and users.
>
>
>
> I started my career in 2012 at Douban, a social network website connecting
> people through interests such as books, movies, music, photos, blogs, and
> so on. I worked as an algorithm engineer there, focused on developing
> large-scale machine learning platforms and optimizing the recommendation
> engine. I am the author of the Paracel <http://paracel.io/> open source
> project, a distributed computational framework designed for machine
> learning, graph algorithms, and scientific computing. Paracel aims to hide
> the distribution and communication details from model developers and data
> scientists, letting them concentrate on model development. During this
> period, I also created Plato, a real-time recommendation system. This work
> achieved outstanding results on the douban.fm product, and Plato changed
> the architecture of Douban's offline-processing approach to machine
> learning.
>
>
>
> It's a very exciting journey with HAWQ! In my view, a SQL engine is a super
> critical piece of infrastructure in the big data ecosystem. I have tried to
> design a parallel programming language named Girl (for various reasons it
> is still at a very early stage), but no matter how simple it ends up being,
> it requires developers to follow certain paradigms or programming
> idioms/patterns. SQL, on the other hand, is the only existing language that
> erases the distributed coding logic: we just write SQL.
>
>
>
> I definitely believe HAWQ can become one of the best SQL engines in the
> Hadoop ecosystem with all of our collaborative effort, including HAWQ
> users, HAWQ developers, HDB customers, Pivotal FTEs, HAWQ competitors,
> Apache mentors, and so on. I look forward to hearing from more people with
> diverse backgrounds joining the HAWQ family.
>
>
>
> Beers
>
>
>
> 2016-12-20 18:03 GMT+08:00 Wen Lin <wl...@pivotal.io>:
>
> Hi, All,
>
>
>
> My name is LinWen. I joined the Pivotal HAWQ Beijing team in November
> 2014. Before that, I was an engineer at VMware.
>
> I have implemented some features for HAWQ, like the new fault tolerance
> service, libyarn, etc.
>
> I believe HAWQ can be the best SQL-on-Hadoop engine with our joint
> effort.
>
>
>
> On Thu, Dec 15, 2016 at 11:03 PM, Lili Ma <li...@apache.org> wrote:
>
> Hello everyone,
>
>
>
> Glad to know everybody here:)
>
>
>
> I'm Lili Ma, from the Pivotal HAWQ R&D team in Beijing. I have been focusing
> on HAWQ development and product management since 2012, when I joined
> Pivotal. I have experienced and contributed to HAWQ's entire growth path,
> from birth through Alpha, 1.X, 2.0...
>
>
>
> My main areas in HAWQ cover three parts: 1) storage, such as internal table
> storage, HAWQ Input/OutputFormat, hawq extract/register, etc.;
> 2) dispatcher and interconnect; 3) security, including Ranger integration,
> Kerberos, and LDAP.
>
>
>
> Before Pivotal, I worked at IBM for more than 2 years, focused on providing
> data services inside our public cloud offering. The data services included
> RDS (relational data service), which can provision a distributed relational
> database based on DB2 Federation, and a NoSQL service based on HBase.
>
>
>
> I believe HAWQ can become more successful with our joint effort! Feel free
> to reach me or this mailing list with any HAWQ or other kinds of issues :)
>
>
>
> Thanks
>
> Lili
>
>
>
>
>
> 2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>:
>
> I will add to the email flow…
>
>
>
> I am Dan Baskette, Director of Tech Marketing for Pivotal, and I cover
> Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache MADlib. I
> started my career at Sun Microsystems, and have been working for
> EMC/Greenplum and now Pivotal since 2000…a LONG time in quite a number of
> roles. I was part of the team that launched Greenplum’s first Hadoop
> distribution and was around for the birth of HAWQ, or as we called it in
> its infancy, GOH or Greenplum on Hadoop. I have been actively running some
> webcasts on various HAWQ how-to topics for Hortonworks, so you can check
> those out on their site.
>
>
>
> Hoping this community really takes off in a big way!
>
>
>
> Dan
>
>
>
>
>
> On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io) wrote:
>
> Hi All,
>
>
>
> It's great that Gregory started this thread so people can get to know each
> other much better, at least in the Apache HAWQ community!
>
>
>
> I am Ruilong Huo, from the HDB/HAWQ engineering team at Pivotal. I came from
> Teradata before joining Pivotal. It's my honor to be part of the HAWQ
> project at its early stage. I am a fan of RDBMS (especially MPP databases),
> big data, and cloud technology that changes the IT infrastructure of
> enterprises and helps drive information transformation to a very large
> extent.
>
>
>
> I hope that with the joint effort of the HAWQ community, it will become an
> even greater product in the big data area, especially in the SQL-on-Hadoop
> category.
>
>
> Best regards,
>
> Ruilong Huo
>
>
>
> On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>
> wrote:
>
> Hello all,
>
>
>
> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
> new-ish here, and not so much from a coding background as from networking.
> I'm from Cisco Systems, where I focused on analytics use cases in
> telecommunications, particularly for mobile network operators, for service
> assurance, customer care, and customer profiling.  (also, as you're
> introducing yourselves, we'd love to hear what use cases you're involved
> with, too).
>
>
>
> About a year before I left, my group at Cisco acquired an MPP database of
> its own -- ParStream -- for its IoT and fog computing use cases, so it's
> interesting to come here and learn about the architecture and applications
> of HAWQ.
>
>
>
> I hope to help make your experience with HAWQ a good one.  If I can help
> in any way, please reach out to me directly or on the list.
>
>
>
> Cheers,
>
> Bob
>
>
>
>
>
>
> Bob Glithero | Product Marketing
>
> Pivotal, Inc.
>
> rglithero@pivotal.io | m: 415.341.5592
>
>
>
>
>
> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
> wrote:
>
> Greg, thanks for kicking off the roll call. Getting to know each other is
> super
> useful (and can be fun! ;-)). I'll go next:
>
> I am Roman (your friendly neighborhood mentor). I hang around a lot of ASF
> big data projects (as a committer and a PMC member), but lately I've been
> gravitating towards IoT as well (Apache Mynewt). I started my career at Sun
> Microsystems back at a time when Linux wasn't even 1.x, and I've been doing
> enterprise software ever since. I was lucky enough to get to work on the
> original Hadoop team at Yahoo! and fall in love with not one but two
> elephants (Hadoop and Postgres). Recently I've assumed the position of VP
> of Technology at ODPi and I'm still hoping to MHGA! My secret weapon is
> Apache Bigtop (which I co-founded) and I'm not afraid to use it!
>
> I'm here to help as much as I can to make sure that this community evolves
> into
> a vibrant, self-governed, exciting place worthy of being a top level
> project (TLP)
> at ASF. If you have any questions or ideas that you may want to bounce off
> of
> me -- please don't hesitate to reach out directly or on the mailing list.
>
> Thanks,
> Roman.

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Radar Da lei <rl...@pivotal.io>.
Hi All,

I'm Radar Lei from Pivotal Beijing. I work on HAWQ management tools and CI
for HAWQ. Before Pivotal, I worked at Teradata for several years.

Great to get to know you all through this opportunity. You are welcome to
reach out to me for any HAWQ discussion; I'm sure we can make HAWQ better
and better together.

Thanks.

Regards,
Radar


Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Ma Hongxu <in...@outlook.com>.
Hi All,

I'm Hongxu Ma, and I just joined Pivotal China two months ago. Before Pivotal, I worked at Baidu for many years, focused on search and infrastructure.
I am an open source fan, and I now work on libhdfs3 and Ranger integration on the HAWQ team.

Feel free to discuss with me, and let's make HAWQ better!

Regards,
hongxu


Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Yi Jin <yj...@pivotal.io>.
Hi All,

My name is Yi Jin. I am currently working for Pivotal in Sydney, Australia,
as a committer of Apache HAWQ. I designed and developed the new resource
manager, including the resource queue DDL interface, fault-tolerant cluster
resource management, YARN integration, and on-demand query resource
consumption management.

Before working on HAWQ at Pivotal, I worked for IBM as a developer of DB2
LUW (Linux, Unix, Windows), mainly in the parser and runtime components for
its data warehouse features. I then did and verified technical research work
on automatically optimizing critical database replication in a core banking
HA and DR (high availability and disaster recovery) solution, which IBM
delivered on DB2 z/OS at ICBC, the largest bank in China, and which is known
as its Active-Active banking HADR solution.

Based on my work on distributed resource management in HAWQ, I am now
pushing myself into every component of Apache HAWQ to pursue more
contributions to its success. I am really happy to have this opportunity to
get to know you all. Thank you!

Best,
Yi


Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Paul Guo <pa...@gmail.com>.
Hi guys,

I started working on Apache HAWQ about 8 months ago at Pivotal, and I'm
currently a committer of HAWQ. Before this, I mainly worked on the Unix/Linux
kernel along with various infrastructure software at Sun and later at an SV
startup. I then worked for over a year at an Internet company, until I found
it would be more interesting to work on infrastructure software that could
serve more users and serve them better. I'd be happy to discuss with anyone
who is interested in any system-software-related topics (besides HAWQ, of
course :-))

Thanks.


2016-12-21 17:11 GMT+08:00 Hubert Zhang <hz...@pivotal.io>:

> Hi, All,
>
> My name is Hubert Zhang. I worked on the HAWQ resource negotiator and the
> data locality feature, and I am now working on integrating HAWQ with Apache
> Ranger.
>
> For any details related to the above features, or any questions about HAWQ,
> please feel free to contact me.
>
>
> On Wed, Dec 21, 2016 at 1:31 PM, Michael Schubert <ms...@pivotal.io>
> wrote:
>
>> Hi David,
>>
>> Instructions to unsubscribe are at the bottom of every email. It's self
>> service; we can't unsubscribe for you.
>>
>> Cheers.
>>
>>>
>>>
>>>
>>> Glad to know everybody here:)
>>>
>>>
>>>
>>> I'm Lili Ma, from Pivotal HAWQ R&D team in Beijing. I have been focusing
>>> on HAWQ development and product management since 2012 when I joined
>>> Pivotal. I experienced and contributed in HAWQ's all growth path, from
>>> birth, Alpha, 1.X, 2.0...
>>>
>>>
>>>
>>> My main covering fields about HAWQ include three parts: 1) Storage such
>>> as internal table storage, HAWQ Input/OutputFormat, hawq
>>> extract/register,etc 2) Dispatcher and interconnect 3) Security including
>>> Ranger integration, Kerberos and LDAP.
>>>
>>>
>>>
>>> Before Pivotal, I worked at IBM for more than 2 years and focused on
>>> providing data service inside of our public cloud provision. The data
>>> service includes RDS(relational data service) which can provision a
>>> distributed relational database based on DB2 Federation, and NOSQL service
>>> which is based on HBase.
>>>
>>>
>>>
>>> I believe HAWQ can become more successful with our joint effort!
>>> Welcome to reach me or this mail list for any HAWQ or other kinds of issues
>>> :)
>>>
>>>
>>>
>>> Thanks
>>>
>>> Lili
>>>
>>>
>>>
>>>
>>>
>>> 2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>:
>>>
>>> I will add to the email flow…
>>>
>>>
>>>
>>> I am Dan Baskette, I am the Director of Tech Marketing for Pivotal and
>>> cover Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache
>>> MADlib.   I started my career at Sun Microsystems, and have been working
>>> for EMC/Greenplum and now Pivotal since 2000….a LONG time in quite a number
>>> of roles.   I was part of the team that launched Greenplum’s first Hadoop
>>> distribution and was around for the birth of HAWQ or as we called it when
>>> it was in it’s infancy…. GOH or Greenplum on Hadoop.   I have been actively
>>> running some webcasts on various HAWQ how-to topics for Hortonworks, so you
>>> can check those out on their site.
>>>
>>>
>>>
>>> Hoping this community really takes off in a big way!
>>>
>>>
>>>
>>> Dan
>>>
>>>
>>>
>>>
>>>
>>> On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io)
>>> wrote:
>>>
>>> Hi All,
>>>
>>>
>>>
>>> Great for Gregory to start the thread that people can know each other
>>> much better, at least in Apache HAWQ community!
>>>
>>>
>>>
>>> I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I
>>> am from Teradata and joined Pivotal after that. It's my honor to be part of
>>> HAWQ project at its early stage. I am a fan of RDBMS (especially MPP
>>> database), big data, and cloud technology that changes the IT
>>> infrastructure of the enterprises and helps to do information
>>> transformation in a very large extent.
>>>
>>>
>>>
>>> I hope that with joint effort from hawq community, it will become even
>>> greater product in big data area, especially in SQL-on-Hadoop category.
>>>
>>>
>>> Best regards,
>>>
>>> Ruilong Huo
>>>
>>>
>>>
>>> On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>
>>> wrote:
>>>
>>> Hello all,
>>>
>>>
>>>
>>> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
>>> new-ish here, and not so much from a coding background as from networking.
>>> I'm from Cisco Systems, where I focused on analytics use cases in
>>> telecommunications, particularly for mobile network operators, for service
>>> assurance, customer care, and customer profiling.  (also, as you're
>>> introducing yourselves, we'd love to hear what use cases you're involved
>>> with, too).
>>>
>>>
>>>
>>> About a year before I left my group at Cisco acquired an MPP database of
>>> its own -- ParStream -- for its IoT and fog computing use cases, so it's
>>> interesting to come here and learn about the architecture and applications
>>> of HAWQ.
>>>
>>>
>>>
>>> I hope to help make your experience with HAWQ a good one.  If I can help
>>> in any way, please reach out to me directly or on the list.
>>>
>>>
>>>
>>> Cheers,
>>>
>>> Bob
>>>
>>>
>>>
>>>
>>>
>>>
>>> Bob Glithero | Product Marketing
>>>
>>> Pivotal, Inc.
>>>
>>> rglithero@pivotal.io | m: 415.341.5592
>>>
>>>
>>>
>>>
>>>
>>> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
>>> wrote:
>>>
>>> Greg, thanks for kicking off the roll call. Getting to know each other
>>> is super
>>> useful (and can be fun! ;-)). I'll go next:
>>>
>>> I am Roman (your friendly neighborhood mentor). I hang around a lot of
>>> ASF
>>> big data projects (as a committer and a PMC member), but lately I've been
>>> gravitating towards IoT as well (Apache Mynewt). I started my career at
>>> Sun
>>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>>> doing
>>> enterprise software ever since. I was lucky enough to get to work on
>>> the original
>>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>>> (Hadoop
>>> and Postgres). Recently I've assumed a position of VP of Technology at
>>> ODPi
>>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>>> co-founded)
>>> and I'm not afraid to use it!
>>>
>>> I'm here to help as much as I can to make sure that this community
>>> evolves into
>>> a vibrant, self-governed, exciting place worthy of being a top level
>>> project (TLP)
>>> at ASF. If you have any questions or ideas that you may want to bounce
>>> off of
>>> me -- please don't hesitate to reach out directly or on the mailing list.
>>>
>>> Thanks,
>>> Roman.
>>>
>>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io>
>>> wrote:
>>> >
>>> > Dear HAWQs,
>>> >
>>> > I thought it would be fun to get to know some of the other people in
>>> the community.
>>> >
>>> > My name is Greg Chase and I run community development for Pivotal for
>>> big data open source communities that Pivotal contributes to.
>>> >
>>> > Some of you may have seen my frequent emails about virtual events I
>>> help organize for user and contributor education.
>>> >
>>> > Not so long ago, I was in charge of product marketing for an in-memory
>>> data warehouse named after a Hawaiian town from a three-letter acronymed
>>> German Company. We treated Hadoop as an external table, and returning
>>> results from these queries was both slow and brittle due to the network
>>> transfer rates.
>>> >
>>> > So I have a special appreciation of the innovation that has gone into
>>> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>>> >
>>> > These days I'm much more of a marketer than a coder, but I still love
>>> hearing about the kinds of projects that HAWQ users are involved in.
>>> >
>>> > I know we'd all love to hear more about everyone else's projects, and
>>> how you became a HAWQ user.  So please introduce yourselves!
>>> >
>>>
>>> > --
>>> > Greg Chase
>>> >
>>> > Global Head, Big Data Communities
>>> > http://www.pivotal.io/big-data
>>> >
>>> > Pivotal Software
>>> > http://www.pivotal.io/
>>> >
>>> > 650-215-0477
>>> > @GregChase
>>> > Blog: http://geekmarketing.biz/
>>> >
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>
>
> --
> Thanks
>
> Hubert Zhang
>

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Xiang Sheng <xs...@pivotal.io>.
Hi All,

I'm Xiang from Pivotal China. I got my master's degree in June this year
and joined the HAWQ Beijing team in July. Before going full time at
Pivotal, I was a Pivotal intern on the HAWQ Beijing team for a year,
focused on testing data locality, the segment policy, the DFS metadata
cache, and long-running performance tests.
Currently I'm working on HAWQ's integration with Apache Ranger, and I'm
familiar with the HAWQ resource manager and HAWQ register.
Looking forward to contributing more to the Apache HAWQ community.

Regards,
Xiang

On Wed, Dec 21, 2016 at 5:19 PM, Ivan Weng <iw...@pivotal.io> wrote:

> Hi guys,
>
> I'm Ivan from Pivotal China. I worked on Interconnect and DFS metadata
> cache related component in HAWQ. Before Pivotal, I worked in Baidu for
> three years. Looking forward to have more discussion about SQL on Hadoop in
> the community and make HAWQ better.
>
>
> Regards,
> Ivan
>
> On Wed, Dec 21, 2016 at 5:11 PM, Hubert Zhang <hz...@pivotal.io> wrote:
>
>> Hi, All,
>>
>> My name is Hubert Zhang. I worked on HAWQ resource negotiator and data
>> locality feature and now working on HAWQ integrated with Apache Ranger.
>>
>> For any details related to the upper features and questions about HAWQ,
>> please feel free to contact me.
>>
>>
>> On Wed, Dec 21, 2016 at 1:31 PM, Michael Schubert <ms...@pivotal.io>
>> wrote:
>>
>>> Hi David,
>>>
>>> instructions to unsubscribe are at the bottom of every email. It's self
>>> service; we can't unsubscribe for you.
>>>
>>> Cheers.
>>>
>>>
>>>
>>> On Tue, Dec 20, 2016 at 5:09 AM -1000, "David Wierz" <
>>> djwierz@theocillc.com> wrote:
>>>
>>> Good Morning –
>>>>
>>>>
>>>>
>>>> Please unsubscribe me from this listserv
>>>>
>>>>
>>>>
>>>> Thank you
>>>>
>>>>
>>>>
>>>> *David*
>>>>
>>>>
>>>>
>>>> David J Wierz
>>>>
>>>> Senior Principal | Market Strategy & Finance
>>>>
>>>> [image: OCI_black_rgb]
>>>>
>>>> djwierz@theocillc.com | +1.484.432.8075 <(484)%20432-8075>
>>>>
>>>> 387 Hidden Farm Drive | Exton, PA  19341-1185
>>>>
>>>>
>>>>
>>>> The information contained in this e-mail is confidential and may be
>>>> privileged. It may be read, copied and used only by the intended recipient.
>>>> If you have received it in error, please contact the sender immediately and
>>>> delete the material from any computer
>>>>
>>>>
>>>>
>>>> *From:* Hong [mailto:xunzhang@apache.org]
>>>> *Sent:* Tuesday, 20 December, 2016 10:06
>>>> *To:* user@hawq.incubator.apache.org
>>>> *Subject:* Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache
>>>> HAWQ community
>>>>
>>>>
>>>>
>>>> Dear all,
>>>>
>>>>
>>>>
>>>> My name is HongWu(xunzhang is my ID on the Internet). I began to
>>>> contribute to HAWQ this year and now become the HAWQ committer. I wrote
>>>> near 100 commits
>>>> <https://github.com/apache/incubator-hawq/commits?author=xunzhang> to
>>>> make HAWQ better! I am a believer of open source since I benefit a lot from
>>>> open source projects and I know the true meaning of it. I think open source
>>>> is more than just publishing the code. It is delivering comprehensive
>>>> documents/tutorials, collecting users/developers/feedbacks, establish
>>>> discussion/comparison, solving confusions/issues, reproducing good or bad
>>>> results, making clear roadmap, keeping long-term development, treating the
>>>> project as our own child. Moreover, Apache Software Foundation is an elite
>>>> community in the open source world, there are lots of high-quality
>>>> projects, developers and users.
>>>>
>>>>
>>>>
>>>> I started my career 2012 at Douban which is a social network website
>>>> connecting people with interest such as book, movie, music, photo, blog and
>>>> so on. I was working as an algorithm engineer there, focused on developing
>>>> large-scale machine learning platforms and optimizing recommendation
>>>> engine. I am the author of Paracel <http://paracel.io/> open source
>>>> project, which is a distributed computational framework designed for
>>>> machine learning, graph algorithms and scientific computing. Paracel wants
>>>> to simplify the distributed and communication details for model developers
>>>> and data scientists, lets them be concentrated on model developing. During
>>>> this period, I also created the Plato: a real-time recommendation system.
>>>> This work got very outstanding results on douban.fm product. Plato
>>>> changed the architecture of Douban's offline processing way to do machine
>>>> learning.
>>>>
>>>>
>>>>
>>>> It's a very exciting jouney with HAWQ! In my point of view, SQL engine
>>>> is a super critical infrastructure in the big data ecosystem. I have tried
>>>> to design a parallel programming language named Girl(for some reason it's
>>>> still at very early stage), but no matter how simple it will be, it
>>>> supposes developers following some paradigms or programming
>>>> idioms/patterns. But SQL on the other side, is the only existing language
>>>> that erases the distributed coding logic: we just write SQLs.
>>>>
>>>>
>>>>
>>>> I definitely believe HAWQ could be one of the best SQL engines on
>>>> Hadoop ecosystem with all of our collaborative effort including HAWQ users,
>>>> HAWQ developers, HDB custormers, Pivotal FTEs, HAWQ competitors, Apache
>>>> mentors and so on. Look forward to hear more people with diversed
>>>> background joining HAWQ family.
>>>>
>>>>
>>>>
>>>> Beers
>>>>
>>>>
>>>>
>>>> 2016-12-20 18:03 GMT+08:00 Wen Lin <wl...@pivotal.io>:
>>>>
>>>> Hi, All,
>>>>
>>>>
>>>>
>>>> My name is LinWen. I joined Pivotal HAWQ Beijing team since November,
>>>> 2014. Before that, I am an engineer of VMware.
>>>>
>>>> I have implemented some features for HAWQ, like new fault tolerance
>>>> service, libyarn, etc.
>>>>
>>>> I believe HAWQ can be the best SQL on Hadoop engine with our joint
>>>> effort.
>>>>
>>>>
>>>>
>>>> On Thu, Dec 15, 2016 at 11:03 PM, Lili Ma <li...@apache.org> wrote:
>>>>
>>>> Hello everyone,
>>>>
>>>>
>>>>
>>>> Glad to know everybody here:)
>>>>
>>>>
>>>>
>>>> I'm Lili Ma, from Pivotal HAWQ R&D team in Beijing. I have been
>>>> focusing on HAWQ development and product management since 2012 when I
>>>> joined Pivotal. I experienced and contributed in HAWQ's all growth path,
>>>> from birth, Alpha, 1.X, 2.0...
>>>>
>>>>
>>>>
>>>> My main covering fields about HAWQ include three parts: 1) Storage such
>>>> as internal table storage, HAWQ Input/OutputFormat, hawq
>>>> extract/register,etc 2) Dispatcher and interconnect 3) Security including
>>>> Ranger integration, Kerberos and LDAP.
>>>>
>>>>
>>>>
>>>> Before Pivotal, I worked at IBM for more than 2 years and focused on
>>>> providing data service inside of our public cloud provision. The data
>>>> service includes RDS(relational data service) which can provision a
>>>> distributed relational database based on DB2 Federation, and NOSQL service
>>>> which is based on HBase.
>>>>
>>>>
>>>>
>>>> I believe HAWQ can become more successful with our joint effort!
>>>> Welcome to reach me or this mail list for any HAWQ or other kinds of issues
>>>> :)
>>>>
>>>>
>>>>
>>>> Thanks
>>>>
>>>> Lili
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> 2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>:
>>>>
>>>> I will add to the email flow…
>>>>
>>>>
>>>>
>>>> I am Dan Baskette, I am the Director of Tech Marketing for Pivotal and
>>>> cover Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache
>>>> MADlib.   I started my career at Sun Microsystems, and have been working
>>>> for EMC/Greenplum and now Pivotal since 2000….a LONG time in quite a number
>>>> of roles.   I was part of the team that launched Greenplum’s first Hadoop
>>>> distribution and was around for the birth of HAWQ or as we called it when
>>>> it was in it’s infancy…. GOH or Greenplum on Hadoop.   I have been actively
>>>> running some webcasts on various HAWQ how-to topics for Hortonworks, so you
>>>> can check those out on their site.
>>>>
>>>>
>>>>
>>>> Hoping this community really takes off in a big way!
>>>>
>>>>
>>>>
>>>> Dan
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io)
>>>> wrote:
>>>>
>>>> Hi All,
>>>>
>>>>
>>>>
>>>> Great for Gregory to start the thread that people can know each other
>>>> much better, at least in Apache HAWQ community!
>>>>
>>>>
>>>>
>>>> I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I
>>>> am from Teradata and joined Pivotal after that. It's my honor to be part of
>>>> HAWQ project at its early stage. I am a fan of RDBMS (especially MPP
>>>> database), big data, and cloud technology that changes the IT
>>>> infrastructure of the enterprises and helps to do information
>>>> transformation in a very large extent.
>>>>
>>>>
>>>>
>>>> I hope that with joint effort from hawq community, it will become even
>>>> greater product in big data area, especially in SQL-on-Hadoop category.
>>>>
>>>>
>>>> Best regards,
>>>>
>>>> Ruilong Huo
>>>>
>>>>
>>>>
>>>> On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>
>>>> wrote:
>>>>
>>>> Hello all,
>>>>
>>>>
>>>>
>>>> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
>>>> new-ish here, and not so much from a coding background as from networking.
>>>> I'm from Cisco Systems, where I focused on analytics use cases in
>>>> telecommunications, particularly for mobile network operators, for service
>>>> assurance, customer care, and customer profiling.  (also, as you're
>>>> introducing yourselves, we'd love to hear what use cases you're involved
>>>> with, too).
>>>>
>>>>
>>>>
>>>> About a year before I left my group at Cisco acquired an MPP database
>>>> of its own -- ParStream -- for its IoT and fog computing use cases, so it's
>>>> interesting to come here and learn about the architecture and applications
>>>> of HAWQ.
>>>>
>>>>
>>>>
>>>> I hope to help make your experience with HAWQ a good one.  If I can
>>>> help in any way, please reach out to me directly or on the list.
>>>>
>>>>
>>>>
>>>> Cheers,
>>>>
>>>> Bob
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> Bob Glithero | Product Marketing
>>>>
>>>> Pivotal, Inc.
>>>>
>>>> rglithero@pivotal.io | m: 415.341.5592
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
>>>> wrote:
>>>>
>>>> Greg, thanks for kicking off the roll call. Getting to know each other
>>>> is super
>>>> useful (and can be fun! ;-)). I'll go next:
>>>>
>>>> I am Roman (your friendly neighborhood mentor). I hang around a lot of
>>>> ASF
>>>> big data projects (as a committer and a PMC member), but lately I've
>>>> been
>>>> gravitating towards IoT as well (Apache Mynewt). I started my career at
>>>> Sun
>>>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>>>> doing
>>>> enterprise software ever since. I was lucky enough to get to work on
>>>> the original
>>>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>>>> (Hadoop
>>>> and Postgres). Recently I've assumed a position of VP of Technology at
>>>> ODPi
>>>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>>>> co-founded)
>>>> and I'm not afraid to use it!
>>>>
>>>> I'm here to help as much as I can to make sure that this community
>>>> evolves into
>>>> a vibrant, self-governed, exciting place worthy of being a top level
>>>> project (TLP)
>>>> at ASF. If you have any questions or ideas that you may want to bounce
>>>> off of
>>>> me -- please don't hesitate to reach out directly or on the mailing
>>>> list.
>>>>
>>>> Thanks,
>>>> Roman.
>>>>
>>>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io>
>>>> wrote:
>>>> >
>>>> > Dear HAWQs,
>>>> >
>>>> > I thought it would be fun to get to know some of the other people in
>>>> the community.
>>>> >
>>>> > My name is Greg Chase and I run community development for Pivotal for
>>>> big data open source communities that Pivotal contributes to.
>>>> >
>>>> > Some of you may have seen my frequent emails about virtual events I
>>>> help organize for user and contributor education.
>>>> >
>>>> > Not so long ago, I was in charge of product marketing for an
>>>> in-memory data warehouse named after a Hawaiian town from a three-letter
>>>> acronymed German Company. We treated Hadoop as an external table, and
>>>> returning results from these queries was both slow and brittle due to the
>>>> network transfer rates.
>>>> >
>>>> > So I have a special appreciation of the innovation that has gone into
>>>> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>>>> >
>>>> > These days I'm much more of a marketer than a coder, but I still love
>>>> hearing about the kinds of projects that HAWQ users are involved in.
>>>> >
>>>> > I know we'd all love to hear more about everyone else's projects, and
>>>> how you became a HAWQ user.  So please introduce yourselves!
>>>> >
>>>>
>>>> > --
>>>> > Greg Chase
>>>> >
>>>> > Global Head, Big Data Communities
>>>> > http://www.pivotal.io/big-data
>>>> >
>>>> > Pivotal Software
>>>> > http://www.pivotal.io/
>>>> >
>>>> > 650-215-0477
>>>> > @GregChase
>>>> > Blog: http://geekmarketing.biz/
>>>> >
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>> --
>> Thanks
>>
>> Hubert Zhang
>>
>
>


-- 
Best Regards,
Xiang Sheng

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Ivan Weng <iw...@pivotal.io>.
Hi guys,

I'm Ivan from Pivotal China. I have worked on the interconnect and the DFS
metadata cache related components in HAWQ. Before Pivotal, I worked at
Baidu for three years. Looking forward to having more discussion about SQL
on Hadoop in the community and making HAWQ better.


Regards,
Ivan

On Wed, Dec 21, 2016 at 5:11 PM, Hubert Zhang <hz...@pivotal.io> wrote:

> Hi, All,
>
> My name is Hubert Zhang. I worked on HAWQ resource negotiator and data
> locality feature and now working on HAWQ integrated with Apache Ranger.
>
> For any details related to the upper features and questions about HAWQ,
> please feel free to contact me.
>
>
> On Wed, Dec 21, 2016 at 1:31 PM, Michael Schubert <ms...@pivotal.io>
> wrote:
>
>> Hi David,
>>
>> instructions to unsubscribe are at the bottom of every email. It's self
>> service; we can't unsubscribe for you.
>>
>> Cheers.
>>
>>
>>
>> On Tue, Dec 20, 2016 at 5:09 AM -1000, "David Wierz" <
>> djwierz@theocillc.com> wrote:
>>
>> Good Morning –
>>>
>>>
>>>
>>> Please unsubscribe me from this listserv
>>>
>>>
>>>
>>> Thank you
>>>
>>>
>>>
>>> *David*
>>>
>>>
>>>
>>> David J Wierz
>>>
>>> Senior Principal | Market Strategy & Finance
>>>
>>> [image: OCI_black_rgb]
>>>
>>> djwierz@theocillc.com | +1.484.432.8075 <(484)%20432-8075>
>>>
>>> 387 Hidden Farm Drive | Exton, PA  19341-1185
>>>
>>>
>>>
>>> The information contained in this e-mail is confidential and may be
>>> privileged. It may be read, copied and used only by the intended recipient.
>>> If you have received it in error, please contact the sender immediately and
>>> delete the material from any computer
>>>
>>>
>>>
>>> *From:* Hong [mailto:xunzhang@apache.org]
>>> *Sent:* Tuesday, 20 December, 2016 10:06
>>> *To:* user@hawq.incubator.apache.org
>>> *Subject:* Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache
>>> HAWQ community
>>>
>>>
>>>
>>> Dear all,
>>>
>>>
>>>
>>> My name is HongWu(xunzhang is my ID on the Internet). I began to
>>> contribute to HAWQ this year and now become the HAWQ committer. I wrote
>>> near 100 commits
>>> <https://github.com/apache/incubator-hawq/commits?author=xunzhang> to
>>> make HAWQ better! I am a believer of open source since I benefit a lot from
>>> open source projects and I know the true meaning of it. I think open source
>>> is more than just publishing the code. It is delivering comprehensive
>>> documents/tutorials, collecting users/developers/feedbacks, establish
>>> discussion/comparison, solving confusions/issues, reproducing good or bad
>>> results, making clear roadmap, keeping long-term development, treating the
>>> project as our own child. Moreover, Apache Software Foundation is an elite
>>> community in the open source world, there are lots of high-quality
>>> projects, developers and users.
>>>
>>>
>>>
>>> I started my career 2012 at Douban which is a social network website
>>> connecting people with interest such as book, movie, music, photo, blog and
>>> so on. I was working as an algorithm engineer there, focused on developing
>>> large-scale machine learning platforms and optimizing recommendation
>>> engine. I am the author of Paracel <http://paracel.io/> open source
>>> project, which is a distributed computational framework designed for
>>> machine learning, graph algorithms and scientific computing. Paracel wants
>>> to simplify the distributed and communication details for model developers
>>> and data scientists, lets them be concentrated on model developing. During
>>> this period, I also created the Plato: a real-time recommendation system.
>>> This work got very outstanding results on douban.fm product. Plato
>>> changed the architecture of Douban's offline processing way to do machine
>>> learning.
>>>
>>>
>>>
>>> It's a very exciting jouney with HAWQ! In my point of view, SQL engine
>>> is a super critical infrastructure in the big data ecosystem. I have tried
>>> to design a parallel programming language named Girl(for some reason it's
>>> still at very early stage), but no matter how simple it will be, it
>>> supposes developers following some paradigms or programming
>>> idioms/patterns. But SQL on the other side, is the only existing language
>>> that erases the distributed coding logic: we just write SQLs.
>>>
>>>
>>>
>>> I definitely believe HAWQ could be one of the best SQL engines on Hadoop
>>> ecosystem with all of our collaborative effort including HAWQ users, HAWQ
>>> developers, HDB custormers, Pivotal FTEs, HAWQ competitors, Apache mentors
>>> and so on. Look forward to hear more people with diversed background
>>> joining HAWQ family.
>>>
>>>
>>>
>>> Beers
>>>
>>>
>>>
>>> 2016-12-20 18:03 GMT+08:00 Wen Lin <wl...@pivotal.io>:
>>>
>>> Hi, All,
>>>
>>>
>>>
>>> My name is LinWen. I joined Pivotal HAWQ Beijing team since November,
>>> 2014. Before that, I am an engineer of VMware.
>>>
>>> I have implemented some features for HAWQ, like new fault tolerance
>>> service, libyarn, etc.
>>>
>>> I believe HAWQ can be the best SQL on Hadoop engine with our joint
>>> effort.
>>>
>>>
>>>
>>> On Thu, Dec 15, 2016 at 11:03 PM, Lili Ma <li...@apache.org> wrote:
>>>
>>> Hello everyone,
>>>
>>>
>>>
>>> Glad to know everybody here:)
>>>
>>>
>>>
>>> I'm Lili Ma, from Pivotal HAWQ R&D team in Beijing. I have been focusing
>>> on HAWQ development and product management since 2012 when I joined
>>> Pivotal. I experienced and contributed in HAWQ's all growth path, from
>>> birth, Alpha, 1.X, 2.0...
>>>
>>>
>>>
>>> My main covering fields about HAWQ include three parts: 1) Storage such
>>> as internal table storage, HAWQ Input/OutputFormat, hawq
>>> extract/register,etc 2) Dispatcher and interconnect 3) Security including
>>> Ranger integration, Kerberos and LDAP.
>>>
>>>
>>>
>>> Before Pivotal, I worked at IBM for more than 2 years and focused on
>>> providing data service inside of our public cloud provision. The data
>>> service includes RDS(relational data service) which can provision a
>>> distributed relational database based on DB2 Federation, and NOSQL service
>>> which is based on HBase.
>>>
>>>
>>>
>>> I believe HAWQ can become more successful with our joint effort!
>>> Welcome to reach me or this mail list for any HAWQ or other kinds of issues
>>> :)
>>>
>>>
>>>
>>> Thanks
>>>
>>> Lili
>>>
>>>
>>>
>>>
>>>
>>> 2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>:
>>>
>>> I will add to the email flow…
>>>
>>>
>>>
>>> I am Dan Baskette, I am the Director of Tech Marketing for Pivotal and
>>> cover Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache
>>> MADlib.   I started my career at Sun Microsystems, and have been working
>>> for EMC/Greenplum and now Pivotal since 2000….a LONG time in quite a number
>>> of roles.   I was part of the team that launched Greenplum’s first Hadoop
>>> distribution and was around for the birth of HAWQ or as we called it when
>>> it was in it’s infancy…. GOH or Greenplum on Hadoop.   I have been actively
>>> running some webcasts on various HAWQ how-to topics for Hortonworks, so you
>>> can check those out on their site.
>>>
>>>
>>>
>>> Hoping this community really takes off in a big way!
>>>
>>>
>>>
>>> Dan
>>>
>>>
>>>
>>>
>>>
>>> On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io)
>>> wrote:
>>>
>>> Hi All,
>>>
>>>
>>>
>>> Great for Gregory to start the thread that people can know each other
>>> much better, at least in Apache HAWQ community!
>>>
>>>
>>>
>>> I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I
>>> am from Teradata and joined Pivotal after that. It's my honor to be part of
>>> HAWQ project at its early stage. I am a fan of RDBMS (especially MPP
>>> database), big data, and cloud technology that changes the IT
>>> infrastructure of the enterprises and helps to do information
>>> transformation in a very large extent.
>>>
>>>
>>>
>>> I hope that with joint effort from hawq community, it will become even
>>> greater product in big data area, especially in SQL-on-Hadoop category.
>>>
>>>
>>> Best regards,
>>>
>>> Ruilong Huo
>>>
>>>
>>>
>>> On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>
>>> wrote:
>>>
>>> Hello all,
>>>
>>>
>>>
>>> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
>>> new-ish here, and not so much from a coding background as from networking.
>>> I'm from Cisco Systems, where I focused on analytics use cases in
>>> telecommunications, particularly for mobile network operators, for service
>>> assurance, customer care, and customer profiling.  (also, as you're
>>> introducing yourselves, we'd love to hear what use cases you're involved
>>> with, too).
>>>
>>>
>>>
>>> About a year before I left my group at Cisco acquired an MPP database of
>>> its own -- ParStream -- for its IoT and fog computing use cases, so it's
>>> interesting to come here and learn about the architecture and applications
>>> of HAWQ.
>>>
>>>
>>>
>>> I hope to help make your experience with HAWQ a good one.  If I can help
>>> in any way, please reach out to me directly or on the list.
>>>
>>>
>>>
>>> Cheers,
>>>
>>> Bob
>>>
>>>
>>>
>>>
>>>
>>>
>>> Bob Glithero | Product Marketing
>>>
>>> Pivotal, Inc.
>>>
>>> rglithero@pivotal.io | m: 415.341.5592
>>>
>>>
>>>
>>>
>>>
>>> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
>>> wrote:
>>>
>>> Greg, thanks for kicking off the roll call. Getting to know each other
>>> is super
>>> useful (and can be fun! ;-)). I'll go next:
>>>
>>> I am Roman (your friendly neighborhood mentor). I hang around a lot of
>>> ASF
>>> big data projects (as a committer and a PMC member), but lately I've been
>>> gravitating towards IoT as well (Apache Mynewt). I started my career at
>>> Sun
>>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>>> doing
>>> enterprise software ever since. I was lucky enough to get to work on
>>> the original
>>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>>> (Hadoop
>>> and Postgres). Recently I've assumed a position of VP of Technology at
>>> ODPi
>>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>>> co-founded)
>>> and I'm not afraid to use it!
>>>
>>> I'm here to help as much as I can to make sure that this community
>>> evolves into
>>> a vibrant, self-governed, exciting place worthy of being a top level
>>> project (TLP)
>>> at ASF. If you have any questions or ideas that you may want to bounce
>>> off of
>>> me -- please don't hesitate to reach out directly or on the mailing list.
>>>
>>> Thanks,
>>> Roman.
>>>
>>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io>
>>> wrote:
>>> >
>>> > Dear HAWQs,
>>> >
>>> > I thought it would be fun to get to know some of the other people in
>>> the community.
>>> >
>>> > My name is Greg Chase and I run community development for Pivotal for
>>> big data open source communities that Pivotal contributes to.
>>> >
>>> > Some of you may have seen my frequent emails about virtual events I
>>> help organize for user and contributor education.
>>> >
>>> > Not so long ago, I was in charge of product marketing for an in-memory
>>> data warehouse named after a Hawaiian town from a three-letter acronymed
>>> German Company. We treated Hadoop as an external table, and returning
>>> results from these queries was both slow and brittle due to the network
>>> transfer rates.
>>> >
>>> > So I have a special appreciation of the innovation that has gone into
>>> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>>> >
>>> > These days I'm much more of a marketer than a coder, but I still love
>>> hearing about the kinds of projects that HAWQ users are involved in.
>>> >
>>> > I know we'd all love to hear more about everyone else's projects, and
>>> how you became a HAWQ user.  So please introduce yourselves!
>>> >
>>>
>>> > --
>>> > Greg Chase
>>> >
>>> > Global Head, Big Data Communities
>>> > http://www.pivotal.io/big-data
>>> >
>>> > Pivotal Software
>>> > http://www.pivotal.io/
>>> >
>>> > 650-215-0477
>>> > @GregChase
>>> > Blog: http://geekmarketing.biz/
>>> >
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>
>
> --
> Thanks
>
> Hubert Zhang
>

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Hubert Zhang <hz...@pivotal.io>.
Hi, All,

My name is Hubert Zhang. I worked on the HAWQ resource negotiator and the
data locality feature, and I'm now working on HAWQ's integration with
Apache Ranger.

For any details related to the above features, or any other questions
about HAWQ, please feel free to contact me.


On Wed, Dec 21, 2016 at 1:31 PM, Michael Schubert <ms...@pivotal.io>
wrote:

> Hi David,
>
> instructions to unsubscribe are at the bottom of every email. It's self
> service; we can't unsubscribe for you.
>
> Cheers.
>
>
>
> On Tue, Dec 20, 2016 at 5:09 AM -1000, "David Wierz" <
> djwierz@theocillc.com> wrote:
>
> Good Morning –
>>
>>
>>
>> Please unsubscribe me from this listserv
>>
>>
>>
>> Thank you
>>
>>
>>
>> *David*
>>
>>
>>
>> David J Wierz
>>
>> Senior Principal | Market Strategy & Finance
>>
>> [image: OCI_black_rgb]
>>
>> djwierz@theocillc.com | +1.484.432.8075 <(484)%20432-8075>
>>
>> 387 Hidden Farm Drive | Exton, PA  19341-1185
>>
>>
>>
>> The information contained in this e-mail is confidential and may be
>> privileged. It may be read, copied and used only by the intended recipient.
>> If you have received it in error, please contact the sender immediately and
>> delete the material from any computer
>>
>>
>>
>> *From:* Hong [mailto:xunzhang@apache.org]
>> *Sent:* Tuesday, 20 December, 2016 10:06
>> *To:* user@hawq.incubator.apache.org
>> *Subject:* Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache
>> HAWQ community
>>
>>
>>
>> Dear all,
>>
>>
>>
>> My name is HongWu(xunzhang is my ID on the Internet). I began to
>> contribute to HAWQ this year and now become the HAWQ committer. I wrote
>> near 100 commits
>> <https://github.com/apache/incubator-hawq/commits?author=xunzhang> to
>> make HAWQ better! I am a believer of open source since I benefit a lot from
>> open source projects and I know the true meaning of it. I think open source
>> is more than just publishing the code. It is delivering comprehensive
>> documents/tutorials, collecting users/developers/feedbacks, establish
>> discussion/comparison, solving confusions/issues, reproducing good or bad
>> results, making clear roadmap, keeping long-term development, treating the
>> project as our own child. Moreover, Apache Software Foundation is an elite
>> community in the open source world, there are lots of high-quality
>> projects, developers and users.
>>
>>
>>
>> I started my career 2012 at Douban which is a social network website
>> connecting people with interest such as book, movie, music, photo, blog and
>> so on. I was working as an algorithm engineer there, focused on developing
>> large-scale machine learning platforms and optimizing recommendation
>> engine. I am the author of Paracel <http://paracel.io/> open source
>> project, which is a distributed computational framework designed for
>> machine learning, graph algorithms and scientific computing. Paracel wants
>> to simplify the distributed and communication details for model developers
>> and data scientists, lets them be concentrated on model developing. During
>> this period, I also created the Plato: a real-time recommendation system.
>> This work got very outstanding results on douban.fm product. Plato
>> changed the architecture of Douban's offline processing way to do machine
>> learning.
>>
>>
>>
>> It's a very exciting jouney with HAWQ! In my point of view, SQL engine is
>> a super critical infrastructure in the big data ecosystem. I have tried to
>> design a parallel programming language named Girl(for some reason it's
>> still at very early stage), but no matter how simple it will be, it
>> supposes developers following some paradigms or programming
>> idioms/patterns. But SQL on the other side, is the only existing language
>> that erases the distributed coding logic: we just write SQLs.
>>
>>
>>
>> I definitely believe HAWQ could be one of the best SQL engines on Hadoop
>> ecosystem with all of our collaborative effort including HAWQ users, HAWQ
>> developers, HDB custormers, Pivotal FTEs, HAWQ competitors, Apache mentors
>> and so on. Look forward to hear more people with diversed background
>> joining HAWQ family.
>>
>>
>>
>> Beers
>>
>>
>>
>> 2016-12-20 18:03 GMT+08:00 Wen Lin <wl...@pivotal.io>:
>>
>> Hi, All,
>>
>>
>>
>> My name is LinWen. I joined Pivotal HAWQ Beijing team since November,
>> 2014. Before that, I am an engineer of VMware.
>>
>> I have implemented some features for HAWQ, like new fault tolerance
>> service, libyarn, etc.
>>
>> I believe HAWQ can be the best SQL on Hadoop engine with our joint
>> effort.
>>
>>
>>
>> On Thu, Dec 15, 2016 at 11:03 PM, Lili Ma <li...@apache.org> wrote:
>>
>> Hello everyone,
>>
>>
>>
>> Glad to know everybody here:)
>>
>>
>>
>> I'm Lili Ma, from Pivotal HAWQ R&D team in Beijing. I have been focusing
>> on HAWQ development and product management since 2012 when I joined
>> Pivotal. I experienced and contributed in HAWQ's all growth path, from
>> birth, Alpha, 1.X, 2.0...
>>
>>
>>
>> My main covering fields about HAWQ include three parts: 1) Storage such
>> as internal table storage, HAWQ Input/OutputFormat, hawq
>> extract/register,etc 2) Dispatcher and interconnect 3) Security including
>> Ranger integration, Kerberos and LDAP.
>>
>>
>>
>> Before Pivotal, I worked at IBM for more than 2 years and focused on
>> providing data service inside of our public cloud provision. The data
>> service includes RDS(relational data service) which can provision a
>> distributed relational database based on DB2 Federation, and NOSQL service
>> which is based on HBase.
>>
>>
>>
>> I believe HAWQ can become more successful with our joint effort!  Welcome
>> to reach me or this mail list for any HAWQ or other kinds of issues :)
>>
>>
>>
>> Thanks
>>
>> Lili
>>
>>
>>
>>
>>
>> 2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>:
>>
>> I will add to the email flow…
>>
>>
>>
>> I am Dan Baskette, I am the Director of Tech Marketing for Pivotal and
>> cover Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache
>> MADlib.   I started my career at Sun Microsystems, and have been working
>> for EMC/Greenplum and now Pivotal since 2000….a LONG time in quite a number
>> of roles.   I was part of the team that launched Greenplum’s first Hadoop
>> distribution and was around for the birth of HAWQ or as we called it when
>> it was in it’s infancy…. GOH or Greenplum on Hadoop.   I have been actively
>> running some webcasts on various HAWQ how-to topics for Hortonworks, so you
>> can check those out on their site.
>>
>>
>>
>> Hoping this community really takes off in a big way!
>>
>>
>>
>> Dan
>>
>>
>>
>>
>>
>> On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io) wrote:
>>
>> Hi All,
>>
>>
>>
>> Great for Gregory to start the thread that people can know each other
>> much better, at least in Apache HAWQ community!
>>
>>
>>
>> I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I am
>> from Teradata and joined Pivotal after that. It's my honor to be part of
>> HAWQ project at its early stage. I am a fan of RDBMS (especially MPP
>> database), big data, and cloud technology that changes the IT
>> infrastructure of the enterprises and helps to do information
>> transformation in a very large extent.
>>
>>
>>
>> I hope that with joint effort from hawq community, it will become even
>> greater product in big data area, especially in SQL-on-Hadoop category.
>>
>>
>> Best regards,
>>
>> Ruilong Huo
>>
>>
>>
>> On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>
>> wrote:
>>
>> Hello all,
>>
>>
>>
>> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
>> new-ish here, and not so much from a coding background as from networking.
>> I'm from Cisco Systems, where I focused on analytics use cases in
>> telecommunications, particularly for mobile network operators, for service
>> assurance, customer care, and customer profiling.  (also, as you're
>> introducing yourselves, we'd love to hear what use cases you're involved
>> with, too).
>>
>>
>>
>> About a year before I left my group at Cisco acquired an MPP database of
>> its own -- ParStream -- for its IoT and fog computing use cases, so it's
>> interesting to come here and learn about the architecture and applications
>> of HAWQ.
>>
>>
>>
>> I hope to help make your experience with HAWQ a good one.  If I can help
>> in any way, please reach out to me directly or on the list.
>>
>>
>>
>> Cheers,
>>
>> Bob
>>
>>
>>
>>
>>
>>
>> Bob Glithero | Product Marketing
>>
>> Pivotal, Inc.
>>
>> rglithero@pivotal.io | m: 415.341.5592
>>
>>
>>
>>
>>
>> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
>> wrote:
>>
>> Greg, thanks for kicking off the roll call. Getting to know each other is
>> super
>> useful (and can be fun! ;-)). I'll go next:
>>
>> I am Roman (your friendly neighborhood mentor). I hang around a lot of ASF
>> big data projects (as a committer and a PMC member), but lately I've been
>> gravitating towards IoT as well (Apache Mynewt). I started my career at
>> Sun
>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>> doing
>> enterprise software ever since. I was lucky enough to get to work on
>> the original
>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>> (Hadoop
>> and Postgres). Recently I've assumed a position of VP of Technology at
>> ODPi
>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>> co-founded)
>> and I'm not afraid to use it!
>>
>> I'm here to help as much as I can to make sure that this community
>> evolves into
>> a vibrant, self-governed, exciting place worthy of being a top level
>> project (TLP)
>> at ASF. If you have any questions or ideas that you may want to bounce
>> off of
>> me -- please don't hesitate to reach out directly or on the mailing list.
>>
>> Thanks,
>> Roman.
>>
>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io> wrote:
>> >
>> > Dear HAWQs,
>> >
>> > I thought it would be fun to get to know some of the other people in
>> the community.
>> >
>> > My name is Greg Chase and I run community development for Pivotal for
>> big data open source communities that Pivotal contributes to.
>> >
>> > Some of you may have seen my frequent emails about virtual events I
>> help organize for user and contributor education.
>> >
>> > Not so long ago, I was in charge of product marketing for an in-memory
>> data warehouse named after a Hawaiian town from a three-letter acronymed
>> German Company. We treated Hadoop as an external table, and returning
>> results from these queries was both slow and brittle due to the network
>> transfer rates.
>> >
>> > So I have a special appreciation of the innovation that has gone into
>> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>> >
>> > These days I'm much more of a marketer than a coder, but I still love
>> hearing about the kinds of projects that HAWQ users are involved in.
>> >
>> > I know we'd all love to hear more about everyone else's projects, and
>> how you became a HAWQ user.  So please introduce yourselves!
>> >
>>
>> > --
>> > Greg Chase
>> >
>> > Global Head, Big Data Communities
>> > http://www.pivotal.io/big-data
>> >
>> > Pivotal Software
>> > http://www.pivotal.io/
>> >
>> > 650-215-0477
>> > @GregChase
>> > Blog: http://geekmarketing.biz/
>> >
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>


-- 
Thanks

Hubert Zhang

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Michael Schubert <ms...@pivotal.io>.
Hi David,

Instructions to unsubscribe are at the bottom of every email. It's
self-service; we can't unsubscribe for you.

Cheers.

On Tue, Dec 20, 2016 at 5:09 AM -1000, "David Wierz" <dj...@theocillc.com> wrote:

Good Morning –

Please unsubscribe me from this listserv

Thank you

David

David J Wierz
Senior Principal | Market Strategy & Finance
djwierz@theocillc.com | +1.484.432.8075
387 Hidden Farm Drive | Exton, PA  19341-1185

The information contained in this e-mail is confidential and may be
privileged. It may be read, copied and used only by the intended recipient.
If you have received it in error, please contact the sender immediately and
delete the material from any computer

RE: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by David Wierz <dj...@theocillc.com>.
Good Morning –

Please unsubscribe me from this listserv

Thank you

David

David J Wierz
Senior Principal | Market Strategy & Finance
djwierz@theocillc.com<ma...@theocillc.com> | +1.484.432.8075
387 Hidden Farm Drive | Exton, PA  19341-1185

The information contained in this e-mail is confidential and may be privileged. It may be read, copied and used only by the intended recipient. If you have received it in error, please contact the sender immediately and delete the material from any computer

From: Hong [mailto:xunzhang@apache.org]
Sent: Tuesday, 20 December, 2016 10:06
To: user@hawq.incubator.apache.org
Subject: Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Dear all,

My name is HongWu(xunzhang is my ID on the Internet). I began to contribute to HAWQ this year and now become the HAWQ committer. I wrote near 100 commits<https://github.com/apache/incubator-hawq/commits?author=xunzhang> to make HAWQ better! I am a believer of open source since I benefit a lot from open source projects and I know the true meaning of it. I think open source is more than just publishing the code. It is delivering comprehensive documents/tutorials, collecting users/developers/feedbacks, establish discussion/comparison, solving confusions/issues, reproducing good or bad results, making clear roadmap, keeping long-term development, treating the project as our own child. Moreover, Apache Software Foundation is an elite community in the open source world, there are lots of high-quality projects, developers and users.

I started my career 2012 at Douban which is a social network website connecting people with interest such as book, movie, music, photo, blog and so on. I was working as an algorithm engineer there, focused on developing large-scale machine learning platforms and optimizing recommendation engine. I am the author of Paracel<http://paracel.io/> open source project, which is a distributed computational framework designed for machine learning, graph algorithms and scientific computing. Paracel wants to simplify the distributed and communication details for model developers and data scientists, lets them be concentrated on model developing. During this period, I also created the Plato: a real-time recommendation system. This work got very outstanding results on douban.fm<http://douban.fm> product. Plato changed the architecture of Douban's offline processing way to do machine learning.

It's a very exciting jouney with HAWQ! In my point of view, SQL engine is a super critical infrastructure in the big data ecosystem. I have tried to design a parallel programming language named Girl(for some reason it's still at very early stage), but no matter how simple it will be, it supposes developers following some paradigms or programming idioms/patterns. But SQL on the other side, is the only existing language that erases the distributed coding logic: we just write SQLs.

I definitely believe HAWQ could be one of the best SQL engines on Hadoop ecosystem with all of our collaborative effort including HAWQ users, HAWQ developers, HDB custormers, Pivotal FTEs, HAWQ competitors, Apache mentors and so on. Look forward to hear more people with diversed background joining HAWQ family.

Beers

2016-12-20 18:03 GMT+08:00 Wen Lin <wl...@pivotal.io>>:
Hi, All,

My name is LinWen. I joined Pivotal HAWQ Beijing team since November, 2014. Before that, I am an engineer of VMware.
I have implemented some features for HAWQ, like new fault tolerance service, libyarn, etc.
I believe HAWQ can be the best SQL on Hadoop engine with our joint effort.

On Thu, Dec 15, 2016 at 11:03 PM, Lili Ma <li...@apache.org>> wrote:
Hello everyone,

Glad to know everybody here:)

I'm Lili Ma, from Pivotal HAWQ R&D team in Beijing. I have been focusing on HAWQ development and product management since 2012 when I joined Pivotal. I experienced and contributed in HAWQ's all growth path, from birth, Alpha, 1.X, 2.0...

My main covering fields about HAWQ include three parts: 1) Storage such as internal table storage, HAWQ Input/OutputFormat, hawq extract/register,etc 2) Dispatcher and interconnect 3) Security including Ranger integration, Kerberos and LDAP.

Before Pivotal, I worked at IBM for more than 2 years and focused on providing data service inside of our public cloud provision. The data service includes RDS(relational data service) which can provision a distributed relational database based on DB2 Federation, and NOSQL service which is based on HBase.

I believe HAWQ can become more successful with our joint effort!  Welcome to reach me or this mail list for any HAWQ or other kinds of issues :)

Thanks
Lili


2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>>:
I will add to the email flow…

I am Dan Baskette, I am the Director of Tech Marketing for Pivotal and cover Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache MADlib.   I started my career at Sun Microsystems, and have been working for EMC/Greenplum and now Pivotal since 2000….a LONG time in quite a number of roles.   I was part of the team that launched Greenplum’s first Hadoop distribution and was around for the birth of HAWQ or as we called it when it was in it’s infancy…. GOH or Greenplum on Hadoop.   I have been actively running some webcasts on various HAWQ how-to topics for Hortonworks, so you can check those out on their site.

Hoping this community really takes off in a big way!

Dan



On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io<ma...@pivotal.io>) wrote:
Hi All,

Great for Gregory to start the thread that people can know each other much better, at least in Apache HAWQ community!

I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I am from Teradata and joined Pivotal after that. It's my honor to be part of HAWQ project at its early stage. I am a fan of RDBMS (especially MPP database), big data, and cloud technology that changes the IT infrastructure of the enterprises and helps to do information transformation in a very large extent.

I hope that with joint effort from hawq community, it will become even greater product in big data area, especially in SQL-on-Hadoop category.

Best regards,
Ruilong Huo

On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>> wrote:
Hello all,

I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm new-ish here, and not so much from a coding background as from networking.  I'm from Cisco Systems, where I focused on analytics use cases in telecommunications, particularly for mobile network operators, for service assurance, customer care, and customer profiling.  (also, as you're introducing yourselves, we'd love to hear what use cases you're involved with, too).

About a year before I left my group at Cisco acquired an MPP database of its own -- ParStream -- for its IoT and fog computing use cases, so it's interesting to come here and learn about the architecture and applications of HAWQ.

I hope to help make your experience with HAWQ a good one.  If I can help in any way, please reach out to me directly or on the list.

Cheers,
Bob



Bob Glithero | Product Marketing
Pivotal, Inc.
rglithero@pivotal.io<ma...@pivotal.io> | m: 415.341.5592<tel:415.341.5592>


On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>> wrote:
Greg, thanks for kicking off the roll call. Getting to know each other is super
useful (and can be fun! ;-)). I'll go next:

I am Roman (your friendly neighborhood mentor). I hang around a lot of ASF
big data projects (as a committer and a PMC member), but lately I've been
gravitating towards IoT as well (Apache Mynewt). I started my career at Sun
microsystems back at a time when Linux  wasn't even 1.x and I've been doing
enterprise software ever since. I was lucky enough to get to work on
the original
Hadoop team at Yahoo! and fall in love with not one but two elephants (Hadoop
and Postgres). Recently I've assumed a position of VP of Technology at ODPi
and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
co-founded)
and I'm not afraid to use it!

I'm here to help as much as I can to make sure that this community evolves into
a vibrant, self-governed, exciting place worthy of being a top level
project (TLP)
at ASF. If you have any questions or ideas that you may want to bounce off of
me -- please don't hesitate to reach out directly or on the mailing list.

Thanks,
Roman.

On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io>> wrote:
>
> Dear HAWQs,
>
> I thought it would be fun to get to know some of the other people in the community.
>
> My name is Greg Chase and I run community development for Pivotal for big data open source communities that Pivotal contributes to.
>
> Some of you may have seen my frequent emails about virtual events I help organize for user and contributor education.
>
> Not so long ago, I was in charge of product marketing for an in-memory data warehouse named after a Hawaiian town from a three-letter acronymed German Company. We treated Hadoop as an external table, and returning results from these queries was both slow and brittle due to the network transfer rates.
>
> So I have a special appreciation of the innovation that has gone into creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>
> These days I'm much more of a marketer than a coder, but I still love hearing about the kinds of projects that HAWQ users are involved in.
>
> I know we'd all love to hear more about everyone else's projects, and how you became a HAWQ user.  So please introduce yourselves!
>
> --
> Greg Chase
>
> Global Head, Big Data Communities
> http://www.pivotal.io/big-data
>
> Pivotal Software
> http://www.pivotal.io/
>
> 650-215-0477<tel:650-215-0477>
> @GregChase
> Blog: http://geekmarketing.biz/
>






Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Hong <xu...@apache.org>.
Dear all,

My name is HongWu (xunzhang is my ID on the Internet). I began contributing
to HAWQ this year and have now become a HAWQ committer. I have written nearly
100 commits <https://github.com/apache/incubator-hawq/commits?author=xunzhang>
to make HAWQ better! I am a believer in open source, since I have benefited a
lot from open source projects and I know its true meaning. I think open source
is more than just publishing the code. It is delivering comprehensive
documents/tutorials, collecting users/developers/feedback, establishing
discussion/comparison, resolving confusion/issues, reproducing good or bad
results, keeping a clear roadmap, committing to long-term development, and
treating the project as our own child. Moreover, the Apache Software Foundation
is an elite community in the open source world, with lots of high-quality
projects, developers and users.

I started my career in 2012 at Douban, a social network website connecting
people through interests such as books, movies, music, photos, blogs, and so
on. I worked there as an algorithm engineer, focused on developing large-scale
machine learning platforms and optimizing the recommendation engine. I am the
author of the Paracel <http://paracel.io/> open source project, a distributed
computational framework designed for machine learning, graph algorithms and
scientific computing. Paracel aims to hide the distribution and communication
details from model developers and data scientists, letting them concentrate on
model development. During this period, I also created Plato, a real-time
recommendation system. This work achieved outstanding results on the douban.fm
product, and Plato moved Douban's machine learning architecture away from
purely offline processing.

It's a very exciting journey with HAWQ! From my point of view, a SQL engine is
super critical infrastructure in the big data ecosystem. I have tried to design
a parallel programming language named Girl (for some reason it is still at a
very early stage), but no matter how simple such a language becomes, it still
expects developers to follow certain paradigms or programming idioms/patterns.
SQL, on the other hand, is the only existing language that erases the
distributed coding logic: we just write SQL.
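
A minimal sketch of what this means in practice (the table, columns, and data
below are purely hypothetical, and the distribution clause is the usual
HAWQ/Greenplum-style syntax):

    -- hypothetical fact table; the DISTRIBUTED clause only declares how rows
    -- are spread across segments, it is not parallel code
    CREATE TABLE page_views (
        user_id   int,
        url       text,
        view_time timestamp
    ) DISTRIBUTED RANDOMLY;

    -- the same declarative statement runs unchanged whether the table lives on
    -- one node or across many segments on HDFS; partitioning, data motion and
    -- parallel aggregation are handled entirely by the engine
    SELECT url, count(*) AS views
    FROM page_views
    GROUP BY url
    ORDER BY views DESC
    LIMIT 10;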

I definitely believe HAWQ can be one of the best SQL engines in the Hadoop
ecosystem with all of our collaborative effort, including HAWQ users, HAWQ
developers, HDB customers, Pivotal FTEs, HAWQ competitors, Apache mentors and
so on. I look forward to more people with diverse backgrounds joining the
HAWQ family.

Beers

2016-12-20 18:03 GMT+08:00 Wen Lin <wl...@pivotal.io>:

> Hi, All,
>
> My name is LinWen. I joined Pivotal HAWQ Beijing team since November,
> 2014. Before that, I am an engineer of VMware.
> I have implemented some features for HAWQ, like new fault tolerance
> service, libyarn, etc.
> I believe HAWQ can be the best SQL on Hadoop engine with our joint
> effort.
>
> On Thu, Dec 15, 2016 at 11:03 PM, Lili Ma <li...@apache.org> wrote:
>
>> Hello everyone,
>>
>> Glad to know everybody here:)
>>
>> I'm Lili Ma, from Pivotal HAWQ R&D team in Beijing. I have been focusing
>> on HAWQ development and product management since 2012 when I joined
>> Pivotal. I experienced and contributed in HAWQ's all growth path, from
>> birth, Alpha, 1.X, 2.0...
>>
>> My main covering fields about HAWQ include three parts: 1) Storage such
>> as internal table storage, HAWQ Input/OutputFormat, hawq
>> extract/register,etc 2) Dispatcher and interconnect 3) Security including
>> Ranger integration, Kerberos and LDAP.
>>
>> Before Pivotal, I worked at IBM for more than 2 years and focused on
>> providing data service inside of our public cloud provision. The data
>> service includes RDS(relational data service) which can provision a
>> distributed relational database based on DB2 Federation, and NOSQL service
>> which is based on HBase.
>>
>> I believe HAWQ can become more successful with our joint effort!  Welcome
>> to reach me or this mail list for any HAWQ or other kinds of issues :)
>>
>> Thanks
>> Lili
>>
>>
>> 2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>:
>>
>>> I will add to the email flow…
>>>
>>> I am Dan Baskette, I am the Director of Tech Marketing for Pivotal and
>>> cover Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache
>>> MADlib.   I started my career at Sun Microsystems, and have been working
>>> for EMC/Greenplum and now Pivotal since 2000….a LONG time in quite a number
>>> of roles.   I was part of the team that launched Greenplum’s first Hadoop
>>> distribution and was around for the birth of HAWQ or as we called it when
>>> it was in it’s infancy…. GOH or Greenplum on Hadoop.   I have been actively
>>> running some webcasts on various HAWQ how-to topics for Hortonworks, so you
>>> can check those out on their site.
>>>
>>> Hoping this community really takes off in a big way!
>>>
>>> Dan
>>>
>>>
>>> On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io)
>>> wrote:
>>>
>>> Hi All,
>>>
>>> Great for Gregory to start the thread that people can know each other
>>> much better, at least in Apache HAWQ community!
>>>
>>> I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I
>>> am from Teradata and joined Pivotal after that. It's my honor to be part of
>>> HAWQ project at its early stage. I am a fan of RDBMS (especially MPP
>>> database), big data, and cloud technology that changes the IT
>>> infrastructure of the enterprises and helps to do information
>>> transformation in a very large extent.
>>>
>>> I hope that with joint effort from hawq community, it will become even
>>> greater product in big data area, especially in SQL-on-Hadoop category.
>>>
>>> Best regards,
>>> Ruilong Huo
>>>
>>> On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>
>>> wrote:
>>>
>>>> Hello all,
>>>>
>>>> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
>>>> new-ish here, and not so much from a coding background as from networking.
>>>> I'm from Cisco Systems, where I focused on analytics use cases in
>>>> telecommunications, particularly for mobile network operators, for service
>>>> assurance, customer care, and customer profiling.  (also, as you're
>>>> introducing yourselves, we'd love to hear what use cases you're involved
>>>> with, too).
>>>>
>>>> About a year before I left my group at Cisco acquired an MPP database
>>>> of its own -- ParStream -- for its IoT and fog computing use cases, so it's
>>>> interesting to come here and learn about the architecture and applications
>>>> of HAWQ.
>>>>
>>>> I hope to help make your experience with HAWQ a good one.  If I can
>>>> help in any way, please reach out to me directly or on the list.
>>>>
>>>> Cheers,
>>>> Bob
>>>>
>>>>
>>>>
>>>> Bob Glithero | Product Marketing
>>>> Pivotal, Inc.
>>>> rglithero@pivotal.io | m: 415.341.5592
>>>>
>>>>
>>>> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <roman@shaposhnik.org
>>>> > wrote:
>>>>
>>>>> Greg, thanks for kicking off the roll call. Getting to know each other
>>>>> is super
>>>>> useful (and can be fun! ;-)). I'll go next:
>>>>>
>>>>> I am Roman (your friendly neighborhood mentor). I hang around a lot of
>>>>> ASF
>>>>> big data projects (as a committer and a PMC member), but lately I've
>>>>> been
>>>>> gravitating towards IoT as well (Apache Mynewt). I started my career
>>>>> at Sun
>>>>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>>>>> doing
>>>>> enterprise software ever since. I was lucky enough to get to work on
>>>>> the original
>>>>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>>>>> (Hadoop
>>>>> and Postgres). Recently I've assumed a position of VP of Technology at
>>>>> ODPi
>>>>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>>>>> co-founded)
>>>>> and I'm not afraid to use it!
>>>>>
>>>>> I'm here to help as much as I can to make sure that this community
>>>>> evolves into
>>>>> a vibrant, self-governed, exciting place worthy of being a top level
>>>>> project (TLP)
>>>>> at ASF. If you have any questions or ideas that you may want to bounce
>>>>> off of
>>>>> me -- please don't hesitate to reach out directly or on the mailing
>>>>> list.
>>>>>
>>>>> Thanks,
>>>>> Roman.
>>>>>
>>>>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io>
>>>>> wrote:
>>>>> >
>>>>> > Dear HAWQs,
>>>>> >
>>>>> > I thought it would be fun to get to know some of the other people in
>>>>> the community.
>>>>> >
>>>>> > My name is Greg Chase and I run community development for Pivotal
>>>>> for big data open source communities that Pivotal contributes to.
>>>>> >
>>>>> > Some of you may have seen my frequent emails about virtual events I
>>>>> help organize for user and contributor education.
>>>>> >
>>>>> > Not so long ago, I was in charge of product marketing for an
>>>>> in-memory data warehouse named after a Hawaiian town from a three-letter
>>>>> acronymed German Company. We treated Hadoop as an external table, and
>>>>> returning results from these queries was both slow and brittle due to the
>>>>> network transfer rates.
>>>>> >
>>>>> > So I have a special appreciation of the innovation that has gone
>>>>> into creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>>>>> >
>>>>> > These days I'm much more of a marketer than a coder, but I still
>>>>> love hearing about the kinds of projects that HAWQ users are involved in.
>>>>> >
>>>>> > I know we'd all love to hear more about everyone else's projects,
>>>>> and how you became a HAWQ user.  So please introduce yourselves!
>>>>> >
>>>>> > --
>>>>> > Greg Chase
>>>>> >
>>>>> > Global Head, Big Data Communities
>>>>> > http://www.pivotal.io/big-data
>>>>> >
>>>>> > Pivotal Software
>>>>> > http://www.pivotal.io/
>>>>> >
>>>>> > 650-215-0477
>>>>> > @GregChase
>>>>> > Blog: http://geekmarketing.biz/
>>>>> >
>>>>>
>>>>
>>>>
>>>
>>
>

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Wen Lin <wl...@pivotal.io>.
Hi, All,

My name is LinWen. I joined the Pivotal HAWQ Beijing team in November 2014.
Before that, I was an engineer at VMware.
I have implemented some features for HAWQ, such as the new fault tolerance
service, libyarn, etc.
I believe HAWQ can be the best SQL-on-Hadoop engine with our joint effort.

On Thu, Dec 15, 2016 at 11:03 PM, Lili Ma <li...@apache.org> wrote:

> Hello everyone,
>
> Glad to know everybody here:)
>
> I'm Lili Ma, from Pivotal HAWQ R&D team in Beijing. I have been focusing
> on HAWQ development and product management since 2012 when I joined
> Pivotal. I experienced and contributed in HAWQ's all growth path, from
> birth, Alpha, 1.X, 2.0...
>
> My main covering fields about HAWQ include three parts: 1) Storage such as
> internal table storage, HAWQ Input/OutputFormat, hawq extract/register,etc
> 2) Dispatcher and interconnect 3) Security including Ranger integration,
> Kerberos and LDAP.
>
> Before Pivotal, I worked at IBM for more than 2 years and focused on
> providing data service inside of our public cloud provision. The data
> service includes RDS(relational data service) which can provision a
> distributed relational database based on DB2 Federation, and NOSQL service
> which is based on HBase.
>
> I believe HAWQ can become more successful with our joint effort!  Welcome
> to reach me or this mail list for any HAWQ or other kinds of issues :)
>
> Thanks
> Lili
>
>
> 2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>:
>
>> I will add to the email flow…
>>
>> I am Dan Baskette, I am the Director of Tech Marketing for Pivotal and
>> cover Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache
>> MADlib.   I started my career at Sun Microsystems, and have been working
>> for EMC/Greenplum and now Pivotal since 2000….a LONG time in quite a number
>> of roles.   I was part of the team that launched Greenplum’s first Hadoop
>> distribution and was around for the birth of HAWQ or as we called it when
>> it was in it’s infancy…. GOH or Greenplum on Hadoop.   I have been actively
>> running some webcasts on various HAWQ how-to topics for Hortonworks, so you
>> can check those out on their site.
>>
>> Hoping this community really takes off in a big way!
>>
>> Dan
>>
>>
>> On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io) wrote:
>>
>> Hi All,
>>
>> Great for Gregory to start the thread that people can know each other
>> much better, at least in Apache HAWQ community!
>>
>> I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I am
>> from Teradata and joined Pivotal after that. It's my honor to be part of
>> HAWQ project at its early stage. I am a fan of RDBMS (especially MPP
>> database), big data, and cloud technology that changes the IT
>> infrastructure of the enterprises and helps to do information
>> transformation in a very large extent.
>>
>> I hope that with joint effort from hawq community, it will become even
>> greater product in big data area, especially in SQL-on-Hadoop category.
>>
>> Best regards,
>> Ruilong Huo
>>
>> On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>
>> wrote:
>>
>>> Hello all,
>>>
>>> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
>>> new-ish here, and not so much from a coding background as from networking.
>>> I'm from Cisco Systems, where I focused on analytics use cases in
>>> telecommunications, particularly for mobile network operators, for service
>>> assurance, customer care, and customer profiling.  (also, as you're
>>> introducing yourselves, we'd love to hear what use cases you're involved
>>> with, too).
>>>
>>> About a year before I left my group at Cisco acquired an MPP database of
>>> its own -- ParStream -- for its IoT and fog computing use cases, so it's
>>> interesting to come here and learn about the architecture and applications
>>> of HAWQ.
>>>
>>> I hope to help make your experience with HAWQ a good one.  If I can help
>>> in any way, please reach out to me directly or on the list.
>>>
>>> Cheers,
>>> Bob
>>>
>>>
>>>
>>> Bob Glithero | Product Marketing
>>> Pivotal, Inc.
>>> rglithero@pivotal.io | m: 415.341.5592
>>>
>>>
>>> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
>>> wrote:
>>>
>>>> Greg, thanks for kicking off the roll call. Getting to know each other
>>>> is super
>>>> useful (and can be fun! ;-)). I'll go next:
>>>>
>>>> I am Roman (your friendly neighborhood mentor). I hang around a lot of
>>>> ASF
>>>> big data projects (as a committer and a PMC member), but lately I've
>>>> been
>>>> gravitating towards IoT as well (Apache Mynewt). I started my career at
>>>> Sun
>>>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>>>> doing
>>>> enterprise software ever since. I was lucky enough to get to work on
>>>> the original
>>>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>>>> (Hadoop
>>>> and Postgres). Recently I've assumed a position of VP of Technology at
>>>> ODPi
>>>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>>>> co-founded)
>>>> and I'm not afraid to use it!
>>>>
>>>> I'm here to help as much as I can to make sure that this community
>>>> evolves into
>>>> a vibrant, self-governed, exciting place worthy of being a top level
>>>> project (TLP)
>>>> at ASF. If you have any questions or ideas that you may want to bounce
>>>> off of
>>>> me -- please don't hesitate to reach out directly or on the mailing
>>>> list.
>>>>
>>>> Thanks,
>>>> Roman.
>>>>
>>>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io>
>>>> wrote:
>>>> >
>>>> > Dear HAWQs,
>>>> >
>>>> > I thought it would be fun to get to know some of the other people in
>>>> the community.
>>>> >
>>>> > My name is Greg Chase and I run community development for Pivotal for
>>>> big data open source communities that Pivotal contributes to.
>>>> >
>>>> > Some of you may have seen my frequent emails about virtual events I
>>>> help organize for user and contributor education.
>>>> >
>>>> > Not so long ago, I was in charge of product marketing for an
>>>> in-memory data warehouse named after a Hawaiian town from a three-letter
>>>> acronymed German Company. We treated Hadoop as an external table, and
>>>> returning results from these queries was both slow and brittle due to the
>>>> network transfer rates.
>>>> >
>>>> > So I have a special appreciation of the innovation that has gone into
>>>> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>>>> >
>>>> > These days I'm much more of a marketer than a coder, but I still love
>>>> hearing about the kinds of projects that HAWQ users are involved in.
>>>> >
>>>> > I know we'd all love to hear more about everyone else's projects, and
>>>> how you became a HAWQ user.  So please introduce yourselves!
>>>> >
>>>> > --
>>>> > Greg Chase
>>>> >
>>>> > Global Head, Big Data Communities
>>>> > http://www.pivotal.io/big-data
>>>> >
>>>> > Pivotal Software
>>>> > http://www.pivotal.io/
>>>> >
>>>> > 650-215-0477
>>>> > @GregChase
>>>> > Blog: http://geekmarketing.biz/
>>>> >
>>>>
>>>
>>>
>>
>

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Lili Ma <li...@apache.org>.
Hello everyone,

Glad to know everybody here:)

I'm Lili Ma, from the Pivotal HAWQ R&D team in Beijing. I have been focusing on
HAWQ development and product management since 2012, when I joined Pivotal. I
have experienced and contributed to HAWQ's entire growth path, from birth
through Alpha, 1.X, 2.0...

My work on HAWQ covers three main areas: 1) storage, such as internal table
storage, HAWQ Input/OutputFormat, and hawq extract/register; 2) the dispatcher
and interconnect; 3) security, including Ranger integration, Kerberos and LDAP.

Before Pivotal, I worked at IBM for more than two years, focused on providing
data services inside our public cloud offering. The data services included
RDS (relational data service), which can provision a distributed relational
database based on DB2 Federation, and a NoSQL service based on HBase.

I believe HAWQ can become more successful with our joint effort!  Feel free to
reach out to me or this mailing list for any HAWQ or other kinds of issues :)

Thanks
Lili


2016-12-15 4:45 GMT+08:00 Dan Baskette <db...@gmail.com>:

> I will add to the email flow…
>
> I am Dan Baskette, I am the Director of Tech Marketing for Pivotal and
> cover Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache
> MADlib.   I started my career at Sun Microsystems, and have been working
> for EMC/Greenplum and now Pivotal since 2000….a LONG time in quite a number
> of roles.   I was part of the team that launched Greenplum’s first Hadoop
> distribution and was around for the birth of HAWQ or as we called it when
> it was in it’s infancy…. GOH or Greenplum on Hadoop.   I have been actively
> running some webcasts on various HAWQ how-to topics for Hortonworks, so you
> can check those out on their site.
>
> Hoping this community really takes off in a big way!
>
> Dan
>
>
> On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io) wrote:
>
> Hi All,
>
> Great for Gregory to start the thread that people can know each other much
> better, at least in Apache HAWQ community!
>
> I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I am
> from Teradata and joined Pivotal after that. It's my honor to be part of
> HAWQ project at its early stage. I am a fan of RDBMS (especially MPP
> database), big data, and cloud technology that changes the IT
> infrastructure of the enterprises and helps to do information
> transformation in a very large extent.
>
> I hope that with joint effort from hawq community, it will become even
> greater product in big data area, especially in SQL-on-Hadoop category.
>
> Best regards,
> Ruilong Huo
>
> On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io>
> wrote:
>
>> Hello all,
>>
>> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
>> new-ish here, and not so much from a coding background as from networking.
>> I'm from Cisco Systems, where I focused on analytics use cases in
>> telecommunications, particularly for mobile network operators, for service
>> assurance, customer care, and customer profiling.  (also, as you're
>> introducing yourselves, we'd love to hear what use cases you're involved
>> with, too).
>>
>> About a year before I left my group at Cisco acquired an MPP database of
>> its own -- ParStream -- for its IoT and fog computing use cases, so it's
>> interesting to come here and learn about the architecture and applications
>> of HAWQ.
>>
>> I hope to help make your experience with HAWQ a good one.  If I can help
>> in any way, please reach out to me directly or on the list.
>>
>> Cheers,
>> Bob
>>
>>
>>
>> Bob Glithero | Product Marketing
>> Pivotal, Inc.
>> rglithero@pivotal.io | m: 415.341.5592
>>
>>
>> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
>> wrote:
>>
>>> Greg, thanks for kicking off the roll call. Getting to know each other
>>> is super
>>> useful (and can be fun! ;-)). I'll go next:
>>>
>>> I am Roman (your friendly neighborhood mentor). I hang around a lot of
>>> ASF
>>> big data projects (as a committer and a PMC member), but lately I've been
>>> gravitating towards IoT as well (Apache Mynewt). I started my career at
>>> Sun
>>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>>> doing
>>> enterprise software ever since. I was lucky enough to get to work on
>>> the original
>>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>>> (Hadoop
>>> and Postgres). Recently I've assumed a position of VP of Technology at
>>> ODPi
>>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>>> co-founded)
>>> and I'm not afraid to use it!
>>>
>>> I'm here to help as much as I can to make sure that this community
>>> evolves into
>>> a vibrant, self-governed, exciting place worthy of being a top level
>>> project (TLP)
>>> at ASF. If you have any questions or ideas that you may want to bounce
>>> off of
>>> me -- please don't hesitate to reach out directly or on the mailing list.
>>>
>>> Thanks,
>>> Roman.
>>>
>>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io>
>>> wrote:
>>> >
>>> > Dear HAWQs,
>>> >
>>> > I thought it would be fun to get to know some of the other people in
>>> the community.
>>> >
>>> > My name is Greg Chase and I run community development for Pivotal for
>>> big data open source communities that Pivotal contributes to.
>>> >
>>> > Some of you may have seen my frequent emails about virtual events I
>>> help organize for user and contributor education.
>>> >
>>> > Not so long ago, I was in charge of product marketing for an in-memory
>>> data warehouse named after a Hawaiian town from a three-letter acronymed
>>> German Company. We treated Hadoop as an external table, and returning
>>> results from these queries was both slow and brittle due to the network
>>> transfer rates.
>>> >
>>> > So I have a special appreciation of the innovation that has gone into
>>> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>>> >
>>> > These days I'm much more of a marketer than a coder, but I still love
>>> hearing about the kinds of projects that HAWQ users are involved in.
>>> >
>>> > I know we'd all love to hear more about everyone else's projects, and
>>> how you became a HAWQ user.  So please introduce yourselves!
>>> >
>>> > --
>>> > Greg Chase
>>> >
>>> > Global Head, Big Data Communities
>>> > http://www.pivotal.io/big-data
>>> >
>>> > Pivotal Software
>>> > http://www.pivotal.io/
>>> >
>>> > 650-215-0477
>>> > @GregChase
>>> > Blog: http://geekmarketing.biz/
>>> >
>>>
>>
>>
>

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Dan Baskette <db...@gmail.com>.
I will add to the email flow…

I am Dan Baskette, the Director of Tech Marketing for Pivotal, and I cover
Pivotal HDB/Apache HAWQ, Pivotal Greenplum Database, and Apache MADlib. I
started my career at Sun Microsystems and have been working for EMC/Greenplum
and now Pivotal since 2000… a LONG time, in quite a number of roles. I was part
of the team that launched Greenplum’s first Hadoop distribution and was around
for the birth of HAWQ, or, as we called it in its infancy, GOH (Greenplum on
Hadoop). I have been actively running some webcasts on various HAWQ how-to
topics for Hortonworks, so you can check those out on their site.

Hoping this community really takes off in a big way!

Dan


On December 14, 2016 at 10:09:34 AM, Ruilong Huo (rhuo@pivotal.io) wrote:

Hi All,

Great for Gregory to start the thread that people can know each other much
better, at least in Apache HAWQ community!

I am Ruilong Huo and I am from HDB/HAWQ engineering team in Pivotal. I am
from Teradata and joined Pivotal after that. It's my honor to be part of
HAWQ project at its early stage. I am a fan of RDBMS (especially MPP
database), big data, and cloud technology that changes the IT
infrastructure of the enterprises and helps to do information
transformation in a very large extent.

I hope that with joint effort from hawq community, it will become even
greater product in big data area, especially in SQL-on-Hadoop category.

Best regards,
Ruilong Huo

On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io> wrote:

> Hello all,
>
> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
> new-ish here, and not so much from a coding background as from networking.
> I'm from Cisco Systems, where I focused on analytics use cases in
> telecommunications, particularly for mobile network operators, for service
> assurance, customer care, and customer profiling.  (also, as you're
> introducing yourselves, we'd love to hear what use cases you're involved
> with, too).
>
> About a year before I left my group at Cisco acquired an MPP database of
> its own -- ParStream -- for its IoT and fog computing use cases, so it's
> interesting to come here and learn about the architecture and applications
> of HAWQ.
>
> I hope to help make your experience with HAWQ a good one.  If I can help
> in any way, please reach out to me directly or on the list.
>
> Cheers,
> Bob
>
>
>
> Bob Glithero | Product Marketing
> Pivotal, Inc.
> rglithero@pivotal.io | m: 415.341.5592
>
>
> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
> wrote:
>
>> Greg, thanks for kicking off the roll call. Getting to know each other is
>> super
>> useful (and can be fun! ;-)). I'll go next:
>>
>> I am Roman (your friendly neighborhood mentor). I hang around a lot of ASF
>> big data projects (as a committer and a PMC member), but lately I've been
>> gravitating towards IoT as well (Apache Mynewt). I started my career at
>> Sun
>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>> doing
>> enterprise software ever since. I was lucky enough to get to work on
>> the original
>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>> (Hadoop
>> and Postgres). Recently I've assumed a position of VP of Technology at
>> ODPi
>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>> co-founded)
>> and I'm not afraid to use it!
>>
>> I'm here to help as much as I can to make sure that this community
>> evolves into
>> a vibrant, self-governed, exciting place worthy of being a top level
>> project (TLP)
>> at ASF. If you have any questions or ideas that you may want to bounce
>> off of
>> me -- please don't hesitate to reach out directly or on the mailing list.
>>
>> Thanks,
>> Roman.
>>
>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io> wrote:
>> >
>> > Dear HAWQs,
>> >
>> > I thought it would be fun to get to know some of the other people in
>> the community.
>> >
>> > My name is Greg Chase and I run community development for Pivotal for
>> big data open source communities that Pivotal contributes to.
>> >
>> > Some of you may have seen my frequent emails about virtual events I
>> help organize for user and contributor education.
>> >
>> > Not so long ago, I was in charge of product marketing for an in-memory
>> data warehouse named after a Hawaiian town from a three-letter acronymed
>> German Company. We treated Hadoop as an external table, and returning
>> results from these queries was both slow and brittle due to the network
>> transfer rates.
>> >
>> > So I have a special appreciation of the innovation that has gone into
>> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>> >
>> > These days I'm much more of a marketer than a coder, but I still love
>> hearing about the kinds of projects that HAWQ users are involved in.
>> >
>> > I know we'd all love to hear more about everyone else's projects, and
>> how you became a HAWQ user.  So please introduce yourselves!
>> >
>> > --
>> > Greg Chase
>> >
>> > Global Head, Big Data Communities
>> > http://www.pivotal.io/big-data
>> >
>> > Pivotal Software
>> > http://www.pivotal.io/
>> >
>> > 650-215-0477
>> > @GregChase
>> > Blog: http://geekmarketing.biz/
>> >
>>
>
>

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Ruilong Huo <rh...@pivotal.io>.
Hi All,

It's great that Gregory started this thread so that people can get to know
each other much better, at least in the Apache HAWQ community!

I am Ruilong Huo, from the HDB/HAWQ engineering team at Pivotal. I came from
Teradata and joined Pivotal after that. It's my honor to be part of the HAWQ
project at its early stage. I am a fan of RDBMS (especially MPP databases), big
data, and cloud technology that is changing the IT infrastructure of
enterprises and helping drive information transformation to a very large
extent.

I hope that with the joint effort of the HAWQ community, it will become an
even greater product in the big data area, especially in the SQL-on-Hadoop
category.

Best regards,
Ruilong Huo

On Wed, Dec 14, 2016 at 2:27 PM, Bob Glithero <rg...@pivotal.io> wrote:

> Hello all,
>
> I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
> new-ish here, and not so much from a coding background as from networking.
> I'm from Cisco Systems, where I focused on analytics use cases in
> telecommunications, particularly for mobile network operators, for service
> assurance, customer care, and customer profiling.  (also, as you're
> introducing yourselves, we'd love to hear what use cases you're involved
> with, too).
>
> About a year before I left my group at Cisco acquired an MPP database of
> its own -- ParStream -- for its IoT and fog computing use cases, so it's
> interesting to come here and learn about the architecture and applications
> of HAWQ.
>
> I hope to help make your experience with HAWQ a good one.  If I can help
> in any way, please reach out to me directly or on the list.
>
> Cheers,
> Bob
>
>
>
> Bob Glithero | Product Marketing
> Pivotal, Inc.
> rglithero@pivotal.io | m: 415.341.5592
>
>
> On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
> wrote:
>
>> Greg, thanks for kicking off the roll call. Getting to know each other is
>> super
>> useful (and can be fun! ;-)). I'll go next:
>>
>> I am Roman (your friendly neighborhood mentor). I hang around a lot of ASF
>> big data projects (as a committer and a PMC member), but lately I've been
>> gravitating towards IoT as well (Apache Mynewt). I started my career at
>> Sun
>> microsystems back at a time when Linux  wasn't even 1.x and I've been
>> doing
>> enterprise software ever since. I was lucky enough to get to work on
>> the original
>> Hadoop team at Yahoo! and fall in love with not one but two elephants
>> (Hadoop
>> and Postgres). Recently I've assumed a position of VP of Technology at
>> ODPi
>> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
>> co-founded)
>> and I'm not afraid to use it!
>>
>> I'm here to help as much as I can to make sure that this community
>> evolves into
>> a vibrant, self-governed, exciting place worthy of being a top level
>> project (TLP)
>> at ASF. If you have any questions or ideas that you may want to bounce
>> off of
>> me -- please don't hesitate to reach out directly or on the mailing list.
>>
>> Thanks,
>> Roman.
>>
>> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io> wrote:
>> >
>> > Dear HAWQs,
>> >
>> > I thought it would be fun to get to know some of the other people in
>> the community.
>> >
>> > My name is Greg Chase and I run community development for Pivotal for
>> big data open source communities that Pivotal contributes to.
>> >
>> > Some of you may have seen my frequent emails about virtual events I
>> help organize for user and contributor education.
>> >
>> > Not so long ago, I was in charge of product marketing for an in-memory
>> data warehouse named after a Hawaiian town from a three-letter acronymed
>> German Company. We treated Hadoop as an external table, and returning
>> results from these queries was both slow and brittle due to the network
>> transfer rates.
>> >
>> > So I have a special appreciation of the innovation that has gone into
>> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>> >
>> > These days I'm much more of a marketer than a coder, but I still love
>> hearing about the kinds of projects that HAWQ users are involved in.
>> >
>> > I know we'd all love to hear more about everyone else's projects, and
>> how you became a HAWQ user.  So please introduce yourselves!
>> >
>> > --
>> > Greg Chase
>> >
>> > Global Head, Big Data Communities
>> > http://www.pivotal.io/big-data
>> >
>> > Pivotal Software
>> > http://www.pivotal.io/
>> >
>> > 650-215-0477
>> > @GregChase
>> > Blog: http://geekmarketing.biz/
>> >
>>
>
>

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Bob Glithero <rg...@pivotal.io>.
Hello all,

I'm Bob, and I'm doing product marketing for HDB/HAWQ at Pivotal.  I'm
new-ish here, and not so much from a coding background as from networking.
I'm from Cisco Systems, where I focused on analytics use cases in
telecommunications, particularly for mobile network operators, for service
assurance, customer care, and customer profiling.  (also, as you're
introducing yourselves, we'd love to hear what use cases you're involved
with, too).

About a year before I left, my group at Cisco acquired an MPP database of
its own -- ParStream -- for its IoT and fog computing use cases, so it's
interesting to come here and learn about the architecture and applications
of HAWQ.

I hope to help make your experience with HAWQ a good one.  If I can help in
any way, please reach out to me directly or on the list.

Cheers,
Bob



Bob Glithero | Product Marketing
Pivotal, Inc.
rglithero@pivotal.io | m: 415.341.5592


On Sun, Dec 11, 2016 at 6:56 PM, Roman Shaposhnik <ro...@shaposhnik.org>
wrote:

> Greg, thanks for kicking off the roll call. Getting to know each other is
> super
> useful (and can be fun! ;-)). I'll go next:
>
> I am Roman (your friendly neighborhood mentor). I hang around a lot of ASF
> big data projects (as a committer and a PMC member), but lately I've been
> gravitating towards IoT as well (Apache Mynewt). I started my career at Sun
> microsystems back at a time when Linux  wasn't even 1.x and I've been doing
> enterprise software ever since. I was lucky enough to get to work on
> the original
> Hadoop team at Yahoo! and fall in love with not one but two elephants
> (Hadoop
> and Postgres). Recently I've assumed a position of VP of Technology at ODPi
> and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which
> co-founded)
> and I'm not afraid to use it!
>
> I'm here to help as much as I can to make sure that this community evolves
> into
> a vibrant, self-governed, exciting place worthy of being a top level
> project (TLP)
> at ASF. If you have any questions or ideas that you may want to bounce off
> of
> me -- please don't hesitate to reach out directly or on the mailing list.
>
> Thanks,
> Roman.
>
> On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io> wrote:
> >
> > Dear HAWQs,
> >
> > I thought it would be fun to get to know some of the other people in the
> community.
> >
> > My name is Greg Chase and I run community development for Pivotal for
> big data open source communities that Pivotal contributes to.
> >
> > Some of you may have seen my frequent emails about virtual events I help
> organize for user and contributor education.
> >
> > Not so long ago, I was in charge of product marketing for an in-memory
> data warehouse named after a Hawaiian town from a three-letter acronymed
> German Company. We treated Hadoop as an external table, and returning
> results from these queries was both slow and brittle due to the network
> transfer rates.
> >
> > So I have a special appreciation of the innovation that has gone into
> creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
> >
> > These days I'm much more of a marketer than a coder, but I still love
> hearing about the kinds of projects that HAWQ users are involved in.
> >
> > I know we'd all love to hear more about everyone else's projects, and
> how you became a HAWQ user.  So please introduce yourselves!
> >
> > --
> > Greg Chase
> >
> > Global Head, Big Data Communities
> > http://www.pivotal.io/big-data
> >
> > Pivotal Software
> > http://www.pivotal.io/
> >
> > 650-215-0477
> > @GregChase
> > Blog: http://geekmarketing.biz/
> >
>

Re: [INTRODUCTIONS] Hi I'm Greg and I'm part of the Apache HAWQ community

Posted by Roman Shaposhnik <ro...@shaposhnik.org>.
Greg, thanks for kicking off the roll call. Getting to know each other is super
useful (and can be fun! ;-)). I'll go next:

I am Roman (your friendly neighborhood mentor). I hang around a lot of ASF
big data projects (as a committer and a PMC member), but lately I've been
gravitating towards IoT as well (Apache Mynewt). I started my career at Sun
Microsystems back at a time when Linux wasn't even 1.x and I've been doing
enterprise software ever since. I was lucky enough to get to work on
the original
Hadoop team at Yahoo! and fall in love with not one but two elephants (Hadoop
and Postgres). Recently I've assumed a position of VP of Technology at ODPi
and I'm still hoping to MHGA! My secret weapon is Apache Bigtop (which I
co-founded), and I'm not afraid to use it!

I'm here to help as much as I can to make sure that this community evolves into
a vibrant, self-governed, exciting place worthy of being a top level
project (TLP)
at ASF. If you have any questions or ideas that you may want to bounce off of
me -- please don't hesitate to reach out directly or on the mailing list.

Thanks,
Roman.

On Fri, Dec 9, 2016 at 11:53 AM, Gregory Chase <gc...@pivotal.io> wrote:
>
> Dear HAWQs,
>
> I thought it would be fun to get to know some of the other people in the community.
>
> My name is Greg Chase and I run community development for Pivotal for big data open source communities that Pivotal contributes to.
>
> Some of you may have seen my frequent emails about virtual events I help organize for user and contributor education.
>
> Not so long ago, I was in charge of product marketing for an in-memory data warehouse named after a Hawaiian town from a three-letter acronymed German Company. We treated Hadoop as an external table, and returning results from these queries was both slow and brittle due to the network transfer rates.
>
> So I have a special appreciation of the innovation that has gone into creating Hadoop-native HAWQ out of PostgreSQL and Greenplum.
>
> These days I'm much more of a marketer than a coder, but I still love hearing about the kinds of projects that HAWQ users are involved in.
>
> I know we'd all love to hear more about everyone else's projects, and how you became a HAWQ user.  So please introduce yourselves!
>
> --
> Greg Chase
>
> Global Head, Big Data Communities
> http://www.pivotal.io/big-data
>
> Pivotal Software
> http://www.pivotal.io/
>
> 650-215-0477
> @GregChase
> Blog: http://geekmarketing.biz/
>