Posted to dev@spark.apache.org by Xiangrui Meng <me...@gmail.com> on 2014/12/22 21:37:43 UTC

Announcing Spark Packages

Dear Spark users and developers,

I’m happy to announce Spark Packages (http://spark-packages.org), a
community package index to track the growing number of open source
packages and libraries that work with Apache Spark. Spark Packages
makes it easy for users to find, discuss, rate, and install packages
for any version of Spark, and makes it easy for developers to
contribute packages.

Spark Packages will feature integrations with various data sources,
management tools, higher level domain-specific libraries, machine
learning algorithms, code samples, and other Spark content. Thanks to
the package authors, the initial listing of packages includes
scientific computing libraries, a job execution server, a connector
for importing Avro data, tools for launching Spark on Google Compute
Engine, and many others.
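As an illustrative sketch (not part of the original announcement): a package listed on the index is typically consumed by passing its Maven coordinates to the Spark shell. The coordinates below for the Avro connector are hypothetical examples, and flag support depends on your Spark version.

```shell
#!/bin/sh
# Hypothetical example: launch spark-shell with a package from the index.
# Coordinates follow the usual groupId:artifactId:version Maven scheme;
# the version shown here is a placeholder, not a verified release.
PACKAGE="com.databricks:spark-avro_2.10:1.0.0"

# Print the command rather than running it, since a local Spark
# installation is assumed but not guaranteed.
echo "spark-shell --packages ${PACKAGE}"
```

The same `--packages` style coordinates would be passed to `spark-submit` for batch jobs, under the same assumptions.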

I’d like to invite you to contribute and use Spark Packages and
provide feedback! As a disclaimer: Spark Packages is a community index
maintained by Databricks and (by design) will include packages outside
of the ASF Spark project. We are excited to help showcase and support
all of the great work going on in the broader Spark community!

Cheers,
Xiangrui

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Announcing Spark Packages

Posted by Nicholas Chammas <ni...@gmail.com>.
Okie doke! (I just assumed there was an issue since the policy was brought
up.)

On Mon Dec 22 2014 at 8:33:53 PM Patrick Wendell <pw...@gmail.com> wrote:

> Hey Nick,
>
> I think Hitesh was just trying to be helpful and point out the policy
> - not necessarily saying there was an issue. We've taken a close look
> at this and I think we're in good shape here vis-a-vis this policy.
>
> - Patrick
>
> On Mon, Dec 22, 2014 at 5:29 PM, Nicholas Chammas
> <ni...@gmail.com> wrote:
> > Hitesh,
> >
> > From your link:
> >
> > You may not use ASF trademarks such as "Apache" or "ApacheFoo" or "Foo"
> in
> > your own domain names if that use would be likely to confuse a relevant
> > consumer about the source of software or services provided through your
> > website, without written approval of the VP, Apache Brand Management or
> > designee.
> >
> > The title on the packages website is "A community index of packages for
> > Apache Spark." Furthermore, the footnote of the website reads "Spark
> > Packages is a community site hosting modules that are not part of Apache
> > Spark."
> >
> > I think there's nothing on there that would "confuse a relevant consumer
> > about the source of software". It's pretty clear that the Spark Packages
> > name is well within the ASF's guidelines.
> >
> > Have I misunderstood the ASF's policy?
> >
> > Nick
> >
> >
> > On Mon Dec 22 2014 at 6:40:10 PM Hitesh Shah <hi...@apache.org> wrote:
> >>
> >> Hello Xiangrui,
> >>
> >> If you have not already done so, you should look at
> >> http://www.apache.org/foundation/marks/#domains for the policy on use
> of ASF
> >> trademarked terms in domain names.
> >>
> >> thanks
> >> -- Hitesh
> >>
> >> On Dec 22, 2014, at 12:37 PM, Xiangrui Meng <me...@gmail.com> wrote:
> >>
> >> > Dear Spark users and developers,
> >> >
> >> > I'm happy to announce Spark Packages (http://spark-packages.org), a
> >> > community package index to track the growing number of open source
> >> > packages and libraries that work with Apache Spark. Spark Packages
> >> > makes it easy for users to find, discuss, rate, and install packages
> >> > for any version of Spark, and makes it easy for developers to
> >> > contribute packages.
> >> >
> >> > Spark Packages will feature integrations with various data sources,
> >> > management tools, higher level domain-specific libraries, machine
> >> > learning algorithms, code samples, and other Spark content. Thanks to
> >> > the package authors, the initial listing of packages includes
> >> > scientific computing libraries, a job execution server, a connector
> >> > for importing Avro data, tools for launching Spark on Google Compute
> >> > Engine, and many others.
> >> >
> >> > I'd like to invite you to contribute and use Spark Packages and
> >> > provide feedback! As a disclaimer: Spark Packages is a community index
> >> > maintained by Databricks and (by design) will include packages outside
> >> > of the ASF Spark project. We are excited to help showcase and support
> >> > all of the great work going on in the broader Spark community!
> >> >
> >> > Cheers,
> >> > Xiangrui
> >> >
> >> > ---------------------------------------------------------------------
> >> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> >> > For additional commands, e-mail: dev-help@spark.apache.org
> >> >
> >>
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> >> For additional commands, e-mail: user-help@spark.apache.org
> >>
> >
>


Re: Announcing Spark Packages

Posted by Patrick Wendell <pw...@gmail.com>.
Hey Nick,

I think Hitesh was just trying to be helpful and point out the policy
- not necessarily saying there was an issue. We've taken a close look
at this and I think we're in good shape here vis-a-vis this policy.

- Patrick

On Mon, Dec 22, 2014 at 5:29 PM, Nicholas Chammas
<ni...@gmail.com> wrote:
> Hitesh,
>
> From your link:
>
> You may not use ASF trademarks such as "Apache" or "ApacheFoo" or "Foo" in
> your own domain names if that use would be likely to confuse a relevant
> consumer about the source of software or services provided through your
> website, without written approval of the VP, Apache Brand Management or
> designee.
>
> The title on the packages website is "A community index of packages for
> Apache Spark." Furthermore, the footnote of the website reads "Spark
> Packages is a community site hosting modules that are not part of Apache
> Spark."
>
> I think there's nothing on there that would "confuse a relevant consumer
> about the source of software". It's pretty clear that the Spark Packages
> name is well within the ASF's guidelines.
>
> Have I misunderstood the ASF's policy?
>
> Nick
>
>
> On Mon Dec 22 2014 at 6:40:10 PM Hitesh Shah <hi...@apache.org> wrote:
>>
>> Hello Xiangrui,
>>
>> If you have not already done so, you should look at
>> http://www.apache.org/foundation/marks/#domains for the policy on use of ASF
>> trademarked terms in domain names.
>>
>> thanks
>> -- Hitesh
>>
>> On Dec 22, 2014, at 12:37 PM, Xiangrui Meng <me...@gmail.com> wrote:
>>
>> > Dear Spark users and developers,
>> >
>> > I'm happy to announce Spark Packages (http://spark-packages.org), a
>> > community package index to track the growing number of open source
>> > packages and libraries that work with Apache Spark. Spark Packages
>> > makes it easy for users to find, discuss, rate, and install packages
>> > for any version of Spark, and makes it easy for developers to
>> > contribute packages.
>> >
>> > Spark Packages will feature integrations with various data sources,
>> > management tools, higher level domain-specific libraries, machine
>> > learning algorithms, code samples, and other Spark content. Thanks to
>> > the package authors, the initial listing of packages includes
>> > scientific computing libraries, a job execution server, a connector
>> > for importing Avro data, tools for launching Spark on Google Compute
>> > Engine, and many others.
>> >
>> > I'd like to invite you to contribute and use Spark Packages and
>> > provide feedback! As a disclaimer: Spark Packages is a community index
>> > maintained by Databricks and (by design) will include packages outside
>> > of the ASF Spark project. We are excited to help showcase and support
>> > all of the great work going on in the broader Spark community!
>> >
>> > Cheers,
>> > Xiangrui
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> > For additional commands, e-mail: dev-help@spark.apache.org
>> >
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>





Re: Announcing Spark Packages

Posted by Nicholas Chammas <ni...@gmail.com>.
Hitesh,

From your link <http://www.apache.org/foundation/marks/#domains>:

You may not use ASF trademarks such as “Apache” or “ApacheFoo” or “Foo” in
your own domain names if that use would be likely to confuse a relevant
consumer about the source of software or services provided through your
website, without written approval of the VP, Apache Brand Management or
designee.

The title on the packages website is “A community index of packages for
Apache Spark.” Furthermore, the footnote of the website reads “Spark
Packages is a community site hosting modules that are not part of Apache
Spark.”

I think there’s nothing on there that would “confuse a relevant consumer
about the source of software”. It’s pretty clear that the Spark Packages
name is well within the ASF’s guidelines.

Have I misunderstood the ASF’s policy?

Nick

On Mon Dec 22 2014 at 6:40:10 PM Hitesh Shah <hi...@apache.org> wrote:

> Hello Xiangrui,
>
> If you have not already done so, you should look at http://www.apache.org/
> foundation/marks/#domains for the policy on use of ASF trademarked terms
> in domain names.
>
> thanks
> — Hitesh
>
> On Dec 22, 2014, at 12:37 PM, Xiangrui Meng <me...@gmail.com> wrote:
>
> > Dear Spark users and developers,
> >
> > I’m happy to announce Spark Packages (http://spark-packages.org), a
> > community package index to track the growing number of open source
> > packages and libraries that work with Apache Spark. Spark Packages
> > makes it easy for users to find, discuss, rate, and install packages
> > for any version of Spark, and makes it easy for developers to
> > contribute packages.
> >
> > Spark Packages will feature integrations with various data sources,
> > management tools, higher level domain-specific libraries, machine
> > learning algorithms, code samples, and other Spark content. Thanks to
> > the package authors, the initial listing of packages includes
> > scientific computing libraries, a job execution server, a connector
> > for importing Avro data, tools for launching Spark on Google Compute
> > Engine, and many others.
> >
> > I’d like to invite you to contribute and use Spark Packages and
> > provide feedback! As a disclaimer: Spark Packages is a community index
> > maintained by Databricks and (by design) will include packages outside
> > of the ASF Spark project. We are excited to help showcase and support
> > all of the great work going on in the broader Spark community!
> >
> > Cheers,
> > Xiangrui
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> > For additional commands, e-mail: dev-help@spark.apache.org
> >
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>


Re: Announcing Spark Packages

Posted by Hitesh Shah <hi...@apache.org>.
Hello Xiangrui, 

If you have not already done so, you should look at http://www.apache.org/foundation/marks/#domains for the policy on use of ASF trademarked terms in domain names. 

thanks
— Hitesh

On Dec 22, 2014, at 12:37 PM, Xiangrui Meng <me...@gmail.com> wrote:

> Dear Spark users and developers,
> 
> I’m happy to announce Spark Packages (http://spark-packages.org), a
> community package index to track the growing number of open source
> packages and libraries that work with Apache Spark. Spark Packages
> makes it easy for users to find, discuss, rate, and install packages
> for any version of Spark, and makes it easy for developers to
> contribute packages.
> 
> Spark Packages will feature integrations with various data sources,
> management tools, higher level domain-specific libraries, machine
> learning algorithms, code samples, and other Spark content. Thanks to
> the package authors, the initial listing of packages includes
> scientific computing libraries, a job execution server, a connector
> for importing Avro data, tools for launching Spark on Google Compute
> Engine, and many others.
> 
> I’d like to invite you to contribute and use Spark Packages and
> provide feedback! As a disclaimer: Spark Packages is a community index
> maintained by Databricks and (by design) will include packages outside
> of the ASF Spark project. We are excited to help showcase and support
> all of the great work going on in the broader Spark community!
> 
> Cheers,
> Xiangrui
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
> 




Re: Announcing Spark Packages

Posted by Patrick Wendell <pw...@gmail.com>.
Xiangrui asked me to report that it's back and running :)

On Mon, Dec 22, 2014 at 3:21 PM, peng <pc...@uowmail.edu.au> wrote:
> Me 2 :)
>
>
> On 12/22/2014 06:14 PM, Andrew Ash wrote:
>
> Hi Xiangrui,
>
> That link is currently returning a 503 Over Quota error message.  Would you
> mind pinging back out when the page is back up?
>
> Thanks!
> Andrew
>
> On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng <me...@gmail.com> wrote:
>>
>> Dear Spark users and developers,
>>
>> I'm happy to announce Spark Packages (http://spark-packages.org), a
>> community package index to track the growing number of open source
>> packages and libraries that work with Apache Spark. Spark Packages
>> makes it easy for users to find, discuss, rate, and install packages
>> for any version of Spark, and makes it easy for developers to
>> contribute packages.
>>
>> Spark Packages will feature integrations with various data sources,
>> management tools, higher level domain-specific libraries, machine
>> learning algorithms, code samples, and other Spark content. Thanks to
>> the package authors, the initial listing of packages includes
>> scientific computing libraries, a job execution server, a connector
>> for importing Avro data, tools for launching Spark on Google Compute
>> Engine, and many others.
>>
>> I'd like to invite you to contribute and use Spark Packages and
>> provide feedback! As a disclaimer: Spark Packages is a community index
>> maintained by Databricks and (by design) will include packages outside
>> of the ASF Spark project. We are excited to help showcase and support
>> all of the great work going on in the broader Spark community!
>>
>> Cheers,
>> Xiangrui
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>
>





Re: Announcing Spark Packages

Posted by peng <pc...@uowmail.edu.au>.
Me 2 :)

On 12/22/2014 06:14 PM, Andrew Ash wrote:
> Hi Xiangrui,
>
> That link is currently returning a 503 Over Quota error message.  
> Would you mind pinging back out when the page is back up?
>
> Thanks!
> Andrew
>
> On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng <mengxr@gmail.com 
> <ma...@gmail.com>> wrote:
>
>     Dear Spark users and developers,
>
>     I’m happy to announce Spark Packages (http://spark-packages.org), a
>     community package index to track the growing number of open source
>     packages and libraries that work with Apache Spark. Spark Packages
>     makes it easy for users to find, discuss, rate, and install packages
>     for any version of Spark, and makes it easy for developers to
>     contribute packages.
>
>     Spark Packages will feature integrations with various data sources,
>     management tools, higher level domain-specific libraries, machine
>     learning algorithms, code samples, and other Spark content. Thanks to
>     the package authors, the initial listing of packages includes
>     scientific computing libraries, a job execution server, a connector
>     for importing Avro data, tools for launching Spark on Google Compute
>     Engine, and many others.
>
>     I’d like to invite you to contribute and use Spark Packages and
>     provide feedback! As a disclaimer: Spark Packages is a community index
>     maintained by Databricks and (by design) will include packages outside
>     of the ASF Spark project. We are excited to help showcase and support
>     all of the great work going on in the broader Spark community!
>
>     Cheers,
>     Xiangrui
>
>     ---------------------------------------------------------------------
>     To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>     <ma...@spark.apache.org>
>     For additional commands, e-mail: user-help@spark.apache.org
>     <ma...@spark.apache.org>
>
>


Re: Announcing Spark Packages

Posted by Andrew Ash <an...@andrewash.com>.
Hi Xiangrui,

That link is currently returning a 503 Over Quota error message.  Would you
mind pinging back out when the page is back up?

Thanks!
Andrew

On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng <me...@gmail.com> wrote:

> Dear Spark users and developers,
>
> I’m happy to announce Spark Packages (http://spark-packages.org), a
> community package index to track the growing number of open source
> packages and libraries that work with Apache Spark. Spark Packages
> makes it easy for users to find, discuss, rate, and install packages
> for any version of Spark, and makes it easy for developers to
> contribute packages.
>
> Spark Packages will feature integrations with various data sources,
> management tools, higher level domain-specific libraries, machine
> learning algorithms, code samples, and other Spark content. Thanks to
> the package authors, the initial listing of packages includes
> scientific computing libraries, a job execution server, a connector
> for importing Avro data, tools for launching Spark on Google Compute
> Engine, and many others.
>
> I’d like to invite you to contribute and use Spark Packages and
> provide feedback! As a disclaimer: Spark Packages is a community index
> maintained by Databricks and (by design) will include packages outside
> of the ASF Spark project. We are excited to help showcase and support
> all of the great work going on in the broader Spark community!
>
> Cheers,
> Xiangrui
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>

