Posted to dev@spark.apache.org by pritish <pr...@nirvana-international.com> on 2014/08/04 02:13:51 UTC

I would like to contribute

Hi

We would like to contribute to Spark, but we are not sure how. We can offer
project management and release management to begin with. Please advise on how to
get engaged.

Thank you!!

Regards
Pritish
Nirvana International Inc.
Big Data, Hadoop, Oracle EBS and IT Solutions
VA - SWaM, MD - MBE Certified Company
pritish@nirvana-international.com
http://www.nirvana-international.com

> On August 2, 2014 at 9:04 PM Anand Avati <av...@gluster.org> wrote:
>
>
> We are currently blocked on the non-availability of the following external
> dependencies for porting Spark to Scala 2.11 [SPARK-1812 Jira]:
>
> - akka-*_2.11 (2.3.4-shaded-protobuf from org.spark-project). The shaded
> protobuf needs to be 2.5.0, and the shading is needed because Hadoop1
> specifically needs protobuf 2.4. The issues arising from this
> incompatibility are already explained in the SPARK-1812 Jira.
>
> - chill_2.11 (0.4 from com.twitter) for core
> - algebird_2.11 (0.7 from com.twitter) for examples
> - kafka_2.11 (0.8 from org.apache) for external/kafka and examples
> - akka-zeromq_2.11 (2.3.4 from com.typesafe, but probably not needed if a
> shaded-protobuf version is released from org.spark-project)
>
> First,
> Who do I pester to get org.spark-project artifacts published for the akka
> shaded-protobuf version?
>
> Second,
> In the past what has been the convention to request/pester external
> projects to re-release artifacts in a new Scala version?
>
> Thanks!
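
[Editor's note: the cross-version naming and shading constraints described in the message above can be sketched in an sbt build definition. This is an illustration only, not Spark's actual build: the coordinates are taken from the list above, the `org.spark-project.akka` organization and the `org.spark_project.protobuf` relocation package are assumptions, and the `ShadeRule` shown is how the akka artifact's publisher might do the relocation, not something Spark's own build would contain.]

```scala
// build.sbt sketch (assumes the sbt-assembly plugin is on the plugin
// classpath). With scalaVersion set to a 2.11 release, the %% operator
// appends "_2.11" to each artifact name -- which is exactly why every
// dependency below must first be published for Scala 2.11.

scalaVersion := "2.11.0"

libraryDependencies ++= Seq(
  // Hypothetical coordinates: org.spark-project republishes akka with
  // protobuf 2.5.0 shaded in, so it can coexist with Hadoop1's 2.4.
  "org.spark-project.akka" %% "akka-actor" % "2.3.4-shaded-protobuf",
  "com.twitter"            %% "chill"      % "0.4", // core
  "com.twitter"            %% "algebird"   % "0.7", // examples
  "org.apache.kafka"       %% "kafka"      % "0.8"  // external/kafka
)

// Shading as the akka republisher might do it: relocate the bundled
// protobuf 2.5.0 classes into a private package so they cannot clash
// with the protobuf 2.4 classes that Hadoop1 puts on the classpath.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "org.spark_project.protobuf.@1").inAll
)
```

The shading step is what makes the "-shaded-protobuf" artifacts safe: the relocated copy of protobuf 2.5.0 and Hadoop1's protobuf 2.4 end up under different package names, so both can live on one classpath.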

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: I would like to contribute

Posted by Josh Rosen <ro...@gmail.com>.
The Contributing to Spark guide on the Spark Wiki provides a good overview of how to start contributing:

https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark


On August 3, 2014 at 5:14:23 PM, pritish (pritish@nirvana-international.com) wrote:

[quoted message trimmed; see the original post above]
