Posted to dev@spark.apache.org by Nicholas Chammas <ni...@gmail.com> on 2014/07/22 00:42:31 UTC

Contributing to Spark needs PySpark build/test instructions

Contributing to Spark
<https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark>
needs a line or two about building and testing PySpark. A call-out to
run-tests, for example, would be helpful for new contributors to PySpark.

Nick
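For readers arriving from the archive: a minimal sketch of the PySpark build/test flow this thread is asking to document, using the script names from the Spark source tree of that era (exact invocations may differ between Spark versions, so treat this as illustrative rather than canonical):

```shell
# Build the Spark assembly first: PySpark drives the JVM side
# through Py4J, so the Scala/Java classes must be compiled.
sbt/sbt assembly

# Run only the PySpark test suite.
python/run-tests

# Or run the full pre-submission check, which covers Scala/Java
# tests, PySpark tests, and style checks.
dev/run-tests
```

All three scripts are run from the root of a Spark checkout.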

Re: Contributing to Spark needs PySpark build/test instructions

Posted by Nicholas Chammas <ni...@gmail.com>.
That works! Thank you.



Re: Contributing to Spark needs PySpark build/test instructions

Posted by Reynold Xin <rx...@databricks.com>.
I missed that bullet point. I removed it and just pointed it towards the
instructions.



Re: Contributing to Spark needs PySpark build/test instructions

Posted by Nicholas Chammas <ni...@gmail.com>.
Looks good. Does sbt/sbt test cover the same tests as dev/run-tests?

I’m looking at step 5 under “Contributing Code”. Someone contributing to
PySpark will want to be directed to run something in addition to (or
instead of) sbt/sbt test, I believe.

Nick



Re: Contributing to Spark needs PySpark build/test instructions

Posted by Reynold Xin <rx...@databricks.com>.
I added an automated testing section:
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-AutomatedTesting

Can you take a look to see if it is what you had in mind?




Re: Contributing to Spark needs PySpark build/test instructions

Posted by Nicholas Chammas <ni...@gmail.com>.
For the record, the triggering discussion is here
<https://github.com/apache/spark/pull/1505#issuecomment-49671550>. I
assumed that sbt/sbt test covers all the tests required before submitting a
patch, and it appears that it doesn’t.

