Posted to dev@spark.apache.org by Tony Reix <to...@bull.net> on 2015/01/08 14:40:49 UTC

Results of tests

Hi,
I'm checking that Spark works fine on a new environment (PPC64 hardware).
I've found some issues with versions 1.1.0, 1.1.1, and 1.2.0, even when running on Ubuntu on x86_64 with the Oracle JVM. I'd like to know where I can find the official test results for each Spark version, so that I have a reference to compare my results against. I cannot find them on the Spark web site.
Thx
Tony


Re: Results of tests

Posted by Nicholas Chammas <ni...@gmail.com>.
Just created: "Integrate Python unit tests into Jenkins"

https://issues.apache.org/jira/browse/SPARK-5178

Nick



Re: Results of tests

Posted by Ted Yu <yu...@gmail.com>.
I noticed that org.apache.spark.sql.hive.execution has a lot of skipped
tests.

Is there a plan to enable these tests on Jenkins (so that there is no
regression across releases)?

Cheers
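
The skipped-test counts Ted mentions live in the JUnit-style XML reports that Jenkins ingests. A minimal stdlib sketch of tallying them per suite (the file shape and counts here are illustrative, not taken from the actual Spark build):

```python
# Sketch: count skipped tests per suite in JUnit-style XML reports,
# the format Jenkins reads (e.g. from target/surefire-reports).
import xml.etree.ElementTree as ET

def skipped_per_suite(xml_text):
    """Return {suite name: number of test cases marked <skipped/>}."""
    root = ET.fromstring(xml_text)
    # A report file may have a single <testsuite> root or a <testsuites> wrapper.
    suites = [root] if root.tag == "testsuite" else root.findall("testsuite")
    counts = {}
    for suite in suites:
        n = sum(1 for case in suite.iter("testcase")
                if case.find("skipped") is not None)
        counts[suite.get("name", "unknown")] = n
    return counts

# Illustrative report; the real hive.execution suites are much larger.
report = """<testsuite name="org.apache.spark.sql.hive.execution" tests="3">
  <testcase name="t1"/>
  <testcase name="t2"><skipped/></testcase>
  <testcase name="t3"><skipped/></testcase>
</testsuite>"""

print(skipped_per_suite(report))
```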


Re: Results of tests

Posted by Josh Rosen <ro...@gmail.com>.
The "Test Result" pages for Jenkins builds show some nice statistics for
the test run, including individual test times:

https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/

Currently this only covers the Java / Scala tests, but we might be able to
integrate the PySpark tests here, too (I think it's just a matter of
getting the Python test runner to generate the correct test result XML
output).
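
The "correct test result XML output" Josh refers to is the JUnit report format. A minimal, stdlib-only sketch of emitting it from Python test outcomes (element and attribute names follow the common JUnit schema; this is not Spark's actual Python test runner):

```python
# Sketch: produce a minimal JUnit-style result file from a list of
# Python test outcomes, the kind of XML Jenkins' "Test Result" page reads.
import xml.etree.ElementTree as ET

def to_junit_xml(suite_name, results):
    """results: list of (test_name, status), status in
    {'passed', 'failed', 'skipped'}. Returns the report XML as a string."""
    suite = ET.Element(
        "testsuite",
        name=suite_name,
        tests=str(len(results)),
        failures=str(sum(s == "failed" for _, s in results)),
        skipped=str(sum(s == "skipped" for _, s in results)))
    for name, status in results:
        case = ET.SubElement(suite, "testcase", name=name)
        if status == "failed":
            ET.SubElement(case, "failure", message="assertion failed")
        elif status == "skipped":
            ET.SubElement(case, "skipped")
    return ET.tostring(suite, encoding="unicode")

# Hypothetical suite and test names, for illustration only.
xml_out = to_junit_xml("pyspark.tests", [("test_rdd", "passed"),
                                         ("test_broadcast", "skipped")])
print(xml_out)
```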


Re: Results of tests

Posted by Ted Yu <yu...@gmail.com>.
For a build which uses JUnit, we would see a summary such as the following (
https://builds.apache.org/job/HBase-TRUNK/6007/console):

Tests run: 2199, Failures: 0, Errors: 0, Skipped: 25


In https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull
, I don't see such statistics.


Looks like scalatest-maven-plugin can be enhanced :-)
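
The summary line quoted above is easy to extract mechanically from a console log. A small sketch (the regex matches that exact JUnit/Surefire wording; other plugins, including scalatest-maven-plugin, may phrase their summaries differently):

```python
# Sketch: pull the counts out of a JUnit console summary line like the
# one quoted from the HBase build above.
import re

SUMMARY = re.compile(
    r"Tests run: (\d+), Failures: (\d+), Errors: (\d+), Skipped: (\d+)")

def parse_summary(line):
    """Return the counts as a dict, or None if the line doesn't match."""
    m = SUMMARY.search(line)
    if m is None:
        return None
    run, failures, errors, skipped = map(int, m.groups())
    return {"run": run, "failures": failures,
            "errors": errors, "skipped": skipped}

stats = parse_summary("Tests run: 2199, Failures: 0, Errors: 0, Skipped: 25")
print(stats)
```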


On Fri, Jan 9, 2015 at 3:52 AM, Sean Owen <so...@cloudera.com> wrote:

> On Fri, Jan 9, 2015 at 9:15 AM, Tony Reix <to...@bull.net> wrote:
> > Hi Ted
> >
> > Thanks for the info.
> > However, I'm still unable to understand how the page:
> >
> https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/
> > has been built.
> > This page contains details I do not find in the page you indicated to me:
> >
> https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull
> >
> > As an example, I'm still unable to find these per-package details from
> > the test report (duration, then failed / skipped / passed / total):
> >
> >   org.apache.spark                 12 min   0    1   247   248
> >   org.apache.spark.api.python      20 ms    0    0     2     2
> >   org.apache.spark.bagel           7.7 s    0    0     4     4
> >   org.apache.spark.broadcast       43 s     0    0    17    17
> >   org.apache.spark.deploy          16 s     0    0    29    29
> >   org.apache.spark.deploy.worker   0.55 s   0    0    12    12
> >
> >   ........
> >
> >
> > Moreover, in my Ubuntu/x86_64 environment, I do not see 3745 tests and
> > 0 failures, but 3485 tests and 4 failures (when using Oracle JVM 1.7).
> > When using the IBM JVM, there are only 2566 tests and 5 failures (in the
> > same component: Streaming).
> >
> > On my PPC64BE (BE = Big-Endian) environment, the tests hang after about
> > two hundred tests.
> > Is Spark independent of little-/big-endian issues?
> >
> > On my PPC64LE (LE = Little-Endian) environment, I have only 3485 tests
> > (like on Ubuntu/x86_64 with the IBM JVM), with 6 or 285 failures...
> >
> > So, I need to learn more about how your Jenkins environment extracts
> > details about the results.
> > Moreover, which JVM is used?
> >
> > Do you plan to use the IBM JVM, in order to check that Spark and the
> > IBM JVM are compatible? (They already do not appear to be 100%
> > compatible...)
> >
> > Thanks
> >
> > Tony
> >
> > IBM Coop Architect & Technical Leader
> > Office : +33 (0) 4 76 29 72 67
> > 1 rue de Provence - 38432 Échirolles - France
> > www.atos.net<http://www.atos.net/>
> > ________________________________
> > From: Ted Yu [yuzhihong@gmail.com]
> > Sent: Thursday, 8 January 2015, 17:43
> > To: Tony Reix
> > Cc: dev@spark.apache.org
> > Subject: Re: Results of tests
> >
> > Here it is:
> >
> > [centos] $
> /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn
> -DHADOOP_PROFILE=hadoop-2.4 -Dlabel=centos -DskipTests -Phadoop-2.4 -Pyarn
> -Phive clean package
> >
> >
> > You can find the above in
> https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull
> >
> >
> > Cheers
> >
> > On Thu, Jan 8, 2015 at 8:05 AM, Tony Reix <tony.reix@bull.net> wrote:
> > Thanks !
> >
> > I've been able to see that there are 3745 tests for version 1.2.0 with
> > the hadoop-2.4 profile.
> > However, on my side, the maximum number of tests I've seen is 3485...
> > About 300 tests are missing on my side.
> > Which Maven option has been used for producing the report file used for
> building the page:
> >
> https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/
> >   ? (I'm not authorized to look at the "configuration" part)
> >
> > Thx !
> >
> > Tony
> >
> > ________________________________
> > From: Ted Yu [yuzhihong@gmail.com]
> > Sent: Thursday, 8 January 2015 16:11
> > To: Tony Reix
> > Cc: dev@spark.apache.org
> > Subject: Re: Results of tests
> >
> > Please take a look at https://amplab.cs.berkeley.edu/jenkins/view/Spark/
> >
> > On Thu, Jan 8, 2015 at 5:40 AM, Tony Reix <tony.reix@bull.net<mailto:
> tony.reix@bull.net>> wrote:
> > Hi,
> > I'm checking that Spark works fine on a new environment (PPC64 hardware).
> > I've found some issues, with versions 1.1.0, 1.1.1, and 1.2.0, even when
> running on Ubuntu on x86_64 with Oracle JVM. I'd like to know where I can
> find the results of the tests of Spark, for each version and for the
> different versions, in order to have a reference to compare my results
> with. I cannot find them on Spark web-site.
> > Thx
> > Tony
> >
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: Results of tests

Posted by Sean Owen <so...@cloudera.com>.
Hey Tony, the number of tests run could vary depending on how the
build is configured. For example, YARN-related tests would only run
when the yarn profile is turned on. Java 8 tests would only run under
Java 8.

Although I don't know that there's any reason to believe the IBM JVM
has a problem with Spark, I see this issue that is potentially related
to endian-ness: https://issues.apache.org/jira/browse/SPARK-2018. I
don't know if that was a Spark issue. Certainly, it would be good for
you to investigate if you are interested in resolving it.
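
To make the endian-ness concern concrete: any code that reinterprets raw
bytes as wider integers (serializers, hash functions, and similar
low-level paths) yields different values on big-endian and little-endian
hosts unless it pins an explicit byte order. A minimal, Spark-independent
sketch in Python (purely illustrative, not taken from Spark's code):

```python
import struct
import sys

raw = b"\x00\x00\x00\x01"

# Explicit byte orders decode the same bytes the same way on every host.
big = struct.unpack(">i", raw)[0]     # big-endian view: 1
little = struct.unpack("<i", raw)[0]  # little-endian view: 16777216

# Native order ("=") follows the host CPU, which is exactly where a
# PPC64BE run and an x86_64 run of the same test can diverge.
native = struct.unpack("=i", raw)[0]

print("host byte order:", sys.byteorder)
print("big:", big, "little:", little, "native:", native)
```

A test that round-trips data with explicit byte orders should behave the
same on PPC64BE and x86_64, while one that mixes native-order writes with
fixed-order reads may only pass on one of them.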

The Jenkins output shows you exactly what tests were run and how --
have a look at the logs.

https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull
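
As for how the testReport page gets its numbers: Jenkins aggregates the
JUnit-style XML files that the build writes (for Maven builds these
typically land under each module's target/surefire-reports; the exact
location and attribute names depend on the surefire/scalatest plugin
configuration, so treat the path below as an assumption). A rough local
equivalent, to compare your own totals against Jenkins', could look like:

```python
import glob
import xml.etree.ElementTree as ET

def tally(report_glob):
    """Sum tests/failures/errors/skipped across JUnit-style XML reports."""
    totals = {"tests": 0, "failures": 0, "errors": 0, "skipped": 0}
    for path in glob.glob(report_glob, recursive=True):
        root = ET.parse(path).getroot()
        # A report file holds either a single <testsuite> element
        # or a <testsuites> wrapper around several of them.
        suites = [root] if root.tag == "testsuite" else root.iter("testsuite")
        for suite in suites:
            for key in totals:
                totals[key] += int(suite.get(key, 0))
    return totals

if __name__ == "__main__":
    print(tally("**/target/surefire-reports/TEST-*.xml"))
```

Running it from the root of a tree where the tests have actually been
executed should print totals comparable to the figures on the testReport
page, which makes it easier to see which modules' tests never ran locally.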

On Fri, Jan 9, 2015 at 9:15 AM, Tony Reix <to...@bull.net> wrote:
> Hi Ted
>
> Thanks for the info.
> However, I'm still unable to understand how the page:
>    https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/
> has been built.
> This page contains details I do not find in the page you indicated to me:
>    https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull
>
> As an example, I'm still unable to find these details:
> Package                          Duration  Fail  Skip  Pass  Total
> org.apache.spark                 12 min    0     1     247   248
> org.apache.spark.api.python      20 ms     0     0     2     2
> org.apache.spark.bagel           7.7 s     0     0     4     4
> org.apache.spark.broadcast       43 s      0     0     17    17
> org.apache.spark.deploy          16 s      0     0     29    29
> org.apache.spark.deploy.worker   0.55 s    0     0     12    12
>
> ........
>
>
> Moreover, in my Ubuntu/x86_64 environment, I do not find 3745 tests and 0 failures, but 3485 tests and 4 failures (when using Oracle JVM 1.7). When using the IBM JVM, there are only 2566 tests and 5 failures (all in the same component: Streaming).
>
> On my PPC64BE (BE = Big-Endian) environment, the tests block after about two hundred tests.
> Is Spark independent of little/big-endian issues?
>
> On my PPC64LE (LE = Little-Endian) environment, I have 3485 tests only (like on Ubuntu/x86_64 with IBM JVM), with 6 or 285 failures...
>
> So, I need to learn more about how your Jenkins environment extracts details about the results.
> Moreover, which JVM is used ?
>
> Do you plan to use the IBM JVM in order to check that Spark and the IBM JVM are compatible? (They already do not appear to be 100% compatible...)
>
> Thanks
>
> Tony
>
> IBM Coop Architect & Technical Leader
> Office : +33 (0) 4 76 29 72 67
> 1 rue de Provence - 38432 Échirolles - France
> www.atos.net<http://www.atos.net/>
> ________________________________
> From: Ted Yu [yuzhihong@gmail.com]
> Sent: Thursday, 8 January 2015 17:43
> To: Tony Reix
> Cc: dev@spark.apache.org
> Subject: Re: Results of tests
>
> Here it is:
>
> [centos] $ /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn -DHADOOP_PROFILE=hadoop-2.4 -Dlabel=centos -DskipTests -Phadoop-2.4 -Pyarn -Phive clean package
>
>
> You can find the above in https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull
>
>
> Cheers
>
> On Thu, Jan 8, 2015 at 8:05 AM, Tony Reix <to...@bull.net>> wrote:
> Thanks !
>
> I've been able to see that there are 3745 tests for version 1.2.0 with the Hadoop 2.4 profile.
> However, on my side, the maximum number of tests I've seen is 3485... About 300 tests are missing on my side.
> Which Maven option has been used for producing the report file used for building the page:
>      https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/
>   ? (I'm not authorized to look at the "configuration" part)
>
> Thx !
>
> Tony
>
> ________________________________
> From: Ted Yu [yuzhihong@gmail.com]
> Sent: Thursday, 8 January 2015 16:11
> To: Tony Reix
> Cc: dev@spark.apache.org
> Subject: Re: Results of tests
>
> Please take a look at https://amplab.cs.berkeley.edu/jenkins/view/Spark/
>
> On Thu, Jan 8, 2015 at 5:40 AM, Tony Reix <to...@bull.net>> wrote:
> Hi,
> I'm checking that Spark works fine on a new environment (PPC64 hardware).
> I've found some issues, with versions 1.1.0, 1.1.1, and 1.2.0, even when running on Ubuntu on x86_64 with Oracle JVM. I'd like to know where I can find the results of the tests of Spark, for each version and for the different versions, in order to have a reference to compare my results with. I cannot find them on Spark web-site.
> Thx
> Tony
>
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


RE: Results of tests

Posted by Tony Reix <to...@bull.net>.
Hi Ted

Thanks for the info.
However, I'm still unable to understand how the page:
   https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/
has been built.
This page contains details I do not find in the page you indicated to me:
   https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull

As an example, I'm still unable to find these details:
Package                          Duration  Fail  Skip  Pass  Total
org.apache.spark                 12 min    0     1     247   248
org.apache.spark.api.python      20 ms     0     0     2     2
org.apache.spark.bagel           7.7 s     0     0     4     4
org.apache.spark.broadcast       43 s      0     0     17    17
org.apache.spark.deploy          16 s      0     0     29    29
org.apache.spark.deploy.worker   0.55 s    0     0     12    12

........


Moreover, in my Ubuntu/x86_64 environment, I do not find 3745 tests and 0 failures, but 3485 tests and 4 failures (when using Oracle JVM 1.7). When using the IBM JVM, there are only 2566 tests and 5 failures (all in the same component: Streaming).

On my PPC64BE (BE = Big-Endian) environment, the tests block after about two hundred tests.
Is Spark independent of little/big-endian issues?

On my PPC64LE (LE = Little-Endian) environment, I have 3485 tests only (like on Ubuntu/x86_64 with IBM JVM), with 6 or 285 failures...

So, I need to learn more about how your Jenkins environment extracts details about the results.
Moreover, which JVM is used ?

Do you plan to use the IBM JVM in order to check that Spark and the IBM JVM are compatible? (They already do not appear to be 100% compatible...)

Thanks

Tony

IBM Coop Architect & Technical Leader
Office : +33 (0) 4 76 29 72 67
1 rue de Provence - 38432 Échirolles - France
www.atos.net<http://www.atos.net/>
________________________________
From: Ted Yu [yuzhihong@gmail.com]
Sent: Thursday, 8 January 2015 17:43
To: Tony Reix
Cc: dev@spark.apache.org
Subject: Re: Results of tests

Here it is:

[centos] $ /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn -DHADOOP_PROFILE=hadoop-2.4 -Dlabel=centos -DskipTests -Phadoop-2.4 -Pyarn -Phive clean package


You can find the above in https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull


Cheers

On Thu, Jan 8, 2015 at 8:05 AM, Tony Reix <to...@bull.net>> wrote:
Thanks !

I've been able to see that there are 3745 tests for version 1.2.0 with the Hadoop 2.4 profile.
However, on my side, the maximum number of tests I've seen is 3485... About 300 tests are missing on my side.
Which Maven option has been used for producing the report file used for building the page:
     https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/
  ? (I'm not authorized to look at the "configuration" part)

Thx !

Tony

________________________________
From: Ted Yu [yuzhihong@gmail.com]
Sent: Thursday, 8 January 2015 16:11
To: Tony Reix
Cc: dev@spark.apache.org
Subject: Re: Results of tests

Please take a look at https://amplab.cs.berkeley.edu/jenkins/view/Spark/

On Thu, Jan 8, 2015 at 5:40 AM, Tony Reix <to...@bull.net>> wrote:
Hi,
I'm checking that Spark works fine on a new environment (PPC64 hardware).
I've found some issues, with versions 1.1.0, 1.1.1, and 1.2.0, even when running on Ubuntu on x86_64 with Oracle JVM. I'd like to know where I can find the results of the tests of Spark, for each version and for the different versions, in order to have a reference to compare my results with. I cannot find them on Spark web-site.
Thx
Tony




Re: Results of tests

Posted by Ted Yu <yu...@gmail.com>.
Here it is:

[centos] $ /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn
-DHADOOP_PROFILE=hadoop-2.4 -Dlabel=centos -DskipTests -Phadoop-2.4
-Pyarn -Phive clean package


You can find the above in
https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/consoleFull


Cheers


On Thu, Jan 8, 2015 at 8:05 AM, Tony Reix <to...@bull.net> wrote:

>  Thanks !
>
> I've been able to see that there are 3745 tests for version 1.2.0 with
> the Hadoop 2.4 profile.
> However, on my side, the maximum number of tests I've seen is 3485...
> About 300 tests are missing on my side.
> Which Maven option has been used for producing the report file used for
> building the page:
>
> https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/
>   ? (I'm not authorized to look at the "configuration" part)
>
> Thx !
>
> Tony
>
>  ------------------------------
> *From:* Ted Yu [yuzhihong@gmail.com]
> *Sent:* Thursday, 8 January 2015 16:11
> *To:* Tony Reix
> *Cc:* dev@spark.apache.org
> *Subject:* Re: Results of tests
>
>   Please take a look at https://amplab.cs.berkeley.edu/jenkins/view/Spark/
>
> On Thu, Jan 8, 2015 at 5:40 AM, Tony Reix <to...@bull.net> wrote:
>
>> Hi,
>> I'm checking that Spark works fine on a new environment (PPC64 hardware).
>> I've found some issues, with versions 1.1.0, 1.1.1, and 1.2.0, even when
>> running on Ubuntu on x86_64 with Oracle JVM. I'd like to know where I can
>> find the results of the tests of Spark, for each version and for the
>> different versions, in order to have a reference to compare my results
>> with. I cannot find them on Spark web-site.
>> Thx
>> Tony
>>
>>
>

RE: Results of tests

Posted by Tony Reix <to...@bull.net>.
Thanks !

I've been able to see that there are 3745 tests for version 1.2.0 with the Hadoop 2.4 profile.
However, on my side, the maximum number of tests I've seen is 3485... About 300 tests are missing on my side.
Which Maven option has been used for producing the report file used for building the page:
     https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-1.2-Maven-with-YARN/lastSuccessfulBuild/HADOOP_PROFILE=hadoop-2.4,label=centos/testReport/
  ? (I'm not authorized to look at the "configuration" part)

Thx !

Tony

________________________________
From: Ted Yu [yuzhihong@gmail.com]
Sent: Thursday, 8 January 2015 16:11
To: Tony Reix
Cc: dev@spark.apache.org
Subject: Re: Results of tests

Please take a look at https://amplab.cs.berkeley.edu/jenkins/view/Spark/

On Thu, Jan 8, 2015 at 5:40 AM, Tony Reix <to...@bull.net>> wrote:
Hi,
I'm checking that Spark works fine on a new environment (PPC64 hardware).
I've found some issues, with versions 1.1.0, 1.1.1, and 1.2.0, even when running on Ubuntu on x86_64 with Oracle JVM. I'd like to know where I can find the results of the tests of Spark, for each version and for the different versions, in order to have a reference to compare my results with. I cannot find them on Spark web-site.
Thx
Tony



Re: Results of tests

Posted by Ted Yu <yu...@gmail.com>.
Please take a look at https://amplab.cs.berkeley.edu/jenkins/view/Spark/

On Thu, Jan 8, 2015 at 5:40 AM, Tony Reix <to...@bull.net> wrote:

> Hi,
> I'm checking that Spark works fine on a new environment (PPC64 hardware).
> I've found some issues, with versions 1.1.0, 1.1.1, and 1.2.0, even when
> running on Ubuntu on x86_64 with Oracle JVM. I'd like to know where I can
> find the results of the tests of Spark, for each version and for the
> different versions, in order to have a reference to compare my results
> with. I cannot find them on Spark web-site.
> Thx
> Tony
>
>