Posted to issues@spark.apache.org by "zhao bo (Jira)" <ji...@apache.org> on 2019/11/21 08:40:00 UTC

[jira] [Comment Edited] (SPARK-29106) Add jenkins arm test for spark

    [ https://issues.apache.org/jira/browse/SPARK-29106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16979050#comment-16979050 ] 

zhao bo edited comment on SPARK-29106 at 11/21/19 8:39 AM:
-----------------------------------------------------------

Hi [~shaneknapp],

I had success building apache/arrow on the new (high-performance) ARM test worker, and the SparkR and pyspark (python3.6) tests pass, including the arrow and pandas tests. For the SparkR CRAN check, though, there is one issue we cannot avoid: the check tries to reach the wiki and fails on 443 (as this VM is in China, the connection times out). Everything else passes.

On the new ARM test worker I installed apache/arrow and the other dependencies into a specific directory outside /usr/lib, and I use a separate venv for testing. What do you think about adding the sparkR test as an SCM-triggered periodic job and ignoring the network access error? I just want to know your advice.
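
For reference, a minimal sanity check (my sketch, not the actual test script) of the Arrow-backed pandas conversion that the pyspark tests exercise; it assumes pyspark and pyarrow are importable from the venv, and uses the Arrow toggle name from Spark 2.x, which current master still honors:

    from pyspark.sql import SparkSession

    # Small local session with Arrow-based pandas conversion enabled.
    spark = (SparkSession.builder
             .master("local[2]")
             .config("spark.sql.execution.arrow.enabled", "true")
             .getOrCreate())

    # toPandas() goes through Arrow when the toggle above is set, so this
    # exercises both pyarrow and pandas on the ARM worker.
    pdf = spark.range(1000).toPandas()
    assert len(pdf) == 1000
    spark.stop()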

I will also create the script for the sparkR and pyspark tests on the new ARM test worker, so you can get a brief overview of the test configuration.

Thanks very much.

 


was (Author: bzhaoopenstack):
Hi [~shaneknapp],

I had success building apache/arrow on the new (high-performance) ARM test worker, and the SparkR and pyspark (python3.6) tests pass, including the arrow and pandas tests. For the SparkR CRAN check, though, there is one issue we cannot avoid: the check tries to reach the wiki and returns 403 (as this VM is in China). Everything else passes.

On the new ARM test worker I installed apache/arrow and the other dependencies into a specific directory outside /usr/lib, and I use a separate venv for testing. What do you think about adding the sparkR test as an SCM-triggered periodic job and ignoring the network access error? I just want to know your advice.

I will also create the script for the sparkR and pyspark tests on the new ARM test worker, so you can get a brief overview of the test configuration.

Thanks very much.

 

> Add jenkins arm test for spark
> ------------------------------
>
>                 Key: SPARK-29106
>                 URL: https://issues.apache.org/jira/browse/SPARK-29106
>             Project: Spark
>          Issue Type: Test
>          Components: Tests
>    Affects Versions: 3.0.0
>            Reporter: huangtianhua
>            Priority: Minor
>         Attachments: R-ansible.yml, R-libs.txt, arm-python36.txt
>
>
> Add arm test jobs to amplab jenkins for spark.
> So far we have set up two periodic ARM test jobs for spark in OpenLab: one is based on master with hadoop 2.7 (similar to the QA tests on amplab jenkins), and the other is based on a branch we cut on 09-09; see [http://status.openlabtesting.org/builds/job/spark-master-unit-test-hadoop-2.7-arm64] and [http://status.openlabtesting.org/builds/job/spark-unchanged-branch-unit-test-hadoop-2.7-arm64]. We only have to care about the first one when integrating the ARM tests with amplab jenkins.
> As for the k8s test on ARM, we have already tried it, see [https://github.com/theopenlab/spark/pull/17]; maybe we can integrate it later.
> We also plan to test other stable branches, and we can integrate them into amplab when they are ready.
> We have provided an ARM instance and sent the details to shane knapp; thanks shane for adding the first ARM job to amplab jenkins :)
> The other important thing is leveldbjni ([https://github.com/fusesource/leveldbjni], [https://github.com/fusesource/leveldbjni/issues/80]): spark depends on leveldbjni-all-1.8 ([https://mvnrepository.com/artifact/org.fusesource.leveldbjni/leveldbjni-all/1.8]), which has no arm64 support (a quick way to verify this is sketched below). So we built an arm64-supporting release of leveldbjni, see [https://mvnrepository.com/artifact/org.openlabtesting.leveldbjni/leveldbjni-all/1.8]. But we can't simply modify the spark pom.xml with a 'property'/'profile' to pick the correct jar per platform, because spark depends on hadoop packages such as hadoop-hdfs, which themselves depend on leveldbjni-all-1.8, unless hadoop releases with a new ARM-supporting leveldbjni jar. For now we download the openlabtesting leveldbjni-all-1.8 and 'mvn install' it when running the ARM tests for spark.
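>
> A small sketch of that verification (mine, not from the thread; it assumes the jar sits in the default local Maven repo under ~/.m2), listing the native platforms bundled in a leveldbjni-all jar:
>
>     import os
>     import zipfile
>
>     # Default local-repo path for the upstream jar; adjust if yours differs.
>     jar = os.path.expanduser(
>         "~/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/"
>         "leveldbjni-all-1.8.jar")
>
>     with zipfile.ZipFile(jar) as zf:
>         # hawtjni-packaged jars keep per-platform libs under META-INF/native/
>         for name in zf.namelist():
>             if name.startswith("META-INF/native"):
>                 print(name)  # e.g. META-INF/native/linux64/libleveldbjni.so
>
> No arm64/aarch64 entry shows up in the upstream 1.8 jar, which matches the problem described above.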
> PS: The issues found and fixed:
>  SPARK-28770: [https://github.com/apache/spark/pull/25673]
>  SPARK-28519: [https://github.com/apache/spark/pull/25279]
>  SPARK-28433: [https://github.com/apache/spark/pull/25186]
>  SPARK-28467: [https://github.com/apache/spark/pull/25864]
>  SPARK-29286: [https://github.com/apache/spark/pull/26021]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org