Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:01:08 UTC
[jira] [Updated] (SPARK-19051) test_hivecontext (pyspark.sql.tests.HiveSparkSubmitTests) fails in python/run-tests
[ https://issues.apache.org/jira/browse/SPARK-19051?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-19051:
---------------------------------
Labels: bulk-closed (was: )
> test_hivecontext (pyspark.sql.tests.HiveSparkSubmitTests) fails in python/run-tests
> -----------------------------------------------------------------------------------
>
> Key: SPARK-19051
> URL: https://issues.apache.org/jira/browse/SPARK-19051
> Project: Spark
> Issue Type: Bug
> Components: PySpark, SQL, Tests
> Affects Versions: 2.0.1
> Environment: Ubuntu 14.04, ppc64le, x86_64
> Reporter: Nirman Narang
> Priority: Minor
> Labels: bulk-closed
>
> Full log below.
> {code:title=python/run-tests|borderStyle=solid}
> Ivy Default Cache set to: /var/lib/jenkins/.ivy2/cache
> The jars for the packages stored in: /var/lib/jenkins/.ivy2/jars
> file:/tmp/tmp7Ie4FN added as a remote repository with the name: repo-1
> :: loading settings :: url = jar:file:/var/lib/jenkins/workspace/Sparkv2.0.1/spark/assembly/target/scala-2.11/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
> a#mylib added as a dependency
> :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
> confs: [default]
> found a#mylib;0.1 in repo-1
> :: resolution report :: resolve 428ms :: artifacts dl 5ms
> :: modules in use:
> a#mylib;0.1 from repo-1 in [default]
> ---------------------------------------------------------------------
> |                  |            modules            ||   artifacts   |
> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
> ---------------------------------------------------------------------
> |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
> ---------------------------------------------------------------------
> :: retrieving :: org.apache.spark#spark-submit-parent
> confs: [default]
> 0 artifacts copied, 1 already retrieved (0kB/16ms)
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Ivy Default Cache set to: /var/lib/jenkins/.ivy2/cache
> The jars for the packages stored in: /var/lib/jenkins/.ivy2/jars
> file:/tmp/tmpo5SXug added as a remote repository with the name: repo-1
> :: loading settings :: url = jar:file:/var/lib/jenkins/workspace/Sparkv2.0.1/spark/assembly/target/scala-2.11/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
> a#mylib added as a dependency
> :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
> confs: [default]
> found a#mylib;0.1 in repo-1
> :: resolution report :: resolve 320ms :: artifacts dl 4ms
> :: modules in use:
> a#mylib;0.1 from repo-1 in [default]
> ---------------------------------------------------------------------
> |                  |            modules            ||   artifacts   |
> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
> ---------------------------------------------------------------------
> |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
> ---------------------------------------------------------------------
> :: retrieving :: org.apache.spark#spark-submit-parent
> confs: [default]
> 0 artifacts copied, 1 already retrieved (0kB/7ms)
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> ...
> [Stage 23:======================================> (143 + 5) / 200]
> [Stage 23:==================================================> (188 + 5) / 200]
>
> .....................
> [Stage 88:================================> (122 + 4) / 200]
> [Stage 88:=====================================================>(198 + 2) / 200]
>
> [Stage 90:> (0 + 4) / 200]
> [Stage 90:===> (14 + 4) / 200]
> [Stage 90:=======> (28 + 4) / 200]
> [Stage 90:===========> (42 + 4) / 200]
> [Stage 90:===============> (57 + 4) / 200]
> [Stage 90:====================> (73 + 4) / 200]
> [Stage 90:=======================> (86 + 4) / 200]
> [Stage 90:===========================> (101 + 4) / 200]
> [Stage 90:==============================> (113 + 4) / 200]
> [Stage 90:==================================> (127 + 4) / 200]
> [Stage 90:=====================================> (140 + 4) / 200]
> [Stage 90:=========================================> (153 + 4) / 200]
> [Stage 90:============================================> (166 + 4) / 200]
> [Stage 90:=================================================> (182 + 4) / 200]
> [Stage 90:====================================================> (196 + 4) / 200]
>
> .......
> [Stage 104:> (0 + 4) / 4]
>
> [Stage 107:======================> (83 + 4) / 199]
> [Stage 107:====================================> (138 + 4) / 199]
>
> ..
> [Stage 110:> (0 + 0) / 4]
> [Stage 110:> (0 + 4) / 4]SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
>
> ................/var/lib/jenkins/workspace/Sparkv2.0.1/spark/python/pyspark/sql/session.py:316: UserWarning: inferring schema from dict is deprecated,please use pyspark.sql.Row instead
> warnings.warn("inferring schema from dict is deprecated,"
> /var/lib/jenkins/workspace/Sparkv2.0.1/spark/python/pyspark/sql/session.py:336: UserWarning: Using RDD of dict to inferSchema is deprecated. Use pyspark.sql.Row instead
> warnings.warn("Using RDD of dict to inferSchema is deprecated. "
> .................................
> [Stage 411:=======> (28 + 4) / 200]
> [Stage 411:==========> (40 + 4) / 200]
> [Stage 411:==============> (53 + 4) / 200]
> [Stage 411:=================> (65 + 4) / 200]
> [Stage 411:=====================> (79 + 6) / 200]
> [Stage 411:=========================> (94 + 4) / 200]
> [Stage 411:=============================> (110 + 4) / 200]
> [Stage 411:=================================> (126 + 4) / 200]
> [Stage 411:=====================================> (141 + 4) / 200]
> [Stage 411:========================================> (154 + 4) / 200]
> [Stage 411:============================================> (168 + 4) / 200]
> [Stage 411:================================================> (182 + 5) / 200]
> [Stage 411:====================================================>(198 + 2) / 200]
>
> ..
> [Stage 417:=========> (37 + 4) / 200]
> [Stage 417:==============> (53 + 4) / 200]
> [Stage 417:==================> (70 + 4) / 200]
> [Stage 417:=======================> (86 + 4) / 200]
> [Stage 417:==========================> (100 + 5) / 200]
> [Stage 417:===============================> (117 + 4) / 200]
> [Stage 417:===================================> (134 + 4) / 200]
> [Stage 417:=======================================> (150 + 4) / 200]
> [Stage 417:===========================================> (164 + 5) / 200]
> [Stage 417:================================================> (182 + 4) / 200]
> [Stage 417:====================================================>(199 + 1) / 200]
>
> ...........Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Ivy Default Cache set to: /var/lib/jenkins/.ivy2/cache
> The jars for the packages stored in: /var/lib/jenkins/.ivy2/jars
> file:/tmp/tmpr5i56T added as a remote repository with the name: repo-1
> :: loading settings :: url = jar:file:/var/lib/jenkins/workspace/Sparkv2.0.1/spark/assembly/target/scala-2.11/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
> a#mylib added as a dependency
> :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
> confs: [default]
> found a#mylib;0.1 in repo-1
> :: resolution report :: resolve 340ms :: artifacts dl 5ms
> :: modules in use:
> a#mylib;0.1 from repo-1 in [default]
> ---------------------------------------------------------------------
> |                  |            modules            ||   artifacts   |
> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
> ---------------------------------------------------------------------
> |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
> ---------------------------------------------------------------------
> :: retrieving :: org.apache.spark#spark-submit-parent
> confs: [default]
> 0 artifacts copied, 1 already retrieved (0kB/18ms)
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Ivy Default Cache set to: /var/lib/jenkins/.ivy2/cache
> The jars for the packages stored in: /var/lib/jenkins/.ivy2/jars
> file:/tmp/tmpgtMKHj added as a remote repository with the name: repo-1
> :: loading settings :: url = jar:file:/var/lib/jenkins/workspace/Sparkv2.0.1/spark/assembly/target/scala-2.11/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
> a#mylib added as a dependency
> :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
> confs: [default]
> found a#mylib;0.1 in repo-1
> :: resolution report :: resolve 256ms :: artifacts dl 3ms
> :: modules in use:
> a#mylib;0.1 from repo-1 in [default]
> ---------------------------------------------------------------------
> |                  |            modules            ||   artifacts   |
> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
> ---------------------------------------------------------------------
> |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
> ---------------------------------------------------------------------
> :: retrieving :: org.apache.spark#spark-submit-parent
> confs: [default]
> 0 artifacts copied, 1 already retrieved (0kB/8ms)
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> .Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> Picked up _JAVA_OPTIONS: -Xms1024m -Xmx4096m
> .
> ======================================================================
> FAIL: test_hivecontext (pyspark.sql.tests.HiveSparkSubmitTests)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File "/var/lib/jenkins/workspace/Sparkv2.0.1/spark/python/pyspark/sql/tests.py", line 1736, in test_hivecontext
> self.assertEqual(0, proc.returncode)
> AssertionError: 0 != 1
> ----------------------------------------------------------------------
> Ran 113 tests in 251.010s
> FAILED (failures=1, skipped=1)
> +---+----------+
> |key| val|
> +---+----------+
> | 0|[0.0, 0.0]|
> | 1|[1.0, 1.0]|
> | 2|[2.0, 2.0]|
> | 0|[3.0, 3.0]|
> | 1|[4.0, 4.0]|
> | 2|[5.0, 5.0]|
> | 0|[6.0, 6.0]|
> | 1|[7.0, 7.0]|
> | 2|[8.0, 8.0]|
> | 0|[9.0, 9.0]|
> +---+----------+
> == Parsed Logical Plan ==
> GlobalLimit 1
> +- LocalLimit 1
> +- Project [id#3499L, <lambda>(id#3499L) AS copy#3503]
> +- Sort [id#3499L ASC], true
> +- Range (0, 10, step=1, splits=Some(4))
> == Analyzed Logical Plan ==
> id: bigint, copy: int
> GlobalLimit 1
> +- LocalLimit 1
> +- Project [id#3499L, <lambda>(id#3499L) AS copy#3503]
> +- Sort [id#3499L ASC], true
> +- Range (0, 10, step=1, splits=Some(4))
> == Optimized Logical Plan ==
> GlobalLimit 1
> +- LocalLimit 1
> +- Project [id#3499L, <lambda>(id#3499L) AS copy#3503]
> +- Sort [id#3499L ASC], true
> +- Range (0, 10, step=1, splits=Some(4))
> == Physical Plan ==
> TakeOrderedAndProject(limit=1, orderBy=[id#3499L ASC], output=[id#3499L,copy#3503])
> +- BatchEvalPython [<lambda>(id#3499L)], [id#3499L, pythonUDF0#3508]
> +- *Range (0, 10, step=1, splits=Some(4))
> Had test failures in pyspark.sql.tests with python; see logs
> {code}
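> For anyone digging into this: per the traceback, test_hivecontext shells out to spark-submit and asserts a zero exit code. Below is a minimal, hypothetical sketch of that pattern; the script body, temp-file handling, and SPARK_HOME lookup are illustrative assumptions, not the exact source of tests.py.
> {code:title=sketch of the failing pattern (illustrative)|borderStyle=solid}
> import os
> import subprocess
> import tempfile
>
> # Hypothetical, simplified version of what
> # pyspark.sql.tests.HiveSparkSubmitTests.test_hivecontext does:
> # write a small script that exercises HiveContext, run it through
> # spark-submit, and require a clean exit.
> script = """
> from pyspark import SparkContext
> from pyspark.sql import HiveContext
>
> sc = SparkContext()
> hive = HiveContext(sc)
> hive.sql("CREATE TABLE IF NOT EXISTS mytab (c1 INT)")
> print(hive.sql("SHOW TABLES").collect())
> """
>
> fd, path = tempfile.mkstemp(suffix=".py")
> with os.fdopen(fd, "w") as f:
>     f.write(script)
>
> # Assumes SPARK_HOME is set, as it is under python/run-tests.
> spark_submit = os.path.join(os.environ["SPARK_HOME"], "bin", "spark-submit")
> proc = subprocess.Popen([spark_submit, path],
>                         stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
> out, _ = proc.communicate()
>
> # This mirrors the assertion in the traceback above:
> # AssertionError: 0 != 1 means spark-submit exited with status 1.
> assert proc.returncode == 0, out
> {code}
> A returncode of 1 only says the child spark-submit failed; the root cause has to be recovered from the child's own output (captured in {{out}} above), since the parent log merely interleaves it with the Ivy and SLF4J noise.
> Separately, the UserWarning lines about inferring a schema from dicts suggest their own cleanup. A small sketch of the replacement the warning asks for (names assumed, shaped like the key/val frame printed above):
> {code:title=Row instead of dict (illustrative)|borderStyle=solid}
> from pyspark.sql import Row, SparkSession
>
> spark = SparkSession.builder.getOrCreate()
>
> # Deprecated (the UserWarning above): schema inference from plain dicts.
> # spark.createDataFrame([{"key": 0, "val": [0.0, 0.0]}])
>
> # Preferred: explicit Rows, so no inference warning is raised.
> df = spark.createDataFrame([Row(key=0, val=[0.0, 0.0]),
>                             Row(key=1, val=[1.0, 1.0])])
> df.show()
> {code}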
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org