Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/10/30 01:23:56 UTC

[GitHub] [spark] ahshahid commented on pull request #30185: [SPARK-33152][SQL] This PR proposes a new logic to maintain & track constraints which solves the OOM or performance issues in query compilation

ahshahid commented on pull request #30185:
URL: https://github.com/apache/spark/pull/30185#issuecomment-719113915


   The Hive suite which got aborted is
   org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.
   
   When run through the IDE, the exception below is seen.
   Looking into it.
   org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
   Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.hive.metastore.version=1.2.1' '--conf' 'spark.sql.hive.metastore.jars=maven' '--conf' 'spark.sql.warehouse.dir=/private/var/folders/vl/gk1hr_957qs1jmbwhmvkq2d80000gp/T/warehouse-33cd9b12-1a96-40cd-afad-4b57880723a6' '--conf' 'spark.sql.test.version.index=0' '--driver-java-options' '-Dderby.system.home=/private/var/folders/vl/gk1hr_957qs1jmbwhmvkq2d80000gp/T/warehouse-33cd9b12-1a96-40cd-afad-4b57880723a6' '/private/var/folders/vl/gk1hr_957qs1jmbwhmvkq2d80000gp/T/test8221341981962562358.py'
   
   2020-10-29 18:20:28.01 - stderr> 20/10/29 18:20:28 WARN Utils: Your hostname, ALTERVICTIMFEWER.workdayinternal.com resolves to a loopback address: 127.0.0.1; using 192.168.0.10 instead (on interface en0)
   2020-10-29 18:20:28.011 - stderr> 20/10/29 18:20:28 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
   2020-10-29 18:20:29.793 - stderr> 20/10/29 18:20:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   2020-10-29 18:20:31.692 - stdout> Traceback (most recent call last):
   2020-10-29 18:20:31.693 - stdout>   File "/private/var/folders/vl/gk1hr_957qs1jmbwhmvkq2d80000gp/T/test8221341981962562358.py", line 2, in <module>
   2020-10-29 18:20:31.693 - stdout>     from pyspark.sql import SparkSession
   2020-10-29 18:20:31.693 - stdout>   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
   2020-10-29 18:20:31.694 - stdout>   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
   2020-10-29 18:20:31.694 - stdout>   File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
   2020-10-29 18:20:31.695 - stdout>   File "<frozen importlib._bootstrap>", line 618, in _load_backward_compatible
   2020-10-29 18:20:31.695 - stdout>   File "<frozen zipimport>", line 259, in load_module
   2020-10-29 18:20:31.696 - stdout>   File "/private/tmp/test-spark/spark-2.4.7/python/lib/pyspark.zip/pyspark/__init__.py", line 51, in <module>
   2020-10-29 18:20:31.697 - stdout>   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
   2020-10-29 18:20:31.697 - stdout>   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
   2020-10-29 18:20:31.698 - stdout>   File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
   2020-10-29 18:20:31.699 - stdout>   File "<frozen importlib._bootstrap>", line 618, in _load_backward_compatible
   2020-10-29 18:20:31.699 - stdout>   File "<frozen zipimport>", line 259, in load_module
   2020-10-29 18:20:31.7 - stdout>   File "/private/tmp/test-spark/spark-2.4.7/python/lib/pyspark.zip/pyspark/context.py", line 31, in <module>
   2020-10-29 18:20:31.7 - stdout>   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
   2020-10-29 18:20:31.701 - stdout>   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
   2020-10-29 18:20:31.701 - stdout>   File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
   2020-10-29 18:20:31.702 - stdout>   File "<frozen importlib._bootstrap>", line 618, in _load_backward_compatible
   2020-10-29 18:20:31.702 - stdout>   File "<frozen zipimport>", line 259, in load_module
   2020-10-29 18:20:31.703 - stdout>   File "/private/tmp/test-spark/spark-2.4.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 97, in <module>
   2020-10-29 18:20:31.704 - stdout>   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
   2020-10-29 18:20:31.704 - stdout>   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
   2020-10-29 18:20:31.705 - stdout>   File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
   2020-10-29 18:20:31.705 - stdout>   File "<frozen importlib._bootstrap>", line 618, in _load_backward_compatible
   2020-10-29 18:20:31.706 - stdout>   File "<frozen zipimport>", line 259, in load_module
   2020-10-29 18:20:31.707 - stdout>   File "/private/tmp/test-spark/spark-2.4.7/python/lib/pyspark.zip/pyspark/serializers.py", line 72, in <module>
   2020-10-29 18:20:31.708 - stdout>   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
   2020-10-29 18:20:31.709 - stdout>   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
   2020-10-29 18:20:31.709 - stdout>   File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
   2020-10-29 18:20:31.71 - stdout>   File "<frozen importlib._bootstrap>", line 618, in _load_backward_compatible
   2020-10-29 18:20:31.71 - stdout>   File "<frozen zipimport>", line 259, in load_module
   2020-10-29 18:20:31.711 - stdout>   File "/private/tmp/test-spark/spark-2.4.7/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 145, in <module>
   2020-10-29 18:20:31.712 - stdout>   File "/private/tmp/test-spark/spark-2.4.7/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 126, in _make_cell_set_template_code
   2020-10-29 18:20:31.712 - stdout> TypeError: an integer is required (got type bytes)
   2020-10-29 18:20:31.737 - stderr> log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
   2020-10-29 18:20:31.738 - stderr> log4j:WARN Please initialize the log4j system properly.
   2020-10-29 18:20:31.
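   
   A likely cause (not confirmed from this run alone): this looks like the known
   incompatibility between the cloudpickle bundled in Spark 2.4.x and Python 3.8+.
   HiveExternalCatalogVersionsSuite downloads old releases (spark-2.4.7 here) and
   drives them through spark-submit with the local Python interpreter. Python 3.8
   added a posonlyargcount parameter to types.CodeType, so the old 15-argument
   positional call in pyspark/cloudpickle.py shifts every later argument by one
   slot and the bytecode (bytes) lands in a slot that expects an int. A minimal
   sketch illustrating the shift (illustrative only, not code from this PR):
   
   import types
   
   def rebuild_code_pre_py38(co):
       # Argument order used by cloudpickle before Python 3.8 existed.
       # On 3.8+ the second slot is posonlyargcount, so everything after it
       # shifts by one position.
       return types.CodeType(
           co.co_argcount,
           co.co_kwonlyargcount,   # lands in posonlyargcount on 3.8+
           co.co_nlocals,
           co.co_stacksize,
           co.co_flags,
           co.co_code,             # bytes where an int is expected -> TypeError
           co.co_consts,
           co.co_names,
           co.co_varnames,
           co.co_filename,
           co.co_name,
           co.co_firstlineno,
           co.co_lnotab,
           co.co_freevars,
           co.co_cellvars,
       )
   
   if __name__ == "__main__":
       try:
           rebuild_code_pre_py38((lambda x: x).__code__)
       except TypeError as e:
           # On Python 3.8/3.9 this prints: an integer is required (got type bytes)
           print(e)
   
   If that is what is happening here, the failure is environmental (the Python
   version used by the downloaded 2.4.7 release) rather than caused by the
   constraint-tracking changes in this PR; running the suite with a Python <= 3.7
   interpreter on PATH should avoid it.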


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org