Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/12/25 07:55:00 UTC
[jira] [Resolved] (SPARK-26426) ExpressionInfo related unit tests fail in Windows
[ https://issues.apache.org/jira/browse/SPARK-26426?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-26426.
----------------------------------
Resolution: Fixed
Fix Version/s: 2.4.1
Issue resolved by pull request 23363
[https://github.com/apache/spark/pull/23363]
> ExpressionInfo related unit tests fail in Windows
> -------------------------------------------------
>
> Key: SPARK-26426
> URL: https://issues.apache.org/jira/browse/SPARK-26426
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.4.0, 3.0.0
> Environment: Windows 7 Operating System
> ===== maven version info ======
> Apache Maven 3.6.0 (97c98ec64a1fdfee7767ce5ffb20918da4f719f3; 2018-10-25T02:41:47+08:00)
> Maven home: D:\apache-maven-3.6.0
> Java version: 1.8.0_121, vendor: Oracle Corporation, runtime: D:\java\jdk1.8.0_121\jre
> Default locale: zh_CN, platform encoding: GBK
> OS name: "windows 7", version: "6.1", arch: "amd64", family: "windows"
> ===== java version info =====
> java version "1.8.0_121"
> Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
> Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)
> Reporter: Wang Yanlin
> Assignee: Wang Yanlin
> Priority: Major
> Fix For: 2.4.1, 3.0.0
>
> Attachments: unit-test.log
>
>
> Using Windows 7, after running "mvn install" for the latest Spark version and then running "mvn test -Dtest=none -DwildcardSuites=org.apache.spark.sql.execution.streaming.sources.ForeachBatchSinkSuite -pl :spark-sql_2.12", the build fails with an AssertionError as follows:
> ForeachBatchSinkSuite:
> - foreachBatch with non-stateful query *** FAILED ***
> java.lang.AssertionError:
> at org.apache.spark.sql.catalyst.expressions.ExpressionInfo.<init>(ExpressionInfo.java:82)
> at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$.expressionInfo(FunctionRegistry.scala:636)
> at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$.expression(FunctionRegistry.scala:595)
> at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$.<init>(FunctionRegistry.scala:193)
> at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$.<clinit>(FunctionRegistry.scala)
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$functionRegistry$2(BaseSessionStateBuilder.scala:99)
> at scala.Option.getOrElse(Option.scala:138)
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.functionRegistry$lzycompute(BaseSessionStateBuilder.scala:99)
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.functionRegistry(BaseSessionStateBuilder.scala:97)
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:290)
> ...
> *** RUN ABORTED ***
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.sql.catalyst.analysis.FunctionRegistry$
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$functionRegistry$2(BaseSessionStateBuilder.scala:99)
> at scala.Option.getOrElse(Option.scala:138)
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.functionRegistry$lzycompute(BaseSessionStateBuilder.scala:99)
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.functionRegistry(BaseSessionStateBuilder.scala:97)
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:290)
> at org.apache.spark.sql.test.TestSparkSession.sessionState$lzycompute(TestSQLContext.scala:42)
> at org.apache.spark.sql.test.TestSparkSession.sessionState(TestSQLContext.scala:41)
> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:77)
> at org.apache.spark.sql.execution.streaming.MemoryStreamBase.toDF(memory.scala:60)
> at org.apache.spark.sql.execution.streaming.sources.ForeachBatchSinkSuite.$anonfun$new$5(ForeachBatchSinkSuite.scala:48)
--
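The assertion at ExpressionInfo.java:82 fails only on Windows, which points at a platform-dependent string check. As a minimal sketch of that failure class (an assumption for illustration, not the confirmed root cause or the actual Spark code; the class and method names below are hypothetical), a validation that hard-codes "\n" will reject any string assembled with System.lineSeparator(), which yields "\r\n" on Windows:

```java
// Minimal sketch (assumption, not Spark's actual code): a hard-coded "\n"
// check versus a string built with the platform line separator.
public class LineSeparatorDemo {

    // Hypothetical validator modeled loosely on an assertion over an
    // expression's "Examples:" section.
    static boolean validExamplesSection(String examples) {
        return examples.isEmpty() || examples.startsWith("\n    Examples:");
    }

    public static void main(String[] args) {
        String sep = System.lineSeparator(); // "\n" on Unix-like systems, "\r\n" on Windows
        String examples = sep + "    Examples:" + sep + "      > SELECT func();";

        // Passes on Linux/macOS; fails on Windows because the string
        // starts with "\r\n", not "\n".
        System.out.println(validExamplesSection(examples));
    }
}
```

Normalizing to a fixed separator (or stripping "\r") before such checks keeps them platform-independent, which is the general shape of fix this class of bug calls for.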
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org