Posted to issues@spark.apache.org by "Alexander Ulanov (JIRA)" <ji...@apache.org> on 2016/06/11 01:18:20 UTC

[jira] [Created] (SPARK-15893) spark.createDataFrame raises an exception in Spark 2.0 tests on Windows

Alexander Ulanov created SPARK-15893:
----------------------------------------

             Summary: spark.createDataFrame raises an exception in Spark 2.0 tests on Windows
                 Key: SPARK-15893
                 URL: https://issues.apache.org/jira/browse/SPARK-15893
             Project: Spark
          Issue Type: Bug
          Components: Tests
    Affects Versions: 2.0.0
            Reporter: Alexander Ulanov


spark.createDataFrame raises an exception in Spark 2.0 tests on Windows

For example, LogisticRegressionSuite fails at line 46 with:
Exception encountered when invoking run on a nested suite - java.net.URISyntaxException: Relative path in absolute URI: file:C:/dev/spark/external/flume-assembly/spark-warehouse
java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:C:/dev/spark/external/flume-assembly/spark-warehouse
	at org.apache.hadoop.fs.Path.initialize(Path.java:206)
	at org.apache.hadoop.fs.Path.<init>(Path.java:172)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:109)
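
For reference, the failure can be reproduced outside Spark with Hadoop's Path alone. A minimal sketch (assumes hadoop-common on the classpath; the object name is illustrative):

import org.apache.hadoop.fs.Path

object WarehousePathRepro {
  def main(args: Array[String]): Unit = {
    // The default warehouse location in Spark 2.0 is derived from
    // "file:" + the working directory, which on Windows produces a
    // string like the one below. "file:" supplies a URI scheme, and
    // java.net.URI requires the path of a scheme-qualified URI to
    // start with '/'. "C:/dev/..." does not, so Path.initialize
    // fails with "Relative path in absolute URI".
    new Path("file:C:/dev/spark/external/flume-assembly/spark-warehouse")
  }
}

The exception originates in java.net.URI's validation, not in anything OS-specific, so the same string fails on any platform; Windows is simply where Spark generates it.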


As another example, DataFrameSuite raises:
java.net.URISyntaxException: Relative path in absolute URI: file:C:/dev/spark/external/flume-assembly/spark-warehouse
java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:C:/dev/spark/external/flume-assembly/spark-warehouse
	at org.apache.hadoop.fs.Path.initialize(Path.java:206)
	at org.apache.hadoop.fs.Path.<init>(Path.java:172)
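
A possible test-side workaround, sketched below but not verified against the full suites, is to set spark.sql.warehouse.dir to a well-formed absolute file URI when constructing the session (the warehouse directory shown is illustrative):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[2]")
  .appName("SPARK-15893 workaround sketch")
  // A fully qualified file URI ("file:///C:/...") parses cleanly, so
  // SessionCatalog.makeQualifiedPath never sees "file:C:/...".
  .config("spark.sql.warehouse.dir", "file:///C:/tmp/spark-warehouse")
  .getOrCreate()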
