Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2016/04/07 21:59:25 UTC

[jira] [Created] (SPARK-14463) read.text broken for partitioned tables

Michael Armbrust created SPARK-14463:
----------------------------------------

             Summary: read.text broken for partitioned tables
                 Key: SPARK-14463
                 URL: https://issues.apache.org/jira/browse/SPARK-14463
             Project: Spark
          Issue Type: Bug
          Components: SQL
            Reporter: Michael Armbrust
            Priority: Critical


Strongly typing the return value of {{read.text}} as {{Dataset\[String]}} breaks when trying to load a partitioned table (or any table where the path looks partitioned): partition discovery appends the partition columns to the schema, so the result no longer fits the single-column {{String}} encoder.

{code}
Seq((1, "test"))
  .toDF("a", "b")
  .write
  .format("text")
  .partitionBy("a")
  .save("/home/michael/text-part-bug")

sqlContext.read.text("/home/michael/text-part-bug")
{code}
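
For context, the generic reader shows the schema that partition discovery produces for this path (a sketch; the commented output is inferred from the input schema in the error below, not captured from a run):

{code}
sqlContext.read.format("text").load("/home/michael/text-part-bug").printSchema()
// root
//  |-- value: string (nullable = true)
//  |-- a: integer (nullable = true)
{code}

The typed {{read.text}} call itself fails with: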

{code}
org.apache.spark.sql.AnalysisException: Try to map struct<value:string,a:int> to Tuple1, but failed as the number of fields does not line up.
 - Input schema: struct<value:string,a:int>
 - Target schema: struct<value:string>;
	at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.org$apache$spark$sql$catalyst$encoders$ExpressionEncoder$$fail$1(ExpressionEncoder.scala:265)
	at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.validate(ExpressionEncoder.scala:279)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:197)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:168)
	at org.apache.spark.sql.Dataset$.apply(Dataset.scala:57)
	at org.apache.spark.sql.Dataset.as(Dataset.scala:357)
	at org.apache.spark.sql.DataFrameReader.text(DataFrameReader.scala:450)
{code}
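
Until the reader handles this, a possible workaround (a minimal sketch, assuming the same path as the reproduction above) is to load the path untyped, project away the discovered partition column, and only then apply the {{String}} encoder:

{code}
import sqlContext.implicits._

// Workaround sketch: read as an untyped DataFrame, keep only the
// text column, then convert to a typed Dataset[String].
val ds = sqlContext.read
  .format("text")
  .load("/home/michael/text-part-bug")
  .select("value")
  .as[String]
{code}

This sidesteps the encoder validation because the projected schema is exactly {{struct<value:string>}}, at the cost of dropping the partition column.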


