Posted to dev@spark.apache.org by Jacek Laskowski <ja...@japila.pl> on 2018/10/30 10:39:22 UTC

Why does spark.range(1).write.mode("overwrite").saveAsTable("t1") throw an Exception?

Hi,

Just ran into it today and wonder whether it's a bug or something I may
have missed before.

scala> spark.version
res21: String = 2.3.2

// That's expected: t1 already exists, so the default ErrorIfExists mode throws
scala> spark.range(1).write.saveAsTable("t1")
org.apache.spark.sql.AnalysisException: Table `t1` already exists.;
  at
org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:408)
  at
org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:393)
  ... 51 elided

// Let's overwrite it then
// An exception?! Why?!
scala> spark.range(1).write.mode("overwrite").saveAsTable("t1")
org.apache.spark.sql.AnalysisException: Unable to infer schema for Parquet.
It must be specified manually.;
  at
org.apache.spark.sql.execution.datasources.DataSource$$anonfun$9.apply(DataSource.scala:208)
  at
org.apache.spark.sql.execution.datasources.DataSource$$anonfun$9.apply(DataSource.scala:208)
  at scala.Option.getOrElse(Option.scala:121)
...

// If the above behavior is correct, why does the following work fine
// (and not throw an exception)?
scala> spark.range(1).write.saveAsTable("t10")
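
// For what it's worth, a workaround that sidesteps the overwrite path
// entirely (just a sketch, not a verdict on whether the above is a bug):
// drop the table first, then save with the default mode.
scala> spark.sql("DROP TABLE IF EXISTS t1")
scala> spark.range(1).write.saveAsTable("t1")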

p.s. I was not sure whether I should send this question to dev or users,
so please accept my apologies if it went to the wrong mailing list.

Regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
Mastering Spark SQL https://bit.ly/mastering-spark-sql
Spark Structured Streaming https://bit.ly/spark-structured-streaming
Mastering Kafka Streams https://bit.ly/mastering-kafka-streams
Follow me at https://twitter.com/jaceklaskowski