Posted to issues@spark.apache.org by "xubo245 (JIRA)" <ji...@apache.org> on 2018/01/05 13:06:00 UTC
[jira] [Created] (SPARK-22972) Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.hive.orc.
xubo245 created SPARK-22972:
-------------------------------
Summary: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.hive.orc.
Key: SPARK-22972
URL: https://issues.apache.org/jira/browse/SPARK-22972
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 2.2.1
Reporter: xubo245
*An error occurs when running the following test code:*
{code:java}
test("create orc table") {
  spark.sql(
    s"""CREATE TABLE normal_orc_as_source_hive
       |USING org.apache.spark.sql.hive.orc
       |OPTIONS (
       |  PATH '${new File(orcTableAsDir.getAbsolutePath).toURI}'
       |)
     """.stripMargin)
  val df = spark.sql("select * from normal_orc_as_source_hive")
  spark.sql("desc formatted normal_orc_as_source_hive").show()
}
{code}
*The following warning is logged:*
{code:java}
05:00:44.038 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.hive.orc. Persisting data source table `default`.`normal_orc_as_source_hive` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
{code}
Root cause analysis:
The ORC-related match case in HiveSerDe is incorrect:
{code:java}
org.apache.spark.sql.internal.HiveSerDe#sourceToSerDe
{code}
{code:java}
def sourceToSerDe(source: String): Option[HiveSerDe] = {
  val key = source.toLowerCase(Locale.ROOT) match {
    case s if s.startsWith("org.apache.spark.sql.parquet") => "parquet"
    case s if s.startsWith("org.apache.spark.sql.orc") => "orc"
    case s if s.equals("orcfile") => "orc"
    case s if s.equals("parquetfile") => "parquet"
    case s if s.equals("avrofile") => "avro"
    case s => s
  }
{code}
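The mismatch can be shown in isolation: the provider string "org.apache.spark.sql.hive.orc" does not start with "org.apache.spark.sql.orc", so it falls through to the default case and the key is never mapped to "orc". A minimal standalone sketch mirroring the match logic above (`sourceKey` is a hypothetical helper, not the real Spark method):

```scala
import java.util.Locale

object SerDeKeyDemo {
  // Standalone copy of the prefix-matching logic from sourceToSerDe above.
  def sourceKey(source: String): String =
    source.toLowerCase(Locale.ROOT) match {
      case s if s.startsWith("org.apache.spark.sql.parquet") => "parquet"
      case s if s.startsWith("org.apache.spark.sql.orc") => "orc" // buggy prefix
      case s => s
    }

  def main(args: Array[String]): Unit = {
    // The Hive ORC provider lives under .hive.orc, so the prefix check misses it
    // and the raw class name is returned instead of the "orc" key:
    println(sourceKey("org.apache.spark.sql.hive.orc"))
    // prints "org.apache.spark.sql.hive.orc"
  }
}
```

Because the raw class name is returned, no HiveSerDe entry is found and the catalog falls back to the Spark-specific (Hive-incompatible) table format, which is exactly the warning seen above.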
Solution:
Change "org.apache.spark.sql.orc" to "org.apache.spark.sql.hive.orc" in the match case above.
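A sketch of the proposed fix, assuming only the ORC prefix string changes (again as a standalone helper, not the real Spark class):

```scala
import java.util.Locale

object FixedSerDeKeyDemo {
  // Same matching logic, with the ORC prefix corrected to the actual
  // package of the Hive ORC provider.
  def sourceKey(source: String): String =
    source.toLowerCase(Locale.ROOT) match {
      case s if s.startsWith("org.apache.spark.sql.parquet") => "parquet"
      case s if s.startsWith("org.apache.spark.sql.hive.orc") => "orc" // fixed prefix
      case s if s.equals("orcfile") => "orc"
      case s if s.equals("parquetfile") => "parquet"
      case s if s.equals("avrofile") => "avro"
      case s => s
    }

  def main(args: Array[String]): Unit = {
    // With the corrected prefix the provider maps to the "orc" key,
    // so a matching Hive SerDe can be looked up:
    println(sourceKey("org.apache.spark.sql.hive.orc"))
    // prints "orc"
  }
}
```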
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)