Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/11/19 09:04:00 UTC

[jira] [Updated] (SPARK-18464) Spark SQL fails to load tables created without providing a schema

     [ https://issues.apache.org/jira/browse/SPARK-18464?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-18464:
------------------------------
    Assignee: Wenchen Fan

> Spark SQL fails to load tables created without providing a schema
> -----------------------------------------------------------------
>
>                 Key: SPARK-18464
>                 URL: https://issues.apache.org/jira/browse/SPARK-18464
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Yin Huai
>            Assignee: Wenchen Fan
>            Priority: Blocker
>             Fix For: 2.1.0
>
>
> I have an old table that was created without providing a schema. It seems that branch 2.1 fails to load it and reports that the schema is corrupt.
> With {{spark.sql.debug}} enabled, I can get the metadata by using {{describe formatted}}.
> {code}
> [col,array<string>,from deserializer]
> [,,]
> [# Detailed Table Information,,]
> [Database:,mydb,]
> [Owner:,root,]
> [Create Time:,Fri Jun 17 11:55:07 UTC 2016,]
> [Last Access Time:,Thu Jan 01 00:00:00 UTC 1970,]
> [Location:,mylocation,]
> [Table Type:,EXTERNAL,]
> [Table Parameters:,,]
> [  transient_lastDdlTime,1466164507,]
> [  spark.sql.sources.provider,parquet,]
> [,,]
> [# Storage Information,,]
> [SerDe Library:,org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe,]
> [InputFormat:,org.apache.hadoop.mapred.SequenceFileInputFormat,]
> [OutputFormat:,org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat,]
> [Compressed:,No,]
> [Storage Desc Parameters:,,]
> [  path,/myPatch,]
> [  serialization.format,1,]
> {code}
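For context, a table like the one described might originally have been registered through the Spark data source API without an explicit schema, letting Spark infer it from the Parquet files at load time. The sketch below is hypothetical: the table and path names are illustrative and do not come from the report.

{code}
-- Hypothetical recreation of the reporter's setup: an external table
-- created via the data source API with no explicit column list, so the
-- schema must be inferred from the underlying Parquet files.
CREATE TABLE mydb.mytable
USING parquet
OPTIONS (path '/my/external/path');

-- Reading the table back on branch 2.1 is where the reported
-- "schema is corrupt" failure would surface:
SELECT * FROM mydb.mytable;
{code}

Because no schema was stored at creation time, the Hive metastore entry only carries the {{spark.sql.sources.provider}} property and placeholder SerDe information, as shown in the {{describe formatted}} output above.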



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org