Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/12/31 01:47:49 UTC

[jira] [Commented] (SPARK-9757) Can't create persistent data source tables with decimal

    [ https://issues.apache.org/jira/browse/SPARK-9757?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15075575#comment-15075575 ] 

Apache Spark commented on SPARK-9757:
-------------------------------------

User 'yhuai' has created a pull request for this issue:
https://github.com/apache/spark/pull/10533

> Can't create persistent data source tables with decimal
> -------------------------------------------------------
>
>                 Key: SPARK-9757
>                 URL: https://issues.apache.org/jira/browse/SPARK-9757
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: Michael Armbrust
>            Assignee: Cheng Lian
>            Priority: Blocker
>             Fix For: 1.5.0
>
>
> {{ParquetHiveSerDe}} in Hive versions < 1.2.0 doesn't support decimal. Persisting a Parquet relation with decimal columns to a metastore of such a version (e.g. 0.13.1) throws the following exception after SPARK-6923.
> {code}
> Caused by: java.lang.UnsupportedOperationException: Parquet does not support decimal. See HIVE-6384
> 	at org.apache.hadoop.hive.ql.io.parquet.serde.ArrayWritableObjectInspector.getObjectInspector(ArrayWritableObjectInspector.java:102)
> 	at org.apache.hadoop.hive.ql.io.parquet.serde.ArrayWritableObjectInspector.<init>(ArrayWritableObjectInspector.java:60)
> 	at org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe.initialize(ParquetHiveSerDe.java:113)
> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:339)
> 	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:288)
> 	at org.apache.hadoop.hive.ql.metadata.Table.checkValidity(Table.java:194)
> 	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:597)
> 	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:576)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply$mcV$sp(ClientWrapper.scala:358)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply(ClientWrapper.scala:356)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$createTable$1.apply(ClientWrapper.scala:356)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:256)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:211)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:248)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.createTable(ClientWrapper.scala:356)
> 	at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:351)
> 	at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:198)
> 	at org.apache.spark.sql.hive.execution.CreateMetastoreDataSource.run(commands.scala:152)
> {code}
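> A minimal sketch to reproduce from a spark-shell (assuming a HiveContext backed by a Hive 0.13.1 metastore; the table name and decimal precision are illustrative):
> {code}
> // Persist a Parquet relation containing a DecimalType column to the metastore.
> import org.apache.spark.sql.hive.HiveContext
>
> val sqlContext = new HiveContext(sc)
>
> // A one-row DataFrame with a single decimal column.
> val df = sqlContext.sql("SELECT CAST(1.5 AS DECIMAL(10, 2)) AS d")
>
> // saveAsTable persists the relation in the metastore; with Hive < 1.2.0 the
> // ParquetHiveSerDe rejects the decimal column and the call fails with
> // "Parquet does not support decimal. See HIVE-6384".
> df.write.format("parquet").saveAsTable("decimal_test")
> {code}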



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org