Posted to issues@hive.apache.org by "Greg Hazel (JIRA)" <ji...@apache.org> on 2016/11/10 22:22:58 UTC

[jira] [Issue Comment Deleted] (HIVE-15179) Type description limited in size

     [ https://issues.apache.org/jira/browse/HIVE-15179?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Greg Hazel updated HIVE-15179:
------------------------------
    Comment: was deleted

(was: The type for this column is very large, and Parquet seems to give up around 4-5k characters. I've replaced most of it with "[ELIDED]", because it is quite large and reveals a bunch of schema I am not allowed to share.

java.lang.IllegalArgumentException: Error: > expected at the position 4136 of 'string:string:bigint:bigint:string:string:int:int:string:string:string:int:int:string:string:string:string:struct<post_id:string,[ELIDED]' but ':' is found.
	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.expect(TypeInfoUtils.java:360)
	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:472)
	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:447)
	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:481)
	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:481)
	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:481)
	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseTypeInfos(TypeInfoUtils.java:305)
	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils.getTypeInfosFromTypeString(TypeInfoUtils.java:754)
	at org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe.initialize(ParquetHiveSerDe.java:104)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:339)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:288)
	at org.apache.hadoop.hive.ql.metadata.Table.checkValidity(Table.java:194)
	at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1017)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$getTableOption$1.apply(HiveClientImpl.scala:353)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$getTableOption$1.apply(HiveClientImpl.scala:351)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:280)
	at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:227)
	at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:226)
	at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:269)
	at org.apache.spark.sql.hive.client.HiveClientImpl.getTableOption(HiveClientImpl.scala:351)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$tableExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:228)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$tableExists$1.apply(HiveExternalCatalog.scala:228)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$tableExists$1.apply(HiveExternalCatalog.scala:228)
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:72)
	at org.apache.spark.sql.hive.HiveExternalCatalog.tableExists(HiveExternalCatalog.scala:227)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:456)
	at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:359)
	at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:354)
)
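
For illustration only (not part of the original report): a minimal sketch of how a cut-off type string trips the same parser. The class name and the truncated type string below are invented; the entry point, TypeInfoUtils.getTypeInfosFromTypeString, is the one shown in the stack trace above (called from ParquetHiveSerDe.initialize).

    import java.util.List;

    import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
    import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;

    // Hypothetical reproduction sketch, not the reporter's actual schema.
    public class TruncatedTypeStringRepro {
      public static void main(String[] args) {
        // Simulate a column-type string that was cut off mid-struct,
        // as the error at position 4136 above suggests.
        String truncated = "string:bigint:struct<post_id:string,author:string";

        // Expected to throw IllegalArgumentException, because the '>' that
        // would close the struct never appears in the truncated string.
        List<TypeInfo> infos = TypeInfoUtils.getTypeInfosFromTypeString(truncated);
        System.out.println(infos);
      }
    }

Any type description longer than whatever limit truncates it would fail the same way, since the parser needs the complete nested definition to reach the closing '>'.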

> Type description limited in size
> --------------------------------
>
>                 Key: HIVE-15179
>                 URL: https://issues.apache.org/jira/browse/HIVE-15179
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Greg Hazel
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)