Posted to user@spark.apache.org by Michael Shtelma <ms...@gmail.com> on 2018/04/19 11:37:43 UTC

INSERT INTO TABLE_PARAMS fails during ANALYZE TABLE

Hi everybody,

I wanted to test CBO with histograms enabled.
In order to do this, I have enabled the property
spark.sql.statistics.histogram.enabled.
In this test, Derby was used as the database for the Hive metastore.
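
For context, what I ran was roughly equivalent to the following minimal
sketch (the table and column names are just placeholders, not the actual
ones from the failing job):

  -- enable CBO and histogram collection (placeholders for my real session settings)
  SET spark.sql.cbo.enabled=true;
  SET spark.sql.statistics.histogram.enabled=true;

  -- computing column statistics is the step that writes to TABLE_PARAMS
  ANALYZE TABLE my_table COMPUTE STATISTICS FOR COLUMNS col1, col2;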

The problem is that in some cases the values inserted into the
TABLE_PARAMS table exceed the maximum length of 4000 characters:

org.apache.spark.sql.AnalysisException:
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table.
Put request failed : INSERT INTO TABLE_PARAMS
(PARAM_VALUE,TBL_ID,PARAM_KEY) VALUES (?,?,?)

org.datanucleus.exceptions.NucleusDataStoreException: Put request failed :
INSERT INTO TABLE_PARAMS (PARAM_VALUE,TBL_ID,PARAM_KEY) VALUES (?,?,?)

and then

Caused by: java.sql.SQLDataException: A truncation error was encountered
trying to shrink VARCHAR
'TFo0QmxvY2smMQwAAOAXAABMl6MI8TlBBw+MWLFixgAAAP7Bn9+7oD1wpMEv&' to length
4000.

The detailed stack trace can be seen here:

https://gist.github.com/mshtelma/c5ee8206200533fc1d606964dd5a30e2
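
The oversized value appears to be one of the serialized column histograms
that Spark stores as a table property (a base64-encoded, LZ4-compressed
blob under a spark.sql.statistics.colStats.* key), while PARAM_VALUE in
this Derby metastore is only VARCHAR(4000). The column definition can be
checked with a query like the following sketch (run in Derby's ij tool
against the default metastore_db, using Derby's standard system catalogs):

  -- show the declared type of TABLE_PARAMS.PARAM_VALUE in the Derby metastore
  SELECT c.COLUMNNAME, c.COLUMNDATATYPE
  FROM SYS.SYSCOLUMNS c
  JOIN SYS.SYSTABLES t ON c.REFERENCEID = t.TABLEID
  WHERE t.TABLENAME = 'TABLE_PARAMS' AND c.COLUMNNAME = 'PARAM_VALUE';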

Is this a known issue?

Best,
Michael