Posted to user@hive.apache.org by Buntu Dev <bu...@gmail.com> on 2016/05/06 23:27:42 UTC

org.apache.hadoop.hive.serde2.io.DoubleWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable error

I created a table using SparkSQL and loaded parquet data into it, but
when I attempt a 'SELECT * FROM tbl' I keep running into this error:

Error: java.io.IOException:
org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.ClassCastException:
org.apache.hadoop.hive.serde2.io.DoubleWritable cannot be cast to
org.apache.hadoop.hive.serde2.io.HiveDecimalWritable (state=,code=0)

Here is how I created the table:

CREATE TABLE IF NOT EXISTS tbl (
     c1 STRING,
     c2 STRING,
     value DECIMAL(28, 22))
 STORED AS PARQUET;

Is there any way to fix this error, or do I need to load the data into
the table some other way?
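
My guess is that Spark wrote the `value` column to the Parquet files as a
plain double, while the Hive table declares it as DECIMAL(28, 22), and the
reader trips on that mismatch. One workaround I'm considering (assuming the
data really is stored as double on disk) is to declare the column with the
on-disk type and cast at query time, roughly:

-- Hypothetical table name; column typed to match what Parquet stores
CREATE TABLE IF NOT EXISTS tbl_dbl (
     c1 STRING,
     c2 STRING,
     value DOUBLE)
 STORED AS PARQUET;

-- Cast to the desired precision when reading
SELECT c1, c2, CAST(value AS DECIMAL(28, 22)) AS value
FROM tbl_dbl;

But I'd prefer to keep the DECIMAL column if the data can be written in a
compatible way.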

Thanks!