Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2018/06/26 16:09:20 UTC
[jira] [Issue Comment Deleted] (SPARK-24631) Cannot up cast column from bigint to smallint as it may truncate
[ https://issues.apache.org/jira/browse/SPARK-24631?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin updated SPARK-24631:
-----------------------------------
Comment: was deleted
(was: User 'vanzin' has created a pull request for this issue:
https://github.com/apache/spark/pull/21639)
> Cannot up cast column from bigint to smallint as it may truncate
> ----------------------------------------------------------------
>
> Key: SPARK-24631
> URL: https://issues.apache.org/jira/browse/SPARK-24631
> Project: Spark
> Issue Type: New JIRA Project
> Components: Spark Core, Spark Submit
> Affects Versions: 2.2.1
> Reporter: Sivakumar
> Priority: Major
>
> Getting the below error when executing a simple select query:
> Sample:
> Table Description:
> name: String, id: BigInt
> val df = spark.sql("select name, id from testtable")
> ERROR: {color:#ff0000}Cannot up cast column "id" from bigint to smallint as it may truncate.{color}
> I am not doing any transformations; I am just trying to query a table, but I still get the error.
> I am getting this error only on the production cluster and only for a single table; other tables are running fine.
> Additional details:
> val df = spark.sql("select * from table_name")
> I am just trying to query a table; with other tables the same query runs fine.
> {color:#d04437}18/06/22 01:36:29 ERROR Driver1: [] [main] Exception occurred: org.apache.spark.sql.AnalysisException: Cannot up cast `column_name` from bigint to column_name#2525: smallint as it may truncate.{color}
> That specific column has a bigint datatype, but there were other tables with bigint columns that ran fine.
>
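For context, one common way this analysis error surfaces is when a Dataset encoder expects a narrower numeric type than the table column provides, e.g. mapping a bigint column onto a Short field. The sketch below is a hypothetical reproduction, not the reporter's actual code (the table name and case class are made up), assuming a SparkSession `spark` with implicits imported:

```scala
// Assumed: a running SparkSession named `spark`
import spark.implicits._

// Hypothetical target class: `id` declared as Short (smallint),
// while the table column `id` is bigint.
case class TestRecord(name: String, id: Short)

val df = spark.sql("select name, id from testtable")

// Fails analysis with:
//   Cannot up cast `id` from bigint to smallint as it may truncate.
// val ds = df.as[TestRecord]

// Possible workarounds: cast explicitly (accepting potential truncation) ...
val casted = df.selectExpr("name", "cast(id as smallint) as id").as[TestRecord]

// ... or widen the case class field to Long so it matches the table schema.
case class WideRecord(name: String, id: Long)
val wide = df.as[WideRecord]
```

Note this does not explain why a bare `spark.sql` without `.as[...]` would fail, as the reporter describes; that can also happen when the metastore schema and the underlying file schema disagree for one table, which would be consistent with only a single production table being affected.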
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org