Posted to issues@spark.apache.org by "YuanGuanhu (Jira)" <ji...@apache.org> on 2021/05/10 01:33:00 UTC
[jira] [Updated] (SPARK-35359) insert into char/varchar column fails when value exceeds length limitation
[ https://issues.apache.org/jira/browse/SPARK-35359?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
YuanGuanhu updated SPARK-35359:
-------------------------------
Description:
Spark 3.1.1 supports the Char/Varchar types, but inserting data into a char/varchar column fails when the data length exceeds the length limitation, even when spark.sql.legacy.charVarcharAsString is true.
To reproduce:
create table chartb01(a char(3));
insert into chartb01 select 'aaaaa';
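For context, a minimal sketch of the same repro with the legacy flag explicitly enabled (assuming a Spark 3.1.1 spark-sql session; the table name is taken from the repro above):
-- minimal repro sketch, assuming a Spark 3.1.1 spark-sql session
set spark.sql.legacy.charVarcharAsString=true;
create table chartb01(a char(3));
-- per the report, this insert still fails with a length-check error even with the legacy flag set
insert into chartb01 select 'aaaaa';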
was: Spark 3.1.1 supports the Char/Varchar types, but inserting data into a char/varchar column fails when the data length exceeds the length limitation. We should have a configuration option to stay compatible with versions earlier than Spark 3.1.
> insert into char/varchar column fails when value exceeds length limitation
> --------------------------------------------------------------------------
>
> Key: SPARK-35359
> URL: https://issues.apache.org/jira/browse/SPARK-35359
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.1.1
> Reporter: YuanGuanhu
> Priority: Major
>
> Spark 3.1.1 supports the Char/Varchar types, but inserting data into a char/varchar column fails when the data length exceeds the length limitation, even when spark.sql.legacy.charVarcharAsString is true.
> To reproduce:
> create table chartb01(a char(3));
> insert into chartb01 select 'aaaaa';
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org