Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/04/03 01:06:00 UTC

[jira] [Updated] (SPARK-38764) spark thrift server issue: Length field is empty for varchar fields

     [ https://issues.apache.org/jira/browse/SPARK-38764?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-38764:
---------------------------------
    Priority: Major  (was: Critical)

> spark thrift server issue: Length field is empty for varchar fields
> -------------------------------------------------------------------
>
>                 Key: SPARK-38764
>                 URL: https://issues.apache.org/jira/browse/SPARK-38764
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.1
>            Reporter: Ayan Ray
>            Priority: Major
>
> I am trying to read data from Spark Thrift Server using SAS. In the table definition shown in DBeaver, the *Length* field is empty only for fields with the *VARCHAR* data type. I can see the length in the Data Type field as {*}varchar(32){*}, but that is not sufficient for my purpose because the SAS application reads the Length field. Since this field is not populated, SAS defaults to the maximum size and as a result becomes extremely slow. The Length field is populated correctly when the same table is read from Hive.
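
A minimal sketch (not part of the original report) of how the missing length can be observed through JDBC column metadata, assuming the Hive JDBC driver on the classpath, a Thrift Server at localhost:10000, and a hypothetical table/column named varchar_test/name. COLUMN_SIZE is what clients such as DBeaver and SAS surface as the column "Length":

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class VarcharLengthCheck {
        public static void main(String[] args) throws Exception {
            // Connect to the Spark Thrift Server (assumed to run on localhost:10000).
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://localhost:10000/default", "user", "")) {
                try (Statement st = conn.createStatement()) {
                    // Hypothetical test table with a bounded varchar column.
                    st.execute("CREATE TABLE IF NOT EXISTS varchar_test (name VARCHAR(32))");
                }
                DatabaseMetaData md = conn.getMetaData();
                // For varchar(32), COLUMN_SIZE is expected to be 32; the report says it comes back empty.
                try (ResultSet rs = md.getColumns(null, "default", "varchar_test", "name")) {
                    while (rs.next()) {
                        System.out.printf("column=%s type=%s columnSize=%d%n",
                                rs.getString("COLUMN_NAME"),
                                rs.getString("TYPE_NAME"),
                                rs.getInt("COLUMN_SIZE"));
                    }
                }
            }
        }
    }

Running the same check against HiveServer2 instead of the Spark Thrift Server is reported to return the expected length.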



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org