Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/04/11 10:00:36 UTC
[jira] [Commented] (SPARK-14524) In SparkSQL, columns of String type
cannot be selected correctly because of UTF8String when setting more than
32G for executors.
[ https://issues.apache.org/jira/browse/SPARK-14524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15234651#comment-15234651 ]
Sean Owen commented on SPARK-14524:
-----------------------------------
Can you try this on 1.6 or master? I think this was resolved already. It does sound like a CompressedOops issue. You could verify by setting -XX:-UseCompressedOops on the executors. Can you give a short reproduction? CC [~davies]
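A minimal sketch of how that flag could be passed to the executors, using the standard spark.executor.extraJavaOptions setting (the app name is illustrative, not from the ticket):

    import org.apache.spark.{SparkConf, SparkContext}

    // forward the JVM flag so every executor runs without compressed oops
    val conf = new SparkConf()
      .setAppName("utf8string-oops-check") // illustrative name
      .set("spark.executor.extraJavaOptions", "-XX:-UseCompressedOops")
    val sc = new SparkContext(conf)

The same flag can also be supplied on the command line via --conf spark.executor.extraJavaOptions=-XX:-UseCompressedOops when using spark-submit.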
> In SparkSQL, columns of String type cannot be selected correctly because of UTF8String when setting more than 32G for executors.
> ---------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-14524
> URL: https://issues.apache.org/jira/browse/SPARK-14524
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.2
> Environment: Centos
> Reporter: Deng Changchun
> Priority: Critical
> Labels: UTF8String
>
> (related issue: https://github.com/apache/spark/pull/8210/files)
> When we set 32G (or more) for the executor and select a column of String type, it shows the wrong result, for example:
> 'abcde' (fewer than 8 chars) => '' (nothing is shown)
> 'abcdefghijklmn' (more than 8 chars) => 'ijklmn' (the first 8 chars are cut off)
> However, when we set 31G (or less) for the executor, everything works correctly.
> We have also debugged this problem and found that SparkSQL uses UTF8String internally, which depends on properties of the local JVM memory allocation (see the class 'org.apache.spark.unsafe.Platform').
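A plausible mechanism consistent with the 8-character shift above (this reading is mine, not stated in the ticket): on 64-bit HotSpot the base offset of a byte[] is 16 when compressed oops are enabled (the default for heaps up to about 32G) and 24 when they are disabled (which the JVM does automatically above ~32G), so code that records the offset under one setting and reads under the other is off by exactly 8 bytes. A small probe to print the offset, using sun.misc.Unsafe directly:

    import sun.misc.Unsafe

    object OffsetProbe {
      def main(args: Array[String]): Unit = {
        // Unsafe.getUnsafe() rejects application code, so grab the singleton reflectively
        val field = classOf[Unsafe].getDeclaredField("theUnsafe")
        field.setAccessible(true)
        val unsafe = field.get(null).asInstanceOf[Unsafe]
        // expected: 16 with compressed oops, 24 with -XX:-UseCompressedOops
        println("byte[] base offset: " + unsafe.arrayBaseOffset(classOf[Array[Byte]]))
      }
    }

Running this once with -Xmx31g and once with -Xmx33g (or toggling -XX:-UseCompressedOops) should print different offsets, matching the 8-byte truncation reported above.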
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org