Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/07/15 12:18:20 UTC

[jira] [Resolved] (SPARK-16357) After enabling Spark shuffle RPC encryption using 3DES, Sparksql query has poor performance when running in parallel.

     [ https://issues.apache.org/jira/browse/SPARK-16357?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-16357.
-------------------------------
    Resolution: Not A Problem

> After enabling Spark shuffle RPC encryption using 3DES, Sparksql query has poor performance when running in parallel.
> ---------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16357
>                 URL: https://issues.apache.org/jira/browse/SPARK-16357
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>         Environment: Apache Spark 1.6.1 
> TPCx-BB 1.0.1
> jdk 1.7
>            Reporter: zeweichen
>            Priority: Minor
>
> We used TPCx-BB (BigBench) to evaluate the performance of Spark shuffle RPC encryption in our local cluster (E5-2699 v3, 256 GB, 72 vcores, 1 master node + 5 worker nodes).
> During our performance test of Spark shuffle RPC encryption using 3DES on Spark SQL, we found that the throughput test (queries run in parallel) with 2 streams took 2.68X as long as the power test (queries run one by one), which is much larger than it should be. For example, q30 took 0.5 hours in stream 1 but 2.5 hours in stream 0, far longer than expected. As a result, queries in stream 1 finished far ahead of stream 0, so the workload behaved like a single stream and could not fully utilize CPU resources.
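For context, a minimal sketch of how shuffle RPC encryption of this kind is typically enabled in Spark 1.6, via SASL (whose DIGEST-MD5 mechanism can negotiate 3DES as the cipher, matching the ticket title). The reporter's actual configuration is not shown in the ticket, so the application name and secret below are assumptions:

    // Hypothetical sketch: enabling SASL-based RPC/shuffle encryption on a
    // Spark 1.6 application. Under DIGEST-MD5 the negotiated cipher can be
    // 3DES, which is CPU-intensive and can slow parallel shuffle-heavy queries.
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("tpcx-bb-encrypted-shuffle") // assumed name
      // Require authentication between driver, executors, and shuffle service.
      .set("spark.authenticate", "true")
      .set("spark.authenticate.secret", "change-me") // assumed shared secret
      // Encrypt the SASL-authenticated network channels.
      .set("spark.authenticate.enableSaslEncryption", "true")
      .set("spark.network.sasl.serverAlwaysEncrypt", "true")

    val sc = new SparkContext(conf)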



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
