Posted to issues@spark.apache.org by "discipleforteen (JIRA)" <ji...@apache.org> on 2017/01/16 01:41:26 UTC

[jira] [Commented] (SPARK-19225) Spark SQL round constant double return null

    [ https://issues.apache.org/jira/browse/SPARK-19225?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15823348#comment-15823348 ] 

discipleforteen commented on SPARK-19225:
-----------------------------------------

Spark 1.4.1

> select round(4.4, 2);
2017-01-16 09:47:21,573 INFO ParseDriver: Parsing command: select round(4.4, 2)
2017-01-16 09:47:21,783 INFO ParseDriver: Parse Completed
2017-01-16 09:47:22,318 INFO SparkContext: Starting job: processCmd at CliDriver.java:423
2017-01-16 09:47:22,335 INFO DAGScheduler: Got job 0 (processCmd at CliDriver.java:423) with 1 output partitions (allowLocal=false)
2017-01-16 09:47:22,335 INFO DAGScheduler: Final stage: ResultStage 0(processCmd at CliDriver.java:423)
2017-01-16 09:47:22,335 INFO DAGScheduler: Parents of final stage: List()
2017-01-16 09:47:22,339 INFO DAGScheduler: Missing parents: List()
2017-01-16 09:47:22,344 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at processCmd at CliDriver.java:423), which has no missing parents
2017-01-16 09:47:22,382 INFO MemoryStore: ensureFreeSpace(3232) called with curMem=0, maxMem=278302556
2017-01-16 09:47:22,384 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.2 KB, free 265.4 MB)
2017-01-16 09:47:22,516 INFO MemoryStore: ensureFreeSpace(1852) called with curMem=3232, maxMem=278302556
2017-01-16 09:47:22,516 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1852.0 B, free 265.4 MB)
2017-01-16 09:47:22,519 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:45436 (size: 1852.0 B, free: 265.4 MB)
2017-01-16 09:47:22,520 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:874
2017-01-16 09:47:22,525 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at processCmd at CliDriver.java:423)
2017-01-16 09:47:22,526 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
2017-01-16 09:47:22,562 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1392 bytes)
2017-01-16 09:47:22,571 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
2017-01-16 09:47:22,643 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 955 bytes result sent to driver
2017-01-16 09:47:22,658 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 113 ms on localhost (1/1)
2017-01-16 09:47:22,658 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
2017-01-16 09:47:22,661 INFO DAGScheduler: ResultStage 0 (processCmd at CliDriver.java:423) finished in 0.126 s
2017-01-16 09:47:22,671 INFO DAGScheduler: Job 0 finished: processCmd at CliDriver.java:423, took 0.352237 s
4.4

Spark 2.1.0

> select round(4.4, 2);
17/01/16 09:48:13 INFO SparkSqlParser: Parsing command: select round(4.4, 2)
17/01/16 09:48:15 INFO CodeGenerator: Code generated in 215.145435 ms
17/01/16 09:48:15 INFO SparkContext: Starting job: processCmd at CliDriver.java:376
17/01/16 09:48:15 INFO DAGScheduler: Got job 0 (processCmd at CliDriver.java:376) with 1 output partitions
17/01/16 09:48:15 INFO DAGScheduler: Final stage: ResultStage 0 (processCmd at CliDriver.java:376)
17/01/16 09:48:15 INFO DAGScheduler: Parents of final stage: List()
17/01/16 09:48:15 INFO DAGScheduler: Missing parents: List()
17/01/16 09:48:15 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[3] at processCmd at CliDriver.java:376), which has no missing parents
17/01/16 09:48:15 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 5.9 KB, free 408.9 MB)
17/01/16 09:48:15 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 3.1 KB, free 408.9 MB)
17/01/16 09:48:15 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.9.233.62:34512 (size: 3.1 KB, free: 408.9 MB)
17/01/16 09:48:15 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:996
17/01/16 09:48:15 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[3] at processCmd at CliDriver.java:376)
17/01/16 09:48:15 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
17/01/16 09:48:16 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 6244 bytes)
17/01/16 09:48:16 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
17/01/16 09:48:16 INFO CodeGenerator: Code generated in 8.289946 ms
17/01/16 09:48:16 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1295 bytes result sent to driver
17/01/16 09:48:16 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 77 ms on localhost (executor driver) (1/1)
17/01/16 09:48:16 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/01/16 09:48:16 INFO DAGScheduler: ResultStage 0 (processCmd at CliDriver.java:376) finished in 0.097 s
17/01/16 09:48:16 INFO DAGScheduler: Job 0 finished: processCmd at CliDriver.java:376, took 0.205855 s
NULL
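
The discrepancy above comes down to how the two parsers type the literal: Spark 1.x reads 4.4 as a DOUBLE, while Spark 2.x reads it as DECIMAL(2,1). A minimal sketch in plain Scala, using java.math.BigDecimal as an analogue of the decimal arithmetic involved (not Spark's actual code path), shows why a DECIMAL(2,1) value cannot be rounded to scale 2 without overflowing its precision:

    // 4.4 as a two-digit decimal, i.e. DECIMAL(2,1): precision 2, scale 1.
    val v = new java.math.BigDecimal("4.4")
    println(v.precision)    // 2

    // Rounding to scale 2 yields 4.40, which needs 3 significant digits,
    // so it no longer fits a decimal type whose precision is 2.
    val rounded = v.setScale(2, java.math.RoundingMode.HALF_UP)
    println(rounded)              // 4.40
    println(rounded.precision)    // 3 > 2: precision overflow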

> Spark SQL round constant double return null 
> --------------------------------------------
>
>                 Key: SPARK-19225
>                 URL: https://issues.apache.org/jira/browse/SPARK-19225
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0, 2.0.1, 2.0.2, 2.1.0
>            Reporter: discipleforteen
>   Original Estimate: 1m
>  Remaining Estimate: 1m
>
> Spark SQL's round of a constant double value may return null. For example, 'select round(4.4, 2)' returns null in Spark 2.x, which is not compatible with Spark 1.x. It seems 4.4 is cast to a decimal by the new SqlBase.g4 grammar, whereas Spark 1.x cast it to a double. Rounding the decimal 4.4 to scale 2 gets null in changePrecision...
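
For reference, here is a minimal sketch of the changePrecision failure described above, assuming Spark's public org.apache.spark.sql.types.Decimal API (Decimal(unscaled, precision, scale) and changePrecision(precision, scale)); it illustrates the reported mechanism and is not the Round expression's exact code:

    import org.apache.spark.sql.types.Decimal

    // The literal 4.4 as Spark 2.x parses it: DECIMAL(2,1),
    // i.e. unscaled value 44 with precision 2 and scale 1.
    val d = Decimal(44L, 2, 1)

    // round(4.4, 2) keeps the input precision (2) but asks for scale 2.
    // A DECIMAL(2,2) can only hold values below 1.0, so the conversion
    // fails; per the report, that failure surfaces as NULL.
    val fits = d.changePrecision(2, 2)
    println(fits)    // false

If that is the mechanism, an explicit cast such as 'select round(cast(4.4 as double), 2)' should steer Spark 2.x onto the double code path; this is a suggested workaround, not a verified fix.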


