Posted to issues@spark.apache.org by "Takeshi Yamamuro (JIRA)" <ji...@apache.org> on 2016/08/19 10:24:21 UTC
[jira] [Comment Edited] (SPARK-15382) monotonicallyIncreasingId doesn't work when data is upsampled
[ https://issues.apache.org/jira/browse/SPARK-15382?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15427965#comment-15427965 ]
Takeshi Yamamuro edited comment on SPARK-15382 at 8/19/16 10:23 AM:
--------------------------------------------------------------------
[~rxin] [~viirya] Seems this ticket has already been fixed in SPARK-16686.
Can we close this?
was (Author: maropu):
@rxin [~viirya] Seems this ticket has already been fixed in SPARK-16686.
Can we close this?
> monotonicallyIncreasingId doesn't work when data is upsampled
> -------------------------------------------------------------
>
> Key: SPARK-15382
> URL: https://issues.apache.org/jira/browse/SPARK-15382
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.1
> Reporter: Mateusz Buśkiewicz
>
> Assigned ids are not unique
> {code}
> from pyspark.sql import Row
> from pyspark.sql.functions import monotonicallyIncreasingId
> hiveContext.createDataFrame([Row(a=1), Row(a=2)]).sample(True, 10.0).withColumn('id', monotonicallyIncreasingId()).collect()
> {code}
> Output:
> {code}
> [Row(a=1, id=429496729600),
> Row(a=1, id=429496729600),
> Row(a=1, id=429496729600),
> Row(a=1, id=429496729600),
> Row(a=1, id=429496729600),
> Row(a=1, id=429496729600),
> Row(a=1, id=429496729600),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792),
> Row(a=2, id=867583393792)]
> {code}
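For context on why the output above shows only two distinct ids: per the Spark documentation, monotonically increasing ids pack the partition ID into the upper 31 bits and the per-partition record number into the lower 33 bits. A minimal sketch of that layout (plain Python, no Spark required, assuming only that documented bit layout) decodes the ids reported here and shows each duplicated value is record 0 of a single partition, i.e. the nondeterministic expression was evaluated once per input row rather than once per sampled row:

```python
# Sketch of the documented id layout used by monotonically_increasing_id:
# upper 31 bits = partition ID, lower 33 bits = record number within the partition.
def make_id(partition_id, record_number):
    return (partition_id << 33) | record_number

def decode(id_):
    # Split a 64-bit id back into (partition ID, record number).
    return id_ >> 33, id_ & ((1 << 33) - 1)

# The two ids from the output above both decode to record number 0,
# of partitions 50 and 101 respectively.
print(decode(429496729600))  # -> (50, 0)
print(decode(867583393792))  # -> (101, 0)

assert make_id(50, 0) == 429496729600
assert make_id(101, 0) == 867583393792
```

Under this layout, correct behavior for the 18 sampled rows would be record numbers 0, 1, 2, ... within each partition, not record 0 repeated.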
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org