Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/07/12 10:43:20 UTC

[jira] [Commented] (SPARK-15382) monotonicallyIncreasingId doesn't work when data is upsampled

    [ https://issues.apache.org/jira/browse/SPARK-15382?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15372673#comment-15372673 ] 

Hyukjin Kwon commented on SPARK-15382:
--------------------------------------

This also happens in the master branch (2.1.0).

Here is a shortened version in Scala that I tested:

{code}
import org.apache.spark.sql.functions.monotonically_increasing_id

spark.range(2).sample(true, 10.0).withColumn("mid", monotonically_increasing_id).show()
{code}
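
For reference, the documented layout of {{monotonically_increasing_id}} puts the partition ID in the upper 31 bits and the record number within the partition in the lower 33 bits. The duplicated ids in the output below each decode to record 0 of a single partition, which suggests the per-partition counter never advances across the resampled copies:

{code}
// Decoding the ids reported below: id = partitionId * 2^33 + recordNumber
429496729600L >> 33  // = 50  -> partition 50, record 0
867583393792L >> 33  // = 101 -> partition 101, record 0
{code}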


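A possible workaround (a sketch of mine, not a confirmed fix from this thread; it assumes a SparkSession named {{spark}}): assign the ids from the RDD after sampling with {{zipWithUniqueId}}, which guarantees distinct (though not consecutive) values per row:

{code}
// Hypothetical workaround sketch: compute unique ids after the sample via
// RDD.zipWithUniqueId instead of the monotonically_increasing_id expression.
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{LongType, StructField, StructType}

val sampled = spark.range(2).toDF("a").sample(true, 10.0)
val withId = spark.createDataFrame(
  sampled.rdd.zipWithUniqueId().map { case (row, id) => Row.fromSeq(row.toSeq :+ id) },
  StructType(sampled.schema.fields :+ StructField("mid", LongType, nullable = false))
)
withId.show()
{code}

{{zipWithUniqueId}} does not trigger an extra Spark job; if consecutive ids are needed, {{zipWithIndex}} can be used instead at the cost of one extra job.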

> monotonicallyIncreasingId doesn't work when data is upsampled
> -------------------------------------------------------------
>
>                 Key: SPARK-15382
>                 URL: https://issues.apache.org/jira/browse/SPARK-15382
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>            Reporter: Mateusz Buśkiewicz
>
> Assigned ids are not unique
> {code}
> from pyspark.sql import Row
> from pyspark.sql.functions import monotonicallyIncreasingId
> hiveContext.createDataFrame([Row(a=1), Row(a=2)]).sample(True, 10.0).withColumn('id', monotonicallyIncreasingId()).collect()
> {code}
> Output:
> {code}
> [Row(a=1, id=429496729600),
>  Row(a=1, id=429496729600),
>  Row(a=1, id=429496729600),
>  Row(a=1, id=429496729600),
>  Row(a=1, id=429496729600),
>  Row(a=1, id=429496729600),
>  Row(a=1, id=429496729600),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792),
>  Row(a=2, id=867583393792)]
> {code}


