Posted to commits@beam.apache.org by "Amit Sela (JIRA)" <ji...@apache.org> on 2016/03/20 10:49:33 UTC
[jira] [Updated] (BEAM-112) Using non-IntervalWindows seems to fail
[ https://issues.apache.org/jira/browse/BEAM-112?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Amit Sela updated BEAM-112:
---------------------------
Attachment: test_stocks.csv
example input file
> Using non-IntervalWindows seems to fail
> ---------------------------------------
>
> Key: BEAM-112
> URL: https://issues.apache.org/jira/browse/BEAM-112
> Project: Beam
> Issue Type: Bug
> Components: runner-spark
> Reporter: Ben Chambers
> Assignee: Amit Sela
> Priority: Minor
> Attachments: test_stocks.csv
>
>
> See here for more details: http://stackoverflow.com/questions/35993777/globalwindow-cannot-be-cast-to-intervalwindow
> The linked stack trace indicates this is using the Spark Runner:
> {noformat:title=Exception}
> java.lang.ClassCastException: com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindow cannot be cast to com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow
> at com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow$IntervalWindowCoder.encode(IntervalWindow.java:171)
> at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:113)
> at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:59)
> at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:599)
> at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:540)
> at com.cloudera.dataflow.spark.CoderHelpers.toByteArray(CoderHelpers.java:48)
> at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:134)
> at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:131)
> at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
> at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
> at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:149)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> {noformat}
> It seems likely that at some point the Spark runner assumes that all windows are IntervalWindows, which is not guaranteed. Specifically, GlobalWindows combined with triggers is a valid windowing strategy, as is any custom implementation of BoundedWindow.
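> For illustration, a minimal pipeline sketch that should exercise this path: it windows into GlobalWindows with a processing-time trigger ahead of a grouping step, so the runner has to encode the windowed values for the shuffle. The input path, trigger settings, and runner setup below are hypothetical, not taken from the original report:
> {code:java}
> import org.joda.time.Duration;
>
> import com.google.cloud.dataflow.sdk.Pipeline;
> import com.google.cloud.dataflow.sdk.io.TextIO;
> import com.google.cloud.dataflow.sdk.options.PipelineOptions;
> import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
> import com.google.cloud.dataflow.sdk.transforms.Count;
> import com.google.cloud.dataflow.sdk.transforms.windowing.AfterProcessingTime;
> import com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindows;
> import com.google.cloud.dataflow.sdk.transforms.windowing.Repeatedly;
> import com.google.cloud.dataflow.sdk.transforms.windowing.Window;
>
> public class GlobalWindowRepro {
>   public static void main(String[] args) {
>     PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
>     // Hypothetical setup: run against the spark-dataflow runner
>     // (com.cloudera.dataflow.spark, per the stack trace packages) to
>     // reproduce the exception above.
>     Pipeline p = Pipeline.create(options);
>
>     p.apply(TextIO.Read.from("test_stocks.csv"))
>      // GlobalWindows plus a trigger is a valid windowing strategy; its
>      // windows are GlobalWindow instances, not IntervalWindows.
>      .apply(Window.<String>into(new GlobalWindows())
>          .triggering(Repeatedly.forever(
>              AfterProcessingTime.pastFirstElementInPane()))
>          .withAllowedLateness(Duration.ZERO)
>          .discardingFiredPanes())
>      // The grouping forces a shuffle; encoding the windowed values for
>      // the shuffle is where the ClassCastException surfaces.
>      .apply(Count.<String>perElement());
>
>     p.run();
>   }
> }
> {code}
> The stack trace is consistent with this reading: IntervalWindowCoder.encode is invoked on a GlobalWindow while serializing values for the shuffle (CoderHelpers.toByteArray), suggesting the runner resolves the window coder to IntervalWindow's coder regardless of the pipeline's actual WindowFn.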
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)