Posted to issues@spark.apache.org by "Randy Kerber (JIRA)" <ji...@apache.org> on 2017/06/06 05:01:18 UTC

[jira] [Commented] (SPARK-16599) java.util.NoSuchElementException: None.get at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:343)

    [ https://issues.apache.org/jira/browse/SPARK-16599?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16038185#comment-16038185 ] 

Randy Kerber commented on SPARK-16599:
--------------------------------------

I've been trying to resolve this error for the past couple of days. I figured I must just be doing something wrong, but after seeing the messages here I now think I'm hitting a bug.

I'm running a build of 2.2.1-SNAPSHOT from 2-4 weeks ago.
Not using a Spark notebook.
Running in IntelliJ in a Scala console on a MacBook Pro.
I read two simple Parquet files into DataFrames. They are the vertices and edges from which I will create a GraphFrame.
All DataFrame fields are strings.
After reading in the `vertices` DataFrame I can call `show()` and `count()` on it, and both work fine.
I create a GraphFrame with `val gf = GraphFrame(vertices, edges)`.
I can call `gf.vertices.count()` and `gf.edges.count()`, and both work.
Then I do a `gf.persist()`.
Then if I do `gf.vertices.count()` (the same statement that just worked!) I get an error (a minimal repro sketch follows the stack trace below):
```
org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.util.NoSuchElementException: None.get
java.util.NoSuchElementException: None.get
	at scala.None$.get(Option.scala:347)
	at scala.None$.get(Option.scala:345)
	at org.apache.spark.sql.execution.DataSourceScanExec$class.org$apache$spark$sql$execution$DataSourceScanExec$$redact(DataSourceScanExec.scala:70)
	at org.apache.spark.sql.execution.DataSourceScanExec$$anonfun$4.apply(DataSourceScanExec.scala:54)
	at org.apache.spark.sql.execution.DataSourceScanExec$$anonfun$4.apply(DataSourceScanExec.scala:52)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
	at scala.collection.AbstractTraversable.map(Traversable.scala:104)
	at org.apache.spark.sql.execution.DataSourceScanExec$class.simpleString(DataSourceScanExec.scala:52)
	at org.apache.spark.sql.execution.FileSourceScanExec.simpleString(DataSourceScanExec.scala:155)
	at org.apache.spark.sql.catalyst.plans.QueryPlan.verboseString(QueryPlan.scala:349)
	at org.apache.spark.sql.execution.FileSourceScanExec.org$apache$spark$sql$execution$DataSourceScanExec$$super$verboseString(DataSourceScanExec.scala:155)
	at org.apache.spark.sql.execution.DataSourceScanExec$class.verboseString(DataSourceScanExec.scala:60)
	at org.apache.spark.sql.execution.FileSourceScanExec.verboseString(DataSourceScanExec.scala:155)
	at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:546)
	at org.apache.spark.sql.execution.WholeStageCodegenExec.generateTreeString(WholeStageCodegenExec.scala:451)
	at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:558)
	at org.apache.spark.sql.catalyst.trees.TreeNode.treeString(TreeNode.scala:470)
	at org.apache.spark.sql.catalyst.trees.TreeNode.treeString(TreeNode.scala:467)
	at org.apache.spark.sql.catalyst.trees.TreeNode.toString(TreeNode.scala:464)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1421)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	...
```
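For reference, here is a minimal sketch of the sequence above. The Parquet paths are placeholders, and I'm assuming the usual GraphFrames column conventions (an `id` column on vertices, `src`/`dst` on edges):
```
// Minimal repro sketch; paths are hypothetical, all columns are strings.
import org.apache.spark.sql.SparkSession
import org.graphframes.GraphFrame

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("graphframe-persist-repro")
  .getOrCreate()

// Vertices need an "id" column; edges need "src" and "dst" columns.
val vertices = spark.read.parquet("/path/to/vertices.parquet")
val edges    = spark.read.parquet("/path/to/edges.parquet")

val gf = GraphFrame(vertices, edges)
gf.vertices.count()  // works
gf.edges.count()     // works
gf.persist()
gf.vertices.count()  // now throws java.util.NoSuchElementException: None.get
```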
I'm not seeing it this time, but on previous runs I have seen messages that make it sound like a new SparkSession is being created somewhere.
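That would fit the trace above: the failure is a `None.get` inside `DataSourceScanExec.redact`, which is the exception you get from calling `.get` on an empty `Option`. My guess (unconfirmed, I haven't dug into the Spark source) is that the thread-local active SparkSession isn't set on the thread doing the planning. A rough illustration of that failure mode, not the actual Spark internals:
```
// Illustration only: SparkSession.getActiveSession is thread-local in Spark 2.x,
// so a thread that never registered an active session sees None.
import org.apache.spark.sql.SparkSession

val active: Option[SparkSession] = SparkSession.getActiveSession
// active.get  // would throw java.util.NoSuchElementException: None.get when unset
val session = active.getOrElse(SparkSession.builder().getOrCreate())  // safe fallback
```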


> java.util.NoSuchElementException: None.get at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:343)
> ------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16599
>                 URL: https://issues.apache.org/jira/browse/SPARK-16599
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0
>         Environment: CentOS 6.7, Spark 2.0
>            Reporter: binde
>
> Run a Spark job with Spark 2.0; error message:
> Job aborted due to stage failure: Task 0 in stage 821.0 failed 4 times, most recent failure: Lost task 0.3 in stage 821.0 (TID 1480, e103): java.util.NoSuchElementException: None.get
> 	at scala.None$.get(Option.scala:347)
> 	at scala.None$.get(Option.scala:345)
> 	at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:343)
> 	at org.apache.spark.storage.BlockManager.releaseAllLocksForTask(BlockManager.scala:644)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:281)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)


