Posted to issues@spark.apache.org by "Sanjay Dasgupta (JIRA)" <ji...@apache.org> on 2016/06/15 14:49:09 UTC
[jira] [Created] (SPARK-15964) Assignment to RDD-typed val fails
Sanjay Dasgupta created SPARK-15964:
---------------------------------------
Summary: Assignment to RDD-typed val fails
Key: SPARK-15964
URL: https://issues.apache.org/jira/browse/SPARK-15964
Project: Spark
Issue Type: Bug
Affects Versions: 2.0.0
Environment: Notebook on Databricks Community-Edition
Spark-2.0 preview
Google Chrome Browser
Linux Ubuntu 14.04 LTS
Reporter: Sanjay Dasgupta
An unusual assignment error occurs, with the following (seemingly contradictory) error message:
found : org.apache.spark.rdd.RDD[Name]
required : org.apache.spark.rdd.RDD[Name]
This occurs when the assignment is attempted in a cell different from the one in which the right-hand-side value is defined, as in the following example:
// CELL-1
import org.apache.spark.sql.Dataset
import org.apache.spark.rdd.RDD
import spark.implicits._ // needed for toDF and as[Name]; pre-imported in Databricks notebooks
case class Name(number: Int, name: String)
val names = Seq(Name(1, "one"), Name(2, "two"), Name(3, "three"), Name(4, "four"))
val dataset: Dataset[Name] = spark.sparkContext.parallelize(names).toDF.as[Name]
// CELL-2
// Error reported here ...
val dataRdd: RDD[Name] = dataset.rdd
The error is reported in CELL-2.
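This looks like the known REPL wrapper-class behavior: each notebook cell is compiled into its own synthetic wrapper object, so the `Name` referenced in CELL-2 can resolve to a different compiled class than the `Name` captured by the `Dataset` in CELL-1, producing a "found X / required X" message where both types print identically. A minimal sketch of the usual workaround (an assumption on my part, not part of the original report) is to keep the case class definition and every use of it in the same cell:

```scala
// Workaround sketch: define the case class and perform the assignment in
// ONE cell, so both sides of the assignment see the same compiled Name class.
import org.apache.spark.sql.Dataset
import org.apache.spark.rdd.RDD
import spark.implicits._ // pre-imported in Databricks notebooks

case class Name(number: Int, name: String)

val names = Seq(Name(1, "one"), Name(2, "two"), Name(3, "three"), Name(4, "four"))
val dataset: Dataset[Name] = spark.sparkContext.parallelize(names).toDF.as[Name]

// Same cell as the definition: no cross-cell type mismatch here
val dataRdd: RDD[Name] = dataset.rdd
```

Alternatively, moving `Name` out of the notebook into a compiled library attached to the cluster gives a single stable class for all cells to reference.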