Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/06/30 01:28:00 UTC

[jira] [Comment Edited] (SPARK-21246) Unexpected Data Type conversion from LONG to BIGINT

    [ https://issues.apache.org/jira/browse/SPARK-21246?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16069281#comment-16069281 ] 

Hyukjin Kwon edited comment on SPARK-21246 at 6/30/17 1:27 AM:
---------------------------------------------------------------

Of course, it follows the schema the user specified.

{code}
scala> peopleDF.schema == schema
res9: Boolean = true
{code}

and it throws an exception when the schema is mismatched. At the very least, the schema being the same as the one the user specified is not an issue here. I am resolving this.
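
For reference, a minimal sketch (not part of the original comment; it assumes a Spark shell with the Spark SQL types imported) showing that "bigint" is simply the SQL name Spark uses when it renders LongType, so no actual conversion of the data type takes place:

{code}
// Minimal sketch (assumption: run in a Spark shell or Zeppelin paragraph).
import org.apache.spark.sql.types._

// "bigint" is just the SQL name used when Spark renders LongType,
// e.g. in shell output such as "DataFrame = [name: bigint]".
LongType.simpleString   // "bigint"

val schema = StructType(Seq(StructField("name", LongType, nullable = true)))
schema.simpleString     // "struct<name:bigint>"
{code}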



> Unexpected Data Type conversion from LONG to BIGINT
> ---------------------------------------------------
>
>                 Key: SPARK-21246
>                 URL: https://issues.apache.org/jira/browse/SPARK-21246
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>         Environment: Using Zeppelin Notebook or Spark Shell
>            Reporter: Monica Raj
>
> The unexpected conversion occurred when creating a DataFrame out of an existing data collection. The following code can be run in a Zeppelin notebook to reproduce the bug:
> {code}
> import org.apache.spark.sql.types._
> import org.apache.spark.sql.Row
>
> val schemaString = "name"
> val lstVals = Seq(3)
> val rowRdd = sc.parallelize(lstVals).map(x => Row(x))
> rowRdd.collect()
>
> // Generate the schema based on the string of schema
> val fields = schemaString.split(" ")
>   .map(fieldName => StructField(fieldName, LongType, nullable = true))
> val schema = StructType(fields)
> print(schema)
>
> val peopleDF = sqlContext.createDataFrame(rowRdd, schema)
> {code}
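
A minimal follow-up sketch (not part of the original report; it assumes the snippet above has been run in the same shell, so peopleDF and schema are in scope) checking that the column keeps the user-specified LongType and that only its rendered SQL name is "bigint":

{code}
// Assumes peopleDF and schema from the repro above are in scope.
peopleDF.schema == schema                      // true, as noted in the comment above
peopleDF.schema("name").dataType               // LongType
peopleDF.schema("name").dataType.simpleString  // "bigint" (only the rendered name differs)
{code}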



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org