Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/09/26 05:34:20 UTC
[jira] [Created] (SPARK-17665) SparkR supports options in other types consistently with other APIs
Hyukjin Kwon created SPARK-17665:
------------------------------------
Summary: SparkR supports options in other types consistently with other APIs
Key: SPARK-17665
URL: https://issues.apache.org/jira/browse/SPARK-17665
Project: Spark
Issue Type: Improvement
Reporter: Hyukjin Kwon
Priority: Minor
Currently, SparkR only supports string values for options in some APIs such as `read.df`/`write.df`.
It'd be great if they supported other types consistently with the Python/Scala/Java/SQL APIs:
- Python accepts any type but converts the values to strings (a similar conversion on the R side is sketched after this list)
- Scala/Java/SQL accept Long/Boolean/String/Double
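A minimal sketch of what such a conversion could look like on the R side, mirroring what Python does. The helper name `coerceOptionsToString` is purely illustrative and not an existing SparkR function:
{code}
# Hypothetical helper: coerce logical/numeric option values into the string
# form the JVM data sources expect, before handing them over to Java.
coerceOptionsToString <- function(options) {
  lapply(options, function(v) {
    if (is.logical(v)) {
      tolower(as.character(v))   # TRUE -> "true", FALSE -> "false"
    } else {
      as.character(v)            # 1 -> "1", 0.1 -> "0.1", strings unchanged
    }
  })
}

coerceOptionsToString(list(inferSchema = FALSE, maxColumns = 10))
# $inferSchema
# [1] "false"
#
# $maxColumns
# [1] "10"
{code}
Coercing on the R side keeps the JVM-facing option map string-only, which is effectively what the Python API does today.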
Currently,
{code}
> read.df("text.json", "csv", inferSchema=FALSE)
{code}
throws an exception, as shown below:
{code}
Error in value[[3L]](cond) :
Error in invokeJava(isStatic = TRUE, className, methodName, ...): java.lang.ClassCastException: java.lang.Boolean cannot be cast to java.lang.String
  at org.apache.spark.sql.internal.SessionState$$anonfun$newHadoopConfWithOptions$1.apply(SessionState.scala:59)
  at org.apache.spark.sql.internal.SessionState$$anonfun$newHadoopConfWithOptions$1.apply(SessionState.scala:59)
  at scala.collection.immutable.Map$Map3.foreach(Map.scala:161)
  at org.apache.spark.sql.internal.SessionState.newHadoopConfWithOptions(SessionState.scala:59)
  at org.apache.spark.sql.execution.datasources.PartitioningAwareFileCatalog.<init>(PartitioningAwareFileCatalog.scala:45)
  at org.apache.spark.sql.execution.datasources.ListingFileCatalog.<init>(ListingFileCatalog.scala:45)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:401)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
  at org.apache.spark.sql.DataFrameReader.lo
{code}
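In the meantime, passing the option as a string already works, since string options are supported today (the file and source below simply mirror the failing example above):
{code}
> read.df("text.json", "csv", inferSchema = "false")
{code}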