Posted to issues@spark.apache.org by "ABHISHEK KUMAR GUPTA (JIRA)" <ji...@apache.org> on 2018/04/24 08:59:00 UTC
[jira] [Created] (SPARK-24064) [Spark SQL] Create table using csv does not support binary column Type
ABHISHEK KUMAR GUPTA created SPARK-24064:
--------------------------------------------
Summary: [Spark SQL] Create table using csv does not support binary column Type
Key: SPARK-24064
URL: https://issues.apache.org/jira/browse/SPARK-24064
Project: Spark
Issue Type: Bug
Components: Deploy
Affects Versions: 2.3.0
Environment: OS Type: Suse 11
Spark Version: 2.3.0
Reporter: ABHISHEK KUMAR GUPTA
Steps to reproduce:
1. Launch spark-sql --master yarn
2. create table csvTable (time timestamp, name string, isright boolean, datetoday date, num binary, height double, score float, decimaler decimal(10,0), id tinyint, age int, license bigint, length smallint) using CSV options (path "/user/datatmo/customer1.csv");
   Result: Table creation succeeds.
3. select * from csvTable;
   Result: Throws the exception below.
ERROR SparkSQLDriver:91 - Failed in [select * from csvtable]
java.lang.UnsupportedOperationException: CSV data source does not support binary data type.
at org.apache.spark.sql.execution.datasources.csv.CSVUtils$.org$apache$spark$sql$execution$datasources$csv$CSVUtils$$verifyType$1(CSVUtils.scala:127)
at org.apache.spark.sql.execution.datasources.csv.CSVUtils$$anonfun$verifySchema$1.apply(CSVUtils.scala:131)
at org.apache.spark.sql.execution.datasources.csv.CSVUtils$$anonfun$verifySchema$1.apply(CSVUtils.scala:131)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at org.apache.spark.sql.types.StructType.foreach(StructType.scala:99)
However, a normal (non-CSV) table supports the binary data type.
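The exception originates in CSVUtils.verifySchema, which rejects BinaryType because CSV is a purely textual format with no standard representation for raw bytes. As a workaround sketch (not part of the report, and with illustrative column names only), binary values can be base64-encoded into a string column before being written to CSV and decoded back to bytes on read:

```python
import base64
import csv
import io

# Illustrative rows: "num" holds raw bytes, which CSV cannot store directly.
rows = [
    {"name": "alice", "num": b"\x00\xffdata"},
    {"name": "bob", "num": b"hello"},
]

# Write: encode the binary column as base64 text so it is CSV-safe.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "num"])
writer.writeheader()
for row in rows:
    writer.writerow({"name": row["name"],
                     "num": base64.b64encode(row["num"]).decode("ascii")})

# Read: decode the base64 column to recover the original bytes.
buf.seek(0)
decoded = [(r["name"], base64.b64decode(r["num"]))
           for r in csv.DictReader(buf)]
```

In Spark SQL itself, the built-in base64 and unbase64 functions can perform the same encoding and decoding, so the CSV-backed table would declare the column as string rather than binary.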
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org