Posted to issues@spark.apache.org by "Senthil Kumar (Jira)" <ji...@apache.org> on 2021/10/13 09:19:00 UTC
[jira] [Created] (SPARK-36996) fixing "SQL column nullable setting not retained as part of spark read" issue
Senthil Kumar created SPARK-36996:
-------------------------------------
Summary: fixing "SQL column nullable setting not retained as part of spark read" issue
Key: SPARK-36996
URL: https://issues.apache.org/jira/browse/SPARK-36996
Project: Spark
Issue Type: Improvement
Components: SQL
Affects Versions: 3.1.2, 3.1.1, 3.1.0, 3.0.0
Reporter: Senthil Kumar
Columns declared NOT NULL in SQL do not retain their nullability when the table is read with Spark's jdbc format; every column comes back as nullable.
SQL :
------------
mysql> CREATE TABLE Persons(Id int NOT NULL, FirstName varchar(255), LastName varchar(255), Age int);
mysql> desc Persons;
+-----------+--------------+------+-----+---------+-------+
| Field     | Type         | Null | Key | Default | Extra |
+-----------+--------------+------+-----+---------+-------+
| Id        | int          | NO   |     | NULL    |       |
| FirstName | varchar(255) | YES  |     | NULL    |       |
| LastName  | varchar(255) | YES  |     | NULL    |       |
| Age       | int          | YES  |     | NULL    |       |
+-----------+--------------+------+-----+---------+-------+
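
For reference, the NOT NULL constraint is visible through plain JDBC metadata, so the information is available to the client. A minimal sketch, assuming the same connection details used in the spark-shell session below:

import java.sql.DriverManager

// Connect with the same URL/credentials as the Spark read below.
val conn = DriverManager.getConnection(
  "jdbc:mysql://localhost:3306/Test_DB", "root", "")
val cols = conn.getMetaData.getColumns(null, null, "Persons", null)
while (cols.next()) {
  // Prints IS_NULLABLE = "NO" for Id and "YES" for the rest,
  // matching the mysql DESC output above.
  println(cols.getString("COLUMN_NAME") + " -> " + cols.getString("IS_NULLABLE"))
}
conn.close()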
But in Spark, every column is reported as nullable:
=============
scala> val df = spark.read.format("jdbc").
     |   option("database", "Test_DB").
     |   option("user", "root").
     |   option("password", "").
     |   option("driver", "com.mysql.cj.jdbc.Driver").
     |   option("url", "jdbc:mysql://localhost:3306/Test_DB").
     |   option("dbtable", "Persons").
     |   load()
df: org.apache.spark.sql.DataFrame = [Id: int, FirstName: string ... 2 more fields]
scala> df.printSchema()
root
 |-- Id: integer (nullable = true)
 |-- FirstName: string (nullable = true)
 |-- LastName: string (nullable = true)
 |-- Age: integer (nullable = true)
=============
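
One possible workaround until this is fixed: rebuild the DataFrame against a schema that restores the NOT NULL flags. This is only a sketch under the assumption that the source table really enforces the constraint, since Spark will not re-validate the rows:

import org.apache.spark.sql.types._

// Schema matching Persons above, with Id marked non-nullable.
val enforced = StructType(Seq(
  StructField("Id", IntegerType, nullable = false),
  StructField("FirstName", StringType),
  StructField("LastName", StringType),
  StructField("Age", IntegerType)))

// Re-wrap the same rows under the corrected schema.
val dfNotNull = spark.createDataFrame(df.rdd, enforced)
dfNotNull.printSchema()   // Id: integer (nullable = false)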