Posted to issues@spark.apache.org by "Maryann Xue (JIRA)" <ji...@apache.org> on 2018/06/18 17:30:00 UTC

[jira] [Created] (SPARK-24583) Wrong schema type in InsertIntoDataSourceCommand

Maryann Xue created SPARK-24583:
-----------------------------------

             Summary: Wrong schema type in InsertIntoDataSourceCommand
                 Key: SPARK-24583
                 URL: https://issues.apache.org/jira/browse/SPARK-24583
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.3.0
            Reporter: Maryann Xue
             Fix For: 2.4.0


For a DataSource table whose schema contains a field with "nullable=false", when a user tries to insert a NULL value into that field, the input DataFrame yields an incorrect value or throws a NullPointerException. That happens because the schema nullability of the input relation is bluntly overridden with the destination schema by the code below in {{InsertIntoDataSourceCommand}}:
{code:java}
  override def run(sparkSession: SparkSession): Seq[Row] = {
    val relation = logicalRelation.relation.asInstanceOf[InsertableRelation]
    val data = Dataset.ofRows(sparkSession, query)
    // Apply the schema of the existing table to the new data.
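    // (Note: this discards the query output's nullability in favor of the
    // destination schema, which is the blunt override described above.)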
    val df = sparkSession.internalCreateDataFrame(data.queryExecution.toRdd, logicalRelation.schema)
    relation.insert(df, overwrite)

    // Re-cache all cached plans (including this relation itself, if it's cached) that refer to this
    // data source relation.
    sparkSession.sharedState.cacheManager.recacheByPlan(sparkSession, logicalRelation)

    Seq.empty[Row]
  }
{code}
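
For illustration, here is a minimal, hypothetical sketch of the failure mode (not from the ticket). It re-stamps a query's rows with a non-nullable destination schema via {{internalCreateDataFrame}}, just as the command above does. Since {{internalCreateDataFrame}} is {{private[sql]}}, the sketch assumes it is compiled inside the {{org.apache.spark.sql}} package; the object name, schema, and data are made up.
{code:java}
// Hypothetical repro sketch; must live in org.apache.spark.sql because
// internalCreateDataFrame is private[sql].
package org.apache.spark.sql

import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

object Spark24583Sketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("SPARK-24583").getOrCreate()
    import spark.implicits._

    // The query output really does contain a NULL in column "i".
    val data = Seq[(java.lang.Integer, String)]((1, "a"), (null, "b")).toDF("i", "s")

    // Destination table schema that declares "i" non-nullable.
    val destSchema = StructType(Seq(
      StructField("i", IntegerType, nullable = false),
      StructField("s", StringType, nullable = true)))

    // Same pattern as InsertIntoDataSourceCommand: the destination schema is
    // stamped onto the rows with no validation of the NULL.
    val df = spark.internalCreateDataFrame(data.queryExecution.toRdd, destSchema)

    // Since the planner now assumes "i" is never null, null checks may be
    // elided: this can print a bogus value (e.g. 0) instead of null, or
    // throw a NullPointerException.
    df.show()

    spark.stop()
  }
}
{code}
Once the destination schema is stamped on, the optimizer is free to assume column "i" is never null, which is what makes both observed symptoms possible: generated code may skip the null check and read a garbage value, or a downstream operator may hit a NullPointerException.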


