Posted to issues@spark.apache.org by "jalendhar Baddam (JIRA)" <ji...@apache.org> on 2017/10/04 04:39:02 UTC
[jira] [Updated] (SPARK-21951) Unable to add the new column and writing into the Hive using spark
[ https://issues.apache.org/jira/browse/SPARK-21951?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
jalendhar Baddam updated SPARK-21951:
-------------------------------------
Issue Type: Question (was: Bug)
> Unable to add the new column and writing into the Hive using spark
> ------------------------------------------------------------------
>
> Key: SPARK-21951
> URL: https://issues.apache.org/jira/browse/SPARK-21951
> Project: Spark
> Issue Type: Question
> Components: Java API
> Affects Versions: 2.1.1
> Reporter: jalendhar Baddam
>
> I am creating a new column on an existing Dataset and am unable to write it into Hive using Spark.
> Ex: Dataset<Row> ds = spark.sql("select * from Table");
> ds = ds.withColumn("newColumn", newColumnValues);
> ds.write().mode(SaveMode.Overwrite).format("parquet").saveAsTable("Table"); // Here I am getting the exception
> I am loading the table from Hive using Spark, adding a new column to that Dataset, and writing the same table back into Hive with the "Overwrite" option.
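A likely cause (an assumption, since the reporter does not paste the exception text) is that Spark refuses to overwrite a table it is still lazily reading from, raising an AnalysisException along the lines of "Cannot overwrite a table that is also being read from". A common workaround is to materialize the result into a staging table first and then overwrite the original. A minimal sketch, assuming a Hive-enabled session and using lit(...) as a stand-in for the question's newColumnValues expression:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.lit;

public class AddColumnOverwrite {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("add-column-overwrite")
                .enableHiveSupport()
                .getOrCreate();

        Dataset<Row> ds = spark.sql("select * from Table");
        // lit("value") is a placeholder for whatever Column expression
        // the reporter's newColumnValues actually is.
        ds = ds.withColumn("newColumn", lit("value"));

        // Writing straight back to "Table" can fail because Spark is
        // still reading from it. Materialize into a staging table first,
        // then overwrite the original from the staging copy.
        ds.write().mode(SaveMode.Overwrite).format("parquet")
                .saveAsTable("Table_tmp");
        spark.table("Table_tmp").write().mode(SaveMode.Overwrite)
                .format("parquet").saveAsTable("Table");
        spark.sql("drop table if exists Table_tmp");

        spark.stop();
    }
}
```

The table name "Table_tmp" is hypothetical; any staging name works. Checkpointing or persisting the Dataset before the write is an alternative way to break the read/write cycle.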
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org