Posted to issues@spark.apache.org by "Carefree (Jira)" <ji...@apache.org> on 2021/03/23 03:22:00 UTC
[jira] [Created] (SPARK-34831) spark2.3 can't add column in carbondata table
Carefree created SPARK-34831:
--------------------------------
Summary: spark2.3 can't add column in carbondata table
Key: SPARK-34831
URL: https://issues.apache.org/jira/browse/SPARK-34831
Project: Spark
Issue Type: Bug
Components: Spark Shell
Affects Versions: 3.1.0
Environment: spark2.3 carbondata1.5.3 cdh5.16
Reporter: Carefree
When I try to add a column to a CarbonData table with Spark 2.3, an error occurs. Here is the detail of the failure:
{code:java}
ALTER ADD COLUMNS does not support datasource table with type org.apache.spark.sql.CarbonSource
at org.apache.spark.sql.execution.command.AlterTableAddColumnsCommand.verifyAlterTableAddColumn(tables.scala:242)
at org.apache.spark.sql.execution.command.AlterTableAddColumnsCommand.run(tables.scala:194)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3259)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3258)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:190)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:75)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
{code}
So I hope this can be supported in the latest version. Thanks.
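For reference, a minimal statement that triggers this error (table and column names are hypothetical) looks like:

{code:sql}
-- Assume my_table was created with the CarbonData datasource, e.g.:
-- CREATE TABLE my_table (id INT, name STRING) USING org.apache.spark.sql.CarbonSource
ALTER TABLE my_table ADD COLUMNS (new_col STRING);
-- Spark's AlterTableAddColumnsCommand rejects this because CarbonSource is not
-- among the datasource providers it accepts for ALTER ADD COLUMNS.
{code}

Note that CarbonData's own documentation describes ALTER TABLE ADD COLUMNS support when the SQL is issued through a CarbonSession rather than a plain SparkSession, so whether this fails may depend on how the session was created.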
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org