Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2016/08/19 20:17:20 UTC

[jira] [Commented] (SPARK-13342) Cannot run INSERT statements in Spark

    [ https://issues.apache.org/jira/browse/SPARK-13342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15428753#comment-15428753 ] 

Dongjoon Hyun commented on SPARK-13342:
---------------------------------------

Hi, All.
Just to bring this issue up to date, the following is the result on Spark 2.0.0:
{code}
scala> sql("create table x(a int)")
scala> sql("select * from x").show
+---+
|  a|
+---+
+---+
scala> sql("insert into x values 1")
scala> sql("select * from x").show
+---+
|  a|
+---+
|  1|
+---+
{code}

> Cannot run INSERT statements in Spark
> -------------------------------------
>
>                 Key: SPARK-13342
>                 URL: https://issues.apache.org/jira/browse/SPARK-13342
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.1, 1.6.0
>            Reporter: neo
>
> I cannot run an INSERT statement using spark-sql. I tried versions 1.5.1 and 1.6.0 without any luck, but the same statement runs fine in Hive.
> These are the steps I took.
> 1) Launch hive and create the table / insert a record.
> create database test;
> use test;
> CREATE TABLE stgTable
> (
> sno string,
> total bigint
> );
> INSERT INTO TABLE stgTable VALUES ('12',12);
> 2) Launch spark-sql (1.5.1 or 1.6.0)
> 3) Try inserting a record from the shell
> INSERT INTO table stgTable SELECT 'sno2',224 from stgTable limit 1
> I got this error message:
> "Invalid method name: 'alter_table_with_cascade'"
> I tried changing the Hive version inside the spark-sql shell with the SET command, from
> SET spark.sql.hive.version=1.2.1  (the default for my Spark installation)
> to
> SET spark.sql.hive.version=0.14.0
> but that did not help either.
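For anyone hitting the same error: the "alter_table_with_cascade" message is typical of a mismatch between Spark's built-in Hive metastore client and the actual metastore version, and spark.sql.hive.version only reports the Hive version Spark was built against, so SET-ting it in the shell has no effect. A possible workaround (a sketch, not verified against this reporter's setup) is to launch spark-sql with spark.sql.hive.metastore.version pointing at the metastore's real version:

{code}
# Hypothetical invocation: replace 0.14.0 with the version of the
# Hive metastore you are actually connecting to.
spark-sql \
  --conf spark.sql.hive.metastore.version=0.14.0 \
  --conf spark.sql.hive.metastore.jars=maven
{code}

With spark.sql.hive.metastore.jars=maven, Spark downloads the matching Hive client jars; pointing it at a local classpath of the correct jars should also work.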



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org