Posted to issues@spark.apache.org by "Lunen (JIRA)" <ji...@apache.org> on 2015/10/16 17:55:05 UTC

[jira] [Created] (SPARK-11148) Unable to create views

Lunen created SPARK-11148:
-----------------------------

             Summary: Unable to create views
                 Key: SPARK-11148
                 URL: https://issues.apache.org/jira/browse/SPARK-11148
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.5.1
         Environment: Ubuntu 14.04
Spark-1.5.1-bin-hadoop2.6
(I don't have Hadoop or Hive installed)
Start spark-all.sh and thriftserver with mysql jar driver
            Reporter: Lunen
            Priority: Critical


I am unable to create views within Spark SQL.
Creating tables without specifying the column names works, e.g.:

CREATE TABLE trade2 
USING org.apache.spark.sql.jdbc
OPTIONS ( 
url "jdbc:mysql://192.168.30.191:3318/?user=root", 
dbtable "database.trade", 
driver "com.mysql.jdbc.Driver" 
);

Creating tables with data types gives an error:

CREATE TABLE trade2( 
COL1 timestamp, 
COL2 STRING, 
COL3 STRING) 
USING org.apache.spark.sql.jdbc 
OPTIONS (
  url "jdbc:mysql://192.168.30.191:3318/?user=root",   
  dbtable "database.trade",   
  driver "com.mysql.jdbc.Driver" 
);
Error: org.apache.spark.sql.AnalysisException: org.apache.spark.sql.execution.datasources.jdbc.DefaultSource does not allow user-specified schemas.; SQLState: null ErrorCode: 0
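
Since the error says the JDBC source does not allow user-specified schemas, I assume the schema has to be inferred from the remote table instead. As a sketch (using the trade2 table created without a column list above), the columns Spark actually registered can be checked with:

-- Hedged sketch: list the columns Spark registered for the schema-less table above.
DESCRIBE trade2;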

Trying to create a VIEW from the table that was created (the SELECT statement below returns data):
CREATE VIEW viewtrade as Select Col1 from trade2;

Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException [Error 10004]: Line 1:30 Invalid table alias or column reference 'Col1': (possible column names are: col)
SQLState:  null
ErrorCode: 0
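
The error text suggests only a placeholder column is visible to the view's analyzer, so the following is just a hedged attempt (assuming the column was registered in lower case), not a confirmed workaround:

-- Hedged attempt: reference the column in lower case, and with backticks,
-- in case the identifier was lower-cased when the table was registered.
CREATE VIEW viewtrade AS SELECT col1 FROM trade2;
CREATE VIEW viewtrade2 AS SELECT `col1` FROM trade2;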


