Posted to issues@spark.apache.org by "sandeep katta (JIRA)" <ji...@apache.org> on 2018/06/22 05:24:00 UTC

[jira] [Commented] (SPARK-24082) [Spark SQL] Tables are not listing under DB

    [ https://issues.apache.org/jira/browse/SPARK-24082?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16520023#comment-16520023 ] 

sandeep katta commented on SPARK-24082:
---------------------------------------

In your case, both of the tables you created are external tables.

For an external table, the table location is the path specified by the user (in the OPTIONS clause), not a directory created under the database's warehouse directory.

So you cannot find these tables under /user/sparkhive/warehouse/one.db.
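As a quick check (a minimal sketch, using the table names and paths from the steps quoted below), you can look up each table's actual location with DESCRIBE FORMATTED and then list that path directly in HDFS:

    spark-sql> USE one;
    spark-sql> DESCRIBE FORMATTED csvtable;
    -- the "Location" row should show /user/datatmo/customer1.csv
    spark-sql> DESCRIBE FORMATTED parquettemp;
    -- the "Location" row should show /user/sparkhive/warehouse/one.db/test1.parquet

    # then list the Location reported above, for example:
    ./hdfs dfs -ls /user/datatmo/

The warehouse directory only gets a new subdirectory when Spark manages the table data itself (a managed table); for external tables the data stays wherever the path option points.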

> [Spark SQL] Tables are not listing under DB
> -------------------------------------------
>
>                 Key: SPARK-24082
>                 URL: https://issues.apache.org/jira/browse/SPARK-24082
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>         Environment: OS: Suse11
> Spark Version: 2.3
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Minor
>
> Steps:
>  # Launch spark-sql --master yarn
>  # use one; (the DB name is one)
>  # create table csvTable (time timestamp, name string, isright boolean, datetoday date, num binary, height double, score float, decimaler decimal(10,0), id tinyint, age int, license bigint, length smallint) using CSV options (path "/user/datatmo/customer1.csv");
>  # CREATE TABLE Parquettemp USING org.apache.spark.sql.parquet OPTIONS (path "/user/sparkhive/warehouse/one.db/test1.parquet/");
>  # show tables;
> The tables are listed as shown below:
> one csvtable false
> one parquettemp false
> But when listing the contents of "one.db" in the HDFS file system, the csvTable and Parquettemp tables are not shown.
> The following command was used:
> BLR1000023111:/opt/Antsecure/install/hadoop/namenode/bin # ./hdfs dfs -ls /user/sparkhive/warehouse/one.db
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org