Posted to issues@hive.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2021/10/28 16:59:00 UTC

[jira] [Work logged] (HIVE-25591) CREATE EXTERNAL TABLE fails for JDBC tables stored in non-default schema

     [ https://issues.apache.org/jira/browse/HIVE-25591?focusedWorklogId=671549&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-671549 ]

ASF GitHub Bot logged work on HIVE-25591:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 28/Oct/21 16:58
            Start Date: 28/Oct/21 16:58
    Worklog Time Spent: 10m 
      Work Description: zabetak opened a new pull request #2759:
URL: https://github.com/apache/hive/pull/2759


   The tests rely on HIVE-25594, for which there is a separate pull request (https://github.com/apache/hive/pull/2742). Please do not review https://github.com/apache/hive/commit/cb3026b4db9454c12d5376c71a28eb34b35d783d here; if you have remarks, please comment on https://github.com/apache/hive/pull/2742 instead.
   
   ### What changes were proposed in this pull request?
   1. Remove the getOriQueryToExecute method in favor of getQueryToExecute.
   2. Move the getQueryToExecute method into GenericJdbcDatabaseAccessor to improve encapsulation, since the method is only used in this class.
   3. Include hive.sql.schema, if available, when generating the SQL query (see the sketch after this list).
   4. Add tests/usage samples of the hive.sql.schema property for different DBMSs.
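   
   As a rough sketch, for the example table in the Jira description below, change 3 means the table reference becomes schema-qualified whenever `hive.sql.schema` is set (the exact shape of the query generated by the handler may differ):
   ```sql
   -- Without hive.sql.schema the table reference is unqualified and the metadata lookup fails:
   SELECT * FROM country;
   -- With "hive.sql.schema" = "world" the reference is qualified with the schema:
   SELECT * FROM world.country;
   ```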
   
   ### Why are the changes needed?
   1. Avoid failures when the table is in a non-default schema.
   2. Demonstrate how hive.sql.schema can be used with different DBMSs.
   3. Minor encapsulation improvement.
   
   ### Does this PR introduce _any_ user-facing change?
   Yes. CREATE EXTERNAL TABLE no longer fails for JDBC tables stored in a non-default schema.
   
   ### How was this patch tested?
   `mvn test -Dtest=TestMiniLlapLocalCliDriver -Dqfile_regex="jdbc_table_with_schema.*" -Dtest.output.overwrite`


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: gitbox-unsubscribe@hive.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


Issue Time Tracking
-------------------

            Worklog Id:     (was: 671549)
    Remaining Estimate: 0h
            Time Spent: 10m

> CREATE EXTERNAL TABLE fails for JDBC tables stored in non-default schema
> ------------------------------------------------------------------------
>
>                 Key: HIVE-25591
>                 URL: https://issues.apache.org/jira/browse/HIVE-25591
>             Project: Hive
>          Issue Type: Bug
>          Components: Query Planning
>            Reporter: Stamatis Zampetakis
>            Assignee: Stamatis Zampetakis
>            Priority: Major
>          Time Spent: 10m
>  Remaining Estimate: 0h
>
> Consider the following use case, where tables reside in a user-defined schema in a JDBC-compliant database:
> +Postgres+
> {code:sql}
> create schema world;
> create table if not exists world.country (name varchar(80) not null);
> insert into world.country (name) values ('India');
> insert into world.country (name) values ('Russia');
> insert into world.country (name) values ('USA');
> {code}
> The following DDL statement in Hive fails:
> +Hive+
> {code:sql}
> CREATE EXTERNAL TABLE country (name varchar(80))
> STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
> TBLPROPERTIES (
> "hive.sql.database.type" = "POSTGRES",
> "hive.sql.jdbc.driver" = "org.postgresql.Driver",
> "hive.sql.jdbc.url" = "jdbc:postgresql://localhost:5432/test",
> "hive.sql.dbcp.username" = "user",
> "hive.sql.dbcp.password" = "pwd",
> "hive.sql.schema" = "world",
> "hive.sql.table" = "country");
> {code}
> The exception is the following:
> {noformat}
> org.postgresql.util.PSQLException: ERROR: relation "country" does not exist
>   Position: 15
> 	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2532) ~[postgresql-42.2.14.jar:42.2.14]
> 	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2267) ~[postgresql-42.2.14.jar:42.2.14]
> 	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:312) ~[postgresql-42.2.14.jar:42.2.14]
> 	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:448) ~[postgresql-42.2.14.jar:42.2.14]
> 	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:369) ~[postgresql-42.2.14.jar:42.2.14]
> 	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:153) ~[postgresql-42.2.14.jar:42.2.14]
> 	at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:103) ~[postgresql-42.2.14.jar:42.2.14]
> 	at org.apache.commons.dbcp2.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:122) ~[commons-dbcp2-2.7.0.jar:2.7.0]
> 	at org.apache.commons.dbcp2.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:122) ~[commons-dbcp2-2.7.0.jar:2.7.0]
> 	at org.apache.hive.storage.jdbc.dao.GenericJdbcDatabaseAccessor.getColumnNames(GenericJdbcDatabaseAccessor.java:83) [hive-jdbc-handler-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hive.storage.jdbc.JdbcSerDe.initialize(JdbcSerDe.java:98) [hive-jdbc-handler-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:95) [hive-metastore-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:78) [hive-metastore-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:342) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:324) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.metadata.Table.getColsInternal(Table.java:734) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:717) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.ddl.table.create.CreateTableDesc.toTable(CreateTableDesc.java:933) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.ddl.table.create.CreateTableOperation.execute(CreateTableOperation.java:59) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.ddl.DDLTask.execute(DDLTask.java:84) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:212) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:105) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.Executor.launchTask(Executor.java:361) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.Executor.launchTasks(Executor.java:334) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.Executor.runTasks(Executor.java:245) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.Executor.execute(Executor.java:108) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:348) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> {noformat}
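> The error indicates that the metadata query sent by the storage handler references the table without the schema qualifier. For reference, the relation resolves fine in Postgres once the name is qualified or the schema is added to the search_path (illustrative queries; the exact statement the handler runs may differ):
> {code:sql}
> SELECT name FROM country;        -- ERROR: relation "country" does not exist
> SELECT name FROM world.country;  -- OK
> SET search_path TO world;
> SELECT name FROM country;        -- OK
> {code}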



--
This message was sent by Atlassian Jira
(v8.3.4#803005)