Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/01/19 12:22:26 UTC
[jira] [Resolved] (SPARK-19059) Unable to retrieve data from a parquet table whose name starts with underscore
[ https://issues.apache.org/jira/browse/SPARK-19059?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-19059.
---------------------------------
Resolution: Fixed
Assignee: Jayadevan M
Fix Version/s: 2.2.0
> Unable to retrieve data from a parquet table whose name starts with underscore
> ------------------------------------------------------------------------------
>
> Key: SPARK-19059
> URL: https://issues.apache.org/jira/browse/SPARK-19059
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.1.0
> Reporter: Giambattista
> Assignee: Jayadevan M
> Fix For: 2.2.0
>
>
> It looks like a bug introduced in Spark 2.1.0 prevents reading data from a parquet table (with Hive support enabled) whose name starts with an underscore. CREATE and INSERT statements on the same table seem to work as expected.
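> A plausible cause (an assumption, not verified against the Spark source): Spark's file listing skips `_`-prefixed names as hidden metadata files (like `_SUCCESS` and `_metadata`), and the table's own directory name happens to match that pattern. A minimal sketch of such a filter, with hypothetical naming:
> scala> // Hypothetical sketch, not the actual Spark 2.1.0 code; partition
> scala> // directories such as `x=1` must survive, hence the `=` check.
> scala> def isHiddenPath(name: String): Boolean =
>      |   name.startsWith(".") || (name.startsWith("_") && !name.contains("="))
> scala> isHiddenPath("_a")  // true -- a table directory named `_a` would be skipped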
> The problem can be reproduced in spark-shell through the following steps:
> 1) Create a table with some values
> scala> spark.sql("CREATE TABLE `_a`(i INT) USING parquet").show
> scala> spark.sql("INSERT INTO `_a` VALUES (1), (2), (3)").show
> 2) Select data from the just-created and populated table --> no results (a disk-level check follows the empty output below)
> scala> spark.sql("SELECT * FROM `_a`").show
> +---+
> | i|
> +---+
> +---+
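> The data itself is written correctly; a quick check (the warehouse location is an assumption about a default local setup) shows parquet files under the table directory, so the problem is on the read path:
> scala> // Hypothetical check: list the table directory under the configured warehouse
> scala> import java.io.File
> scala> val wh = spark.conf.get("spark.sql.warehouse.dir").stripPrefix("file:")
> scala> new File(wh, "_a").listFiles.foreach(println)  // part-*.parquet files are present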
> 3) Rename the table so that the leading underscore disappears
> scala> spark.sql("ALTER TABLE `_a` RENAME TO `a`").show
> 4) Select data from the just-renamed table --> results are shown
> scala> spark.sql("SELECT * FROM `a`").show
> +---+
> | i|
> +---+
> | 1|
> | 2|
> | 3|
> +---+
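> Until a fixed release, renaming the table as in step 3 works around the problem. The same lookup failure shows up through the DataFrame API as well (a minimal sketch, assuming a Hive-enabled SparkSession):
> scala> spark.table("_a").count()  // before the rename: returns 0 on Spark 2.1.0, although 3 rows were inserted
> scala> spark.table("a").count()   // after the rename: returns 3 once the name no longer starts with `_`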