Posted to users@zeppelin.apache.org by Daniel Haviv <da...@gmail.com> on 2017/02/07 12:33:10 UTC

Hive support is required to select over the following tables:

Hi,
I'm creating a table using the %sql context in the following manner:
create table wikistatnew2
(projectcode string,
pagename string,
pageviews int,
pagesize int)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '
LOCATION 's3://....'

but when I try to select from it, I'm getting the following error:
org.apache.spark.sql.AnalysisException:
Hive support is required to select over the following tables:
`default`.`wikistatnew2`
;;
'Project [*]
+- 'SubqueryAlias wikistatnew2
   +- 'SimpleCatalogRelation default, CatalogTable(
Table: `default`.`wikistatnew2`
Created: Tue Feb 07 11:47:59 UTC 2017
Last Access: Wed Dec 31 23:59:59 UTC 1969
Type: EXTERNAL
Schema: [StructField(projectcode,StringType,true),
StructField(pagename,StringType,true),
StructField(pageviews,IntegerType,true),
StructField(pagesize,IntegerType,true)]
Provider: hive

It is unclear to me why this happens.
The interpreter's zeppelin.spark.useHiveContext property is set to true, and the same table can be queried fine from spark-shell.
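For reference, this is roughly how I verify Hive support on the spark-shell side (a minimal sketch, assuming Spark 2.x; the catalogImplementation check below is a standard Spark setting, nothing Zeppelin-specific):

```scala
// Minimal sketch (Spark 2.x): check whether the active session's catalog
// is Hive-backed. With Hive support enabled this prints "hive"; a plain
// session prints "in-memory", which produces exactly this AnalysisException
// when selecting from a Hive-format table.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .enableHiveSupport() // what useHiveContext should enable under the hood
  .getOrCreate()

println(spark.conf.get("spark.sql.catalogImplementation"))
```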


Any ideas?

Thank you,
Daniel