Posted to issues@spark.apache.org by "Marco Gaido (JIRA)" <ji...@apache.org> on 2018/02/21 11:54:00 UTC
[jira] [Resolved] (SPARK-23473) spark.catalog.listTables error when database name starts with a number
[ https://issues.apache.org/jira/browse/SPARK-23473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marco Gaido resolved SPARK-23473.
---------------------------------
Resolution: Invalid
> spark.catalog.listTables error when database name starts with a number
> ----------------------------------------------------------------------
>
> Key: SPARK-23473
> URL: https://issues.apache.org/jira/browse/SPARK-23473
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.1.0
> Reporter: Goun Na
> Priority: Trivial
> Attachments: spark_catalog_err.txt
>
>
> Errors occur when the Hive database name starts with a number, such as 11st.
> ------------------------------------------------------------------------------------------------------------------------------------
> scala> spark.catalog.setCurrentDatabase("11st")
> scala> spark.catalog.listTables
> scala> spark.catalog.listTables
> 18/02/21 15:47:44 ERROR log: error in initSerDe: java.lang.ClassNotFoundException Class org.apache.hadoop.hive.contrib.serde2.RegexSerDe not found
> java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.serde2.RegexSerDe not found
> at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
> at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:385)
> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
> at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
> at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$getTableOption$1$$anonfun$apply$10.apply(HiveClientImpl.scala:365)
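> The ClassNotFoundException above points at org.apache.hadoop.hive.contrib.serde2.RegexSerDe, which ships in the hive-contrib jar rather than in Spark itself, so a table in that database likely uses this SerDe while the jar is not on the classpath. As a hedged sketch (the jar path and version are assumptions, not from this report), the missing class can be supplied when launching the shell:
>
> # Assumed path/version; adjust to the local Hive installation.
> spark-shell --jars /path/to/hive-contrib-1.2.1.jar
>
> If listTables then succeeds, the failure is a classpath issue tied to the table's SerDe, not to the leading digit in the database name.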
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org