Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/11/07 17:54:50 UTC

[GitHub] [spark] sunchao commented on a change in pull request #34308: [SPARK-37035][SQL] Improve error message when use parquet vectorize reader

sunchao commented on a change in pull request #34308:
URL: https://github.com/apache/spark/pull/34308#discussion_r744292413



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
##########
@@ -593,6 +593,16 @@ object QueryExecutionErrors {
     new QueryExecutionException(message, e)
   }
 
+  def unsupportedParquetDictionaryDecodingError(
+      valueType: String,
+      dictionary: String,

Review comment:
       Yes, it's a bit strange. The modules are organized as follows:
   
   - **Catalyst (sql/catalyst)** - An implementation-agnostic framework for manipulating trees of relational operators and expressions.
    - **Execution (sql/core)** - A query planner / execution engine for translating Catalyst’s logical query plans into Spark RDDs. This component also includes a new public interface, SQLContext, that allows users to execute SQL or structured Scala queries against existing RDDs and Parquet files.
    - **Hive Metastore Support (sql/hive)** - An extension of SQLContext called HiveContext that allows users to write queries using a subset of HiveQL and access data from a Hive Metastore using Hive SerDes. There are also wrappers that allow users to run queries that include Hive UDFs, UDAFs, and UDTFs.
   
    So it seems `QueryExecutionErrors` is better suited to `sql/core`.
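
    For context, here is a minimal sketch of the error-factory pattern such an object follows (hypothetical names, not the actual Spark code): each method formats the user-facing message in one place and returns an exception for the call site to throw.

    ```scala
    // Hypothetical sketch of a centralized error-factory object (illustrative
    // names only). The message wording lives in one file; call sites stay small.
    object ExampleExecutionErrors {
      def unsupportedValueTypeError(valueType: String, context: String): Throwable = {
        new UnsupportedOperationException(
          s"Unsupported value type $valueType while decoding $context. " +
            "Consider disabling the vectorized reader for this query.")
      }
    }

    // Usage at the point of failure:
    // throw ExampleExecutionErrors.unsupportedValueTypeError("INT96", "Parquet dictionary")
    ```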




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


