Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2017/10/11 02:08:03 UTC
[jira] [Commented] (SPARK-21686) spark.sql.hive.convertMetastoreOrc is causing NullPointerException while reading ORC tables
[ https://issues.apache.org/jira/browse/SPARK-21686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16199695#comment-16199695 ]
Dongjoon Hyun commented on SPARK-21686:
---------------------------------------
Hi, [~emattosHWX] and [~viirya].
This is a duplicate of SPARK-18355, which happens when Hive adds a new column after the ORC file is created.
I will close this issue.
> spark.sql.hive.convertMetastoreOrc is causing NullPointerException while reading ORC tables
> -------------------------------------------------------------------------------------------
>
> Key: SPARK-21686
> URL: https://issues.apache.org/jira/browse/SPARK-21686
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 1.6.1
> Environment: spark_2_4_2_0_258-1.6.1.2.4.2.0-258.el6.noarch
> spark_2_4_2_0_258-python-1.6.1.2.4.2.0-258.el6.noarch
> spark_2_4_2_0_258-yarn-shuffle-1.6.1.2.4.2.0-258.el6.noarch
> RHEL-7 (64-Bit)
> JDK 1.8
> Reporter: Ernani Pereira de Mattos Junior
>
> The issue is very similar to SPARK-10304: a Spark query throws a NullPointerException.
> >>> sqlContext.sql('select * from core_next.spark_categorization').show(57)
> 17/06/19 11:26:54 ERROR Executor: Exception in task 2.0 in stage 21.0 (TID 48)
> java.lang.NullPointerException
> at org.apache.spark.sql.hive.HiveInspectors$class.unwrapperFor(HiveInspectors.scala:488)
> at org.apache.spark.sql.hive.orc.OrcTableScan.unwrapperFor(OrcRelation.scala:244)
> at org.apache.spark.sql.hive.orc.OrcTableScan$$anonfun$org$apache$spark$sql$hive$orc$OrcTableScan$$fillObject$1$$anonfun$6.apply(OrcRelation.scala:275)
> at org.apache.spark.sql.hive.orc.OrcTableScan$$anonfun$org$apache$spark$sql$hive$orc$OrcTableScan$$fillObject$1$$anonfun$6.apply(OrcRelation.scala:275)
> Turning off the ORC optimization resolved the issue:
> sqlContext.setConf("spark.sql.hive.convertMetastoreOrc", "false")
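> For context, the workaround above would be applied in a PySpark shell session before querying the table. This is a minimal sketch, not a tested fix: it assumes a Spark 1.x shell where `sqlContext` (a HiveContext) is predefined, and reuses the table name from the report.
>
> # Sketch, assuming a Spark 1.x PySpark shell with `sqlContext` predefined.
> # Disable the native ORC read path so Spark falls back to the Hive SerDe,
> # which tolerates the metastore/file schema mismatch.
> sqlContext.setConf("spark.sql.hive.convertMetastoreOrc", "false")
>
> # The query that previously threw NullPointerException is then retried.
> sqlContext.sql("select * from core_next.spark_categorization").show(57)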
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org