Posted to issues@spark.apache.org by "Randy Tidd (JIRA)" <ji...@apache.org> on 2017/09/28 22:36:00 UTC
[jira] [Commented] (SPARK-18660) Parquet complains "Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl "
[ https://issues.apache.org/jira/browse/SPARK-18660?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16185024#comment-16185024 ]
Randy Tidd commented on SPARK-18660:
------------------------------------
I am experiencing this problem now. However, it occurs at a time when Spark is writing the wrong number of rows to the Parquet files: my data set has 6 rows, but Spark writes 532. So I am not sure this is a harmless error. It is logged as an error by ParquetRecordReader but eventually surfaces as a warning in Spark.
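
(For anyone trying to observe the count mismatch: a minimal check, assuming a spark-shell session where `spark` is in scope. The path and schema below are made up for illustration and are not taken from the original report.)

    // Hypothetical sketch: write a 6-row DataFrame to Parquet and compare
    // the count on read-back. Path and column name are illustrative only.
    val df = spark.range(6).toDF("id")
    df.write.mode("overwrite").parquet("/tmp/spark-18660-check")
    val readBack = spark.read.parquet("/tmp/spark-18660-check")
    println(s"rows in memory: ${df.count()}, rows read back: ${readBack.count()}")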
> Parquet complains "Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl "
> --------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-18660
> URL: https://issues.apache.org/jira/browse/SPARK-18660
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Yin Huai
>
> Parquet record reader always complain "Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl". Looks like we always create TaskAttemptContextImpl (https://github.com/apache/spark/blob/2f7461f31331cfc37f6cfa3586b7bbefb3af5547/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala#L368). But, Parquet wants to use TaskInputOutputContext, which is a subclass of TaskAttemptContextImpl.