Posted to issues@spark.apache.org by "Paul Praet (JIRA)" <ji...@apache.org> on 2018/10/09 13:24:00 UTC

[jira] [Commented] (SPARK-18660) Parquet complains "Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl "

    [ https://issues.apache.org/jira/browse/SPARK-18660?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16643418#comment-16643418 ] 

Paul Praet commented on SPARK-18660:
------------------------------------

It's really polluting our logs. Any workaround?
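
A common mitigation, assuming Spark's default log4j 1.x backend and that the message is emitted under Parquet's logger hierarchy (older parquet-mr releases logged under the bare "parquet" namespace, and some routed through java.util.logging, which this will not catch):

    import org.apache.log4j.{Level, Logger}

    // Raise the threshold for Parquet's loggers so the repeated
    // "Can not initialize counter ..." warning is suppressed.
    Logger.getLogger("org.apache.parquet").setLevel(Level.ERROR)
    Logger.getLogger("parquet").setLevel(Level.ERROR)

The declarative equivalent is a line like "log4j.logger.org.apache.parquet=ERROR" in conf/log4j.properties.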

> Parquet complains "Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl "
> --------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-18660
>                 URL: https://issues.apache.org/jira/browse/SPARK-18660
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Yin Huai
>            Priority: Major
>
> The Parquet record reader always complains "Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl". It looks like Spark always creates a plain TaskAttemptContextImpl (https://github.com/apache/spark/blob/2f7461f31331cfc37f6cfa3586b7bbefb3af5547/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala#L368), but Parquet wants a TaskInputOutputContext, a sub-interface of TaskAttemptContext that TaskAttemptContextImpl does not implement.
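
To illustrate the mismatch, here is a minimal, self-contained sketch (the object name is made up; it assumes a Hadoop 2.x mapreduce client on the classpath and mirrors the instanceof check that triggers the warning):

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.mapreduce.{TaskAttemptID, TaskInputOutputContext}
    import org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl

    object CounterContextCheck {
      def main(args: Array[String]): Unit = {
        // Spark's Parquet read path hands Parquet a plain TaskAttemptContextImpl.
        val context = new TaskAttemptContextImpl(new Configuration(), new TaskAttemptID())

        // Parquet only wires up its counters when the context is a
        // TaskInputOutputContext; a bare TaskAttemptContextImpl is not one,
        // so the check fails and the "Can not initialize counter ..."
        // warning is logged instead.
        context match {
          case _: TaskInputOutputContext[_, _, _, _] =>
            println("counters available")
          case other =>
            println(s"counters unavailable: ${other.getClass.getName}")
        }
      }
    }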



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org