Posted to issues@spark.apache.org by "Bertrand Bossy (JIRA)" <ji...@apache.org> on 2016/04/21 17:44:25 UTC

[jira] [Created] (SPARK-14805) accumulator values are not escaped when written to event logs

Bertrand Bossy created SPARK-14805:
--------------------------------------

             Summary: accumulator values are not escaped when written to event logs
                 Key: SPARK-14805
                 URL: https://issues.apache.org/jira/browse/SPARK-14805
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.5.2
            Reporter: Bertrand Bossy
            Priority: Minor


Affects Spark History Server:

When a (named) accumulator value contains special characters (it is not clear exactly which one triggers it; likely candidates are the Unicode replacement character U+FFFD or the control character U+0017), the history server fails to read the history file (a reproduction sketch follows the stack trace below):
{code}
16/04/21 07:16:14 ERROR FsHistoryProvider: Exception encountered when attempting to load application log hdfs://foo/eventlogs/spark/72ddad06-5ba2-4221-ad2a-88d339f247b3-0159
java.nio.charset.MalformedInputException: Input length = 1
        at java.nio.charset.CoderResult.throwException(CoderResult.java:281)
        at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:339)
        at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
        at java.io.InputStreamReader.read(InputStreamReader.java:184)
        at java.io.BufferedReader.fill(BufferedReader.java:161)
        at java.io.BufferedReader.readLine(BufferedReader.java:324)
        at java.io.BufferedReader.readLine(BufferedReader.java:389)
        at scala.io.BufferedSource$BufferedLineIterator.hasNext(BufferedSource.scala:67)
        at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:55)
        at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$replay(FsHistoryProvider.scala:457)
        at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$10.apply(FsHistoryProvider.scala:292)
        at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$10.apply(FsHistoryProvider.scala:289)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
        at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
        at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$mergeApplicationListing(FsHistoryProvider.scala:289)
        at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$checkForLogs$1$$anon$2.run(FsHistoryProvider.scala:210)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
{code}
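
For reference, here is a minimal sketch of the kind of job that can produce such an event log. The accumulator name, the custom AccumulatorParam, and the sample characters are illustrative assumptions, not taken from the failing application:
{code}
// Sketch against the Spark 1.5.x API: a named string accumulator whose value
// contains U+FFFD and U+0017, with event logging enabled.
import org.apache.spark.{AccumulatorParam, SparkConf, SparkContext}

object Spark14805Repro {
  // Simple concatenating accumulator param, purely for illustration.
  implicit object StringAccumulatorParam extends AccumulatorParam[String] {
    def zero(initialValue: String): String = ""
    def addInPlace(s1: String, s2: String): String = s1 + s2
  }

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("SPARK-14805-repro")
      .set("spark.eventLog.enabled", "true")   // write an event log for the history server

    val sc = new SparkContext(conf)
    val acc = sc.accumulator("", "strings")    // named, so its value is written to the event log

    // Add a value containing the replacement character and a control character.
    sc.parallelize(1 to 10).foreach(_ => acc += "\uFFFD\u0017")

    sc.stop()                                  // the resulting log then fails to load as shown above
  }
}
{code}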

The problem does not occur if those strings are escaped with
{code}
org.apache.commons.lang3.StringEscapeUtils.escapeJson
{code}
as a workaround before they are added to the accumulator.

With this workaround, the example above contains the literal text "\uFFFD" instead of the raw replacement character and "\u0017" instead of the raw control character, and the history server is able to display the history of the application.
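
As a sketch, the workaround applied to the accumulator above looks like this (assuming the same string-valued accumulator as in the reproduction sketch; commons-lang3 needs to be on the application's classpath):
{code}
// Workaround sketch: escape the value before adding it to the accumulator,
// so the event log only ever contains JSON-safe ASCII.
import org.apache.commons.lang3.StringEscapeUtils

val raw  = "\uFFFD\u0017"                      // value with the problematic characters
val safe = StringEscapeUtils.escapeJson(raw)   // yields the literal text \uFFFD\u0017
acc += safe                                    // the accumulator now holds escaped text only
{code}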

I'm not exactly sure which character causes the parsing failure, but I suspect it is related to Unicode characters that should be escaped in JSON.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org