Posted to reviews@spark.apache.org by vanzin <gi...@git.apache.org> on 2018/01/03 21:09:01 UTC

[GitHub] spark pull request #19893: [SPARK-16139][TEST] Add logging functionality for...

Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19893#discussion_r159524447
  
    --- Diff: core/src/test/scala/org/apache/spark/ThreadAudit.scala ---
    @@ -0,0 +1,126 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark
    +
    +import scala.collection.JavaConverters._
    +
    +import org.apache.spark.internal.Logging
    +
    +/**
    + * Thread audit for test suites.
    + *
    + * Thread auditing normally happens automatically in [[SparkFunSuite]] when a new test suite is created.
    + * The only prerequisite is that the test class extends [[SparkFunSuite]].
    + *
    + * Some test suites perform initialization before [[SparkFunSuite#beforeAll]]
    + * is executed. In such cases the audit can be moved to another place in the call sequence.
    + *
    + * To run the audit at a custom point, the following can be done:
    + *
    + * class MyTestSuite extends SparkFunSuite {
    + *
    + *   override val doThreadAuditInSparkFunSuite = false
    + *
    + *   protected override def beforeAll(): Unit = {
    + *     doThreadPreAudit()
    + *     super.beforeAll()
    + *   }
    + *
    + *   protected override def afterAll(): Unit = {
    + *     super.afterAll()
    + *     doThreadPostAudit()
    + *   }
    + * }
    + */
    +trait ThreadAudit extends Logging {
    +
    +  val threadWhiteList = Set(
    +    /**
    +     * Netty related internal threads.
    +     * These are excluded because their lifecycle is handled by Netty itself
    +     * and Spark has no explicit control over them.
    +     */
    +    "netty.*",
    +
    +    /**
    +     * Netty related internal threads.
    +     * A single-threaded singleton EventExecutor inside Netty creates such threads.
    +     * These are excluded because their lifecycle is handled by Netty itself
    +     * and Spark has no explicit control over them.
    +     */
    +    "globalEventExecutor.*",
    +
    +    /**
    +     * Netty related internal threads.
    +     * Periodically checks whether a thread is alive and runs a task when the thread dies.
    +     * These are excluded because their lifecycle is handled by Netty itself
    +     * and Spark has no explicit control over them.
    +     */
    +    "threadDeathWatcher.*",
    +
    +    /**
    +     * During [[SparkContext]] creation [[org.apache.spark.rpc.netty.NettyRpcEnv]]
    +     * creates event loops. One is wrapped inside
    +     * [[org.apache.spark.network.server.TransportServer]],
    +     * the other one inside [[org.apache.spark.network.client.TransportClient]].
    +     * The thread pools behind them are shut down asynchronously, triggered by [[SparkContext#stop]].
    +     * Manually verified that all of them stop properly.
    +     */
    +    "rpc-client.*",
    +    "rpc-server.*",
    +
    +    /**
    +     * During [[SparkContext]] creation BlockManager
    +     * creates event loops. One is wrapped inside
    --- End diff --
    
    nit: line wrapped too early.
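
    For context on the mechanism under review: the whitelist above is a set of
    thread-name patterns such as `netty.*`. A minimal, hypothetical sketch of how
    such patterns could be matched against thread names is shown below; the object
    and method names are illustrative only and are not part of the PR.

    ```scala
    // Hypothetical sketch (not the PR's implementation): match thread names
    // against a whitelist of regex patterns like the ones in the diff above.
    object ThreadAuditSketch {
      val threadWhiteList: Set[String] =
        Set("netty.*", "globalEventExecutor.*", "threadDeathWatcher.*")

      // A thread is whitelisted if its name fully matches any pattern.
      def isWhitelisted(threadName: String): Boolean =
        threadWhiteList.exists(pattern => threadName.matches(pattern))

      def main(args: Array[String]): Unit = {
        println(isWhitelisted("netty-rpc-env-timeout")) // true
        println(isWhitelisted("my-leaked-thread"))      // false
      }
    }
    ```

    An audit would then snapshot the running thread names before and after a
    suite and report any new, non-whitelisted survivors.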


---
