Posted to dev@kyuubi.apache.org by GitBox <gi...@apache.org> on 2022/02/20 15:26:14 UTC

[GitHub] [incubator-kyuubi] turboFei edited a comment on issue #1796: [Bug] When the Spark application fails to submit, the session still waits for the timeout to exit

turboFei edited a comment on issue #1796:
URL: https://github.com/apache/incubator-kyuubi/issues/1796#issuecomment-1046261617


   I merged this PR and tested it in our test env.
   
   It seems that `process.destroyForcibly()` does not work.
   
   
   I had to find another temporary workaround: tag each launched process with a unique id, then locate and kill it via `ps`.
   
   ```
     // Requires: java.util.UUID
     // A unique id attached to every launched engine process, so that the
     // process can later be located in `ps -ef` output.
     protected val KYUUBI_PROC_UNIQUE_ID: String = "kyuubi.proc.uniqueId"
   
     protected lazy val procUniqueId = UUID.randomUUID().toString
   
     protected def procConf(): Map[String, String] = Map(KYUUBI_PROC_UNIQUE_ID -> procUniqueId)
   ```
   
   ```
       // Inject the unique id so the submitted process's command line carries it
       var allConf = conf.getAll ++ procConf()
   ```
   
   
   ```
     // Requires: java.io.{BufferedReader, InputStreamReader}
     // Fallback for when destroyForcibly() fails: scan `ps -ef` for the line
     // tagged with our unique id and kill that process.
     protected def killProcessByUniqueId(): Unit = {
       val psEf = Runtime.getRuntime.exec("ps -ef")
       val input = new BufferedReader(new InputStreamReader(psEf.getInputStream))
       try {
         val regex = s"$KYUUBI_PROC_UNIQUE_ID=$procUniqueId".r
         var psInfo = input.readLine()
         var found = false
         while (!found && psInfo != null) {
           regex.findFirstIn(psInfo) match {
             case Some(_) =>
               found = true
               // `ps -ef` columns: UID PID PPID ... — take the second field (PID)
               psInfo.trim.split("\\s+").drop(1).headOption.foreach { psId =>
                 Runtime.getRuntime.exec(s"kill -9 $psId")
               }
             case None =>
           }
           psInfo = input.readLine()
         }
       } finally {
         input.close()
       }
     }
   ```
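   
   The matching logic above can be exercised on its own. Below is a minimal, self-contained sketch of the PID-extraction step; the `matchPid` helper, the sample `ps -ef` line, and the object name are hypothetical (not part of the patch) and only mirror the regex-then-second-column logic of `killProcessByUniqueId`:
   
   ```scala
   import java.util.UUID
   
   object PidExtractExample {
     // Hypothetical helper: given one line of `ps -ef` output and the unique id,
     // return the PID column (second whitespace-separated field) when the line
     // carries the kyuubi.proc.uniqueId tag, otherwise None.
     def matchPid(psLine: String, uniqueId: String): Option[String] = {
       val regex = s"kyuubi.proc.uniqueId=$uniqueId".r
       regex.findFirstIn(psLine).flatMap { _ =>
         // `ps -ef` columns: UID PID PPID ... — take the second field (PID)
         psLine.trim.split("\\s+").drop(1).headOption
       }
     }
   
     def main(args: Array[String]): Unit = {
       val id = UUID.randomUUID().toString
       val line =
         s"hadoop  12345   1  0 10:00 ?  00:00:05 java -Dkyuubi.proc.uniqueId=$id org.apache.spark.deploy.SparkSubmit"
       println(matchPid(line, id))         // Some(12345)
       println(matchPid(line, "other-id")) // None
     }
   }
   ```
   
   Once the PID is extracted this way, the real method hands it to `kill -9`; keeping the parsing separate makes that step easy to test without spawning processes.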
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: dev-unsubscribe@kyuubi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org