Posted to issues@spark.apache.org by "vadim (JIRA)" <ji...@apache.org> on 2018/07/17 09:56:00 UTC

[jira] [Created] (SPARK-24830) Problem with logging on Glassfish

vadim created SPARK-24830:
-----------------------------

             Summary: Problem with logging on Glassfish
                 Key: SPARK-24830
                 URL: https://issues.apache.org/jira/browse/SPARK-24830
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.3.0
         Environment: Glassfish 4.1.1, 4.1.2, java 1.8.0_151-b12, Windows 10 or Linux-Fedora 20 
            Reporter: vadim


When the driver program runs inside a Java EE application (in yarn-client mode), none of the Glassfish server's log messages reach the Glassfish log once the Spark application has started. To see these messages, log4j must be configured in the application that embeds Spark so that messages go to a file (log4j.rootLogger=INFO, file). That makes it possible to see what is happening, but it is only a workaround. The logs of all other applications deployed on Glassfish are affected as well, even though those applications should run in isolated environments.
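A minimal sketch of the workaround described above, as a log4j 1.x properties file bundled with the application that embeds Spark (the file path, appender name, and rotation settings are illustrative assumptions, not taken from the report):

```properties
# Route the embedding application's log output to its own file, since
# messages stop reaching the Glassfish log once the Spark driver starts
# (the behavior reported in this issue).
log4j.rootLogger=INFO, file

# 'file' is a hypothetical appender name; adjust the path for your deployment.
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/path/to/spark-app.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1} - %m%n
```

With this in place the application's messages land in spark-app.log, but it does not restore logging for the other applications deployed on the same Glassfish instance.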

Note: this concerns the "usual" log messages emitted before and after the driver program receives its answers, not messages originating from the code of tasks executed on the workers.

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org