Posted to issues@spark.apache.org by "manohar (Jira)" <ji...@apache.org> on 2022/08/17 13:21:00 UTC

[jira] [Updated] (SPARK-40123) Security Vulnerability CVE-2018-11793 due to mesos-1.4.3-shaded-protobuf.jar

     [ https://issues.apache.org/jira/browse/SPARK-40123?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

manohar updated SPARK-40123:
----------------------------
     Flags: Patch
    Labels: security-issue  (was: )

> Security Vulnerability CVE-2018-11793 due to mesos-1.4.3-shaded-protobuf.jar
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-40123
>                 URL: https://issues.apache.org/jira/browse/SPARK-40123
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 3.3.0
>            Reporter: manohar
>            Priority: Major
>              Labels: security-issue
>             Fix For: 3.3.1
>
>
> Hello Team,
> We are hitting this vulnerability in our Spark 3.3.3 installation. Could you please upgrade the Mesos version bundled with Spark to address it?
> ||#||Package||CVE||CVSS||Severity||pkg_version||fixed_in_pkg||pkg_path||
> |1|org.apache.mesos_mesos|CVE-2018-11793|7.0|high|1.4.0|1.7.1, 1.6.2, 1.5.2, 1.4.3|/opt/domino/spark/python/build/lib/pyspark/jars/mesos-1.4.0-shaded-protobuf.jar|
> In the Spark source code I found that the pinned Mesos jar version is 1.4.3:
> user@ThinkPad-E14-02:~/Downloads/spark-master$ grep -ir mesos- *
> core/src/main/scala/org/apache/spark/scheduler/SchedulerBackend.scala: * TaskSchedulerImpl. We assume a Mesos-like model where the application gets resource offers as
> dev/deps/spark-deps-hadoop-2-hive-2.3:mesos/1.4.3/shaded-protobuf/mesos-1.4.3-shaded-protobuf.jar
> dev/deps/spark-deps-hadoop-3-hive-2.3:mesos/1.4.3/shaded-protobuf/mesos-1.4.3-shaded-protobuf.jar



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org