Posted to issues@spark.apache.org by "Bjørn Jørgensen (Jira)" <ji...@apache.org> on 2022/10/21 15:25:00 UTC
[jira] [Comment Edited] (SPARK-40861) CVE-2022-42889 upgrade commons text library to 1.10.0 in spark 3.0.0
[ https://issues.apache.org/jira/browse/SPARK-40861?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17622322#comment-17622322 ]
Bjørn Jørgensen edited comment on SPARK-40861 at 10/21/22 3:24 PM:
-------------------------------------------------------------------
[SPARK-40801|https://issues.apache.org/jira/projects/SPARK/issues/SPARK-40801]
CC [~dongjoon]
was (Author: bjornjorgensen):
[SPARK-40801|https://issues.apache.org/jira/projects/SPARK/issues/SPARK-40801]
> CVE-2022-42889 upgrade commons text library to 1.10.0 in spark 3.0.0
> --------------------------------------------------------------------
>
> Key: SPARK-40861
> URL: https://issues.apache.org/jira/browse/SPARK-40861
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 3.0.0
> Reporter: Rajesh
> Priority: Major
> Labels: CVE, SECURITY, security
>
> Hi Team,
>
> We use spark-core_2.12:3.0.0, which has a transitive dependency on commons-text 1.6, and that version is flagged for CVE-2022-42889.
>
> Our Spark application is built with Maven against spark-core_2.12:3.0.0.
> We need clarification on the following:
> * Does spark-core use StringSubstitutor, and do we need to worry about this CVE?
> * If it is used, which library or code path within Spark Core triggers it?
> * Can we declare Apache Commons Text 1.10.0 as an explicit dependency in our POM and add commons-text 1.6 to the exclusions for spark-core? Would that work?
> * Upgrading to a newer Spark version that already ships Commons Text 1.10.0 is not feasible for us; it would be a big task, since all dependent applications use version 3.0.0.
>
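For reference, the exclusion approach asked about in the third bullet could be sketched as a POM fragment like the one below. This is only a sketch: the Maven coordinates are the standard Maven Central ones, but whether Spark 3.0.0 actually runs correctly against commons-text 1.10.0 would still need to be verified at runtime.

```xml
<!-- Hypothetical POM fragment: exclude the transitive commons-text 1.6
     pulled in by spark-core, then pin the patched 1.10.0 explicitly. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.0</version>
    <exclusions>
      <exclusion>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-text</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <!-- Explicit dependency wins over the excluded transitive version -->
  <dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-text</artifactId>
    <version>1.10.0</version>
  </dependency>
</dependencies>
```

One can confirm which version ends up on the classpath with `mvn dependency:tree`.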
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org