Posted to issues@spark.apache.org by "Adam Roberts (JIRA)" <ji...@apache.org> on 2016/07/27 12:53:20 UTC

[jira] [Created] (SPARK-16751) Upgrade derby to 10.12.1.1 from 10.11.1.1

Adam Roberts created SPARK-16751:
------------------------------------

             Summary: Upgrade derby to 10.12.1.1 from 10.11.1.1
                 Key: SPARK-16751
                 URL: https://issues.apache.org/jira/browse/SPARK-16751
             Project: Spark
          Issue Type: Improvement
          Components: Build
    Affects Versions: 2.0.0, 1.6.2, 1.5.2, 1.4.1, 1.3.1
         Environment: All platforms and major Spark releases
            Reporter: Adam Roberts
            Priority: Critical


This JIRA is to upgrade the Derby version from 10.11.1.1 to 10.12.1.1.
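
For illustration, assuming the Derby version is centralised as a property in the root pom.xml (the exact property name is an assumption here), the change would amount to something like:

    <!-- root pom.xml: bump the managed Derby version (property name assumed) -->
    <derby.version>10.12.1.1</derby.version>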

As far as Sean Owen and I know, Derby is only used for tests, so we should not include it in Spark's jars folder.
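
A minimal sketch of what test-only scoping could look like, assuming the dependency is declared in the usual Maven way (the exact module and declaration are not confirmed here):

    <!-- declare Derby with test scope so it is not pulled into the assembled jars -->
    <dependency>
      <groupId>org.apache.derby</groupId>
      <artifactId>derby</artifactId>
      <version>${derby.version}</version>
      <scope>test</scope>
    </dependency>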

The upgrade addresses an already disclosed vulnerability (CVE-2015-1832) in Derby 10.11.1.1. We used https://www.versioneye.com/search and will also be checking a variety of other libraries for known problems; we are investigating whether we can set up a Jenkins job that checks our pom on a regular basis so we can stay ahead of issues like this.
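
One possible way to automate that kind of check, purely as an illustrative sketch and not a decision, would be to run the OWASP dependency-check Maven plugin from a scheduled Jenkins job, for example:

    <!-- illustrative only: scans declared dependencies against known CVEs;
         could be invoked nightly from Jenkins via the plugin's check goal -->
    <plugin>
      <groupId>org.owasp</groupId>
      <artifactId>dependency-check-maven</artifactId>
      <version>1.4.0</version> <!-- pin whatever release is current -->
      <executions>
        <execution>
          <goals>
            <goal>check</goal>
          </goals>
        </execution>
      </executions>
    </plugin>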

This was raised on the mailing list at http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-2-0-0-RC5-tp18367p18465.html by Stephen Hellberg and replied to by Sean Owen.

I've checked the impact on previous Spark releases, and this version of Derby is the only relatively recent release without known vulnerabilities (I checked as far back as the 1.3 branch), so ideally we would backport this to all impacted Spark releases.

I've marked this as Critical and ticked the Important checkbox because it affects every user. There isn't a Security component (should we add one?), hence the Build component tag.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org