Posted to user@hadoop.apache.org by Deepti Sharma S <de...@ericsson.com.INVALID> on 2022/11/29 04:24:06 UTC

Vulnerability query on Hadoop

Hello Team,

We have a query regarding the High and Critical vulnerabilities below on Hadoop; could you please help here?

Query for the below-mentioned HIGH vulnerabilities.

We have a Java-based HDFS client which uses Hadoop-Common-3.3.3, Hadoop-hdfs-3.3.3 and Hadoop-hdfs-client-3.3.3 as its dependencies.
Hadoop-Common and Hadoop-hdfs use protobuf-java-2.5.0 as a dependency.
Hadoop-hdfs-client uses okhttp-2.7.5 as a dependency.
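For context, the dependency setup described above corresponds to a Maven POM along these lines (a sketch only; the Hadoop coordinates are the published Maven Central ones, and protobuf-java, okhttp and netty arrive transitively):

```xml
<!-- Sketch of the client's declared dependencies (Hadoop 3.3.3 stack).
     protobuf-java 2.5.0, okhttp 2.7.5 and netty-codec are not declared
     directly; they are pulled in transitively by these artifacts. -->
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.3.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>3.3.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs-client</artifactId>
    <version>3.3.3</version>
  </dependency>
</dependencies>
```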

We got the following high vulnerabilities in protobuf-java using "Anchore Grype" and in okhttp using "JFrog Xray".

1. Description : A parsing issue with binary data in protobuf-java core and lite versions prior to 3.21.7, 3.20.3, 3.19.6 and 3.16.3 can lead to a denial of service attack.
                 Inputs containing multiple instances of non-repeated embedded messages with repeated or unknown fields cause objects to be converted back and forth between mutable and immutable forms,
                 resulting in potentially long garbage collection pauses. We recommend updating to the versions mentioned above.


2. Description : OkHttp contains a flaw that is triggered during the handling of non-ASCII ETag headers. This may allow a remote attacker to crash a process linked against the library.

3. Description : OkHttp contains a flaw that is triggered during the reading of non-ASCII characters in HTTP/2 headers or in cached HTTP headers. This may allow a remote attacker to crash a process linked against the library.
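For illustration only: both OkHttp flaws above are triggered by header values containing non-ASCII characters. The sketch below shows the kind of input class involved, in plain Java with no OkHttp dependency; the class and method names here are chosen for this example and are not part of any OkHttp API.

```java
// Sketch: detect non-ASCII characters in an HTTP header value, i.e. the
// input class that triggers the OkHttp crashes described above.
// HeaderCheck/isAscii are illustrative names, not an OkHttp API.
public final class HeaderCheck {
    static boolean isAscii(String value) {
        // ASCII code points are 0x00-0x7F; anything above is non-ASCII
        return value.chars().allMatch(c -> c <= 0x7F);
    }

    public static void main(String[] args) {
        System.out.println(isAscii("W/\"abc123\""));    // ordinary ETag value
        System.out.println(isAscii("W/\"caf\u00e9\"")); // non-ASCII ETag value
    }
}
```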

What is the impact of these vulnerabilities on the HDFS client?
If the HDFS client is impacted, what is the mitigation plan?

Query for the below-mentioned CRITICAL vulnerability.

Our application has a Java-based HDFS client which uses Hadoop-Common-3.3.3 as a dependency.
Hadoop-Common-3.3.3 uses netty-codec-4.1.42.Final as a transitive dependency.

We got the following critical vulnerability in netty-codec using JFrog Xray.

Description : Netty contains an overflow condition in the Lz4FrameEncoder::finishEncode() function in codec/src/main/java/io/netty/handler/codec/compression/Lz4FrameEncoder.java
that is triggered when compressing data and writing the last header.
This may allow an attacker to cause a buffer overflow, resulting in a denial of service or potentially allowing the execution of arbitrary code.

What is the impact of this vulnerability on the HDFS client?
If the HDFS client is impacted, what is the mitigation plan?



Regards,
Deepti Sharma
PMP® & ITIL


RE: Vulnerability query on Hadoop

Posted by Deepti Sharma S <de...@ericsson.com.INVALID>.
Thank you, Ayush.


Regards,
Deepti Sharma
PMP® & ITIL

From: Ayush Saxena <ay...@gmail.com>
Sent: 29 November 2022 16:27
To: Deepti Sharma S <de...@ericsson.com>
Cc: user@hadoop.apache.org
Subject: Re: Vulnerability query on Hadoop

Hi Deepti,
The OkHttp one, I think, got sorted as part of HDFS-16453; it is in Hadoop 3.3.4 (released).
Second, netty was also upgraded, as part of HADOOP-18079, and is also in Hadoop 3.3.4. I tried to grep the dependency tree of 3.3.4 and didn't find 4.1.42. If you still see it, let me know what is pulling it in; we can fix that in the next release (3.3.5) next month.

So, ideally, an upgrade from Hadoop 3.3.3 to 3.3.4 should get things fixed for you.

-Ayush

Refs:
https://issues.apache.org/jira/browse/HDFS-16453
https://issues.apache.org/jira/browse/HADOOP-18079




Vulnerability query on Hadoop

Posted by Deepti Sharma S <de...@ericsson.com.INVALID>.
Hello Team,
Our application has a Java-based HDFS client which uses Hadoop-hdfs-3.3.3 as a dependency.
Hadoop-hdfs-3.3.3 uses netty 3.10.6.Final as a transitive dependency.

We got the following vulnerability in netty using JFrog Xray.

Description : Netty contains a flaw in the AbstractDiskHttpData.delete() function in handler/codec/http/multipart/AbstractDiskHttpData.java that is triggered as temporary file entries are added to the 'DeleteOnExitHook' object but not properly removed when processing POST requests that are 16 kB. This may allow a remote attacker to exhaust available memory resources, potentially resulting in a denial of service.

What is the impact of this vulnerability on the HDFS client?
If the HDFS client is impacted, what is the mitigation plan?
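If an immediate Hadoop upgrade is not possible, one interim approach sometimes used is to exclude the legacy netty 3.x artifact from the client's dependency tree. This is a sketch only, under the assumption of a Maven build and a client that does not exercise the vulnerable HTTP multipart code path at runtime; excluding a transitive dependency can break features that need it, so it should be verified carefully:

```xml
<!-- Interim sketch (assumptions: Maven build; the client does not use the
     netty 3.x HTTP multipart code path at runtime): exclude the legacy
     io.netty:netty 3.10.6.Final artifact pulled in by hadoop-hdfs. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>3.3.3</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```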



Regards,
Deepti Sharma
PMP® & ITIL

Re: Vulnerability query on Hadoop

Posted by Ayush Saxena <ay...@gmail.com>.
Hi Deepti,
The OkHttp one, I think, got sorted as part of HDFS-16453; it is in Hadoop 3.3.4 (released).
Second, netty was also upgraded, as part of HADOOP-18079, and is also in Hadoop 3.3.4. I tried to grep the dependency tree of 3.3.4 and didn't find 4.1.42. If you still see it, let me know what is pulling it in; we can fix that in the next release (3.3.5) next month.

So, ideally, an upgrade from Hadoop 3.3.3 to 3.3.4 should get things fixed for you.

-Ayush

Refs:
https://issues.apache.org/jira/browse/HDFS-16453
https://issues.apache.org/jira/browse/HADOOP-18079
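The upgrade Ayush describes is a version bump of the Hadoop artifacts in the client's build; afterwards, `mvn dependency:tree -Dincludes=io.netty:netty-codec` can confirm that nothing still pulls in the old netty-codec. A minimal Maven sketch of the bump (versions as discussed in the thread; the property name is illustrative):

```xml
<!-- Bump the Hadoop client stack from 3.3.3 to 3.3.4, which ships the
     okhttp (HDFS-16453) and netty (HADOOP-18079) upgrades.
     "hadoop.version" is an illustrative property name. -->
<properties>
  <hadoop.version>3.3.4</hadoop.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs-client</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
</dependencies>
```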
