Posted to issues@spark.apache.org by "SHOBHIT SHUKLA (Jira)" <ji...@apache.org> on 2020/01/20 06:53:00 UTC

[jira] [Commented] (SPARK-30467) On Federal Information Processing Standard (FIPS) enabled cluster, Spark Workers are not able to connect to Remote Master.

    [ https://issues.apache.org/jira/browse/SPARK-30467?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17019263#comment-17019263 ] 

SHOBHIT SHUKLA commented on SPARK-30467:
----------------------------------------

By using the FIPS service provider settings in the java.security file, we have verified that a standalone Java application that generates a key works fine on a FIPS-enabled cluster.
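
For illustration, a minimal sketch of such a standalone probe, assuming only the stock JCE API that Spark itself uses for key derivation; the class name FipsKeyProbe and the dummy password/salt/iteration inputs are illustrative, not taken from the issue:

{code:java}
import java.nio.charset.StandardCharsets;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

// Illustrative probe (not from the issue): checks whether a given
// key-factory algorithm can derive a key under the active JCE providers,
// e.g. after pointing java.security at a FIPS provider.
public class FipsKeyProbe {
    public static void main(String[] args) throws Exception {
        // Spark's default for spark.network.crypto.keyFactoryAlgorithm.
        String algorithm = args.length > 0 ? args[0] : "PBKDF2WithHmacSHA1";
        // Dummy password/salt/iterations; only algorithm availability matters here.
        PBEKeySpec spec = new PBEKeySpec("password".toCharArray(),
                "12345678".getBytes(StandardCharsets.UTF_8), 10000, 128);
        SecretKeyFactory factory = SecretKeyFactory.getInstance(algorithm);
        SecretKey key = factory.generateSecret(spec);
        System.out.println(algorithm + ": derived a "
                + key.getEncoded().length * 8 + "-bit key");
    }
}
{code}

Running it with different algorithm names (e.g. java FipsKeyProbe PBKDF2WithHmacSHA512) shows which key-factory algorithms the configured provider accepts, independent of Spark.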

> On Federal Information Processing Standard (FIPS) enabled cluster, Spark Workers are not able to connect to Remote Master.
> --------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-30467
>                 URL: https://issues.apache.org/jira/browse/SPARK-30467
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.3, 2.3.4, 2.4.4
>            Reporter: SHOBHIT SHUKLA
>            Priority: Blocker
>              Labels: security
>
> On _*Federal Information Processing Standard*_ (FIPS) enabled clusters, if we configure *spark.network.crypto.enabled true*, Spark Workers are not able to create a Spark Context because communication between the Spark Worker and the Spark Master fails (see the configuration sketch after this quoted report).
> The default algorithm ( *_spark.network.crypto.keyFactoryAlgorithm_* ) is *_PBKDF2WithHmacSHA1_*, which is not a FIPS-approved cryptographic algorithm. We have tried many values from the list of FIPS-approved cryptographic algorithms, but those values do not work either.
> *Error logs :*
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> *fips.c(145): OpenSSL internal error, assertion failed: FATAL FIPS SELFTEST FAILURE*
> JVMDUMP039I Processing dump event "abort", detail "" at 2020/01/09 06:41:50 - please wait.
> JVMDUMP032I JVM requested System dump using '<SPARK_HOME>/bin/core.20200109.064150.283.0001.dmp' in response to an event
> JVMDUMP030W Cannot write dump to file <SPARK_HOME>/bin/core.20200109.064150.283.0001.dmp: Permission denied
> JVMDUMP012E Error in System dump: The core file created by child process with pid = 375 was not found. Expected to find core file with name "/var/cores/core-netty-rpc-conne-sig11-user1000320999-group0-pid375-time*"
> JVMDUMP030W Cannot write dump to file <SPARK_HOME>/bin/javacore.20200109.064150.283.0002.txt: Permission denied
> JVMDUMP032I JVM requested Java dump using '/tmp/javacore.20200109.064150.283.0002.txt' in response to an event
> JVMDUMP010I Java dump written to /tmp/javacore.20200109.064150.283.0002.txt
> JVMDUMP032I JVM requested Snap dump using '<SPARK_HOME>/bin/Snap.20200109.064150.283.0003.trc' in response to an event
> JVMDUMP030W Cannot write dump to file <SPARK_HOME>/bin/Snap.20200109.064150.283.0003.trc: Permission denied
> JVMDUMP010I Snap dump written to /tmp/Snap.20200109.064150.283.0003.trc
> JVMDUMP030W Cannot write dump to file <SPARK_HOME>/bin/jitdump.20200109.064150.283.0004.dmp: Permission denied
> JVMDUMP007I JVM Requesting JIT dump using '/tmp/jitdump.20200109.064150.283.0004.dmp'
> JVMDUMP010I JIT dump written to /tmp/jitdump.20200109.064150.283.0004.dmp
> JVMDUMP013I Processed dump event "abort", detail "".
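
For reference, a minimal sketch of the configuration under discussion, as it would appear in conf/spark-defaults.conf; the PBKDF2WithHmacSHA512 override is just one standard JCE algorithm name a FIPS deployment might try (the report above notes that approved algorithms were failing as well):

{code}
# Enable encryption for Spark's internal RPC (Worker <-> Master traffic).
spark.network.crypto.enabled              true

# Key-derivation algorithm used to turn the shared secret into a cipher key.
# The default, PBKDF2WithHmacSHA1, is not FIPS-approved; PBKDF2WithHmacSHA512
# is one alternative JCE algorithm name a FIPS deployment might try.
spark.network.crypto.keyFactoryAlgorithm  PBKDF2WithHmacSHA512
{code}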



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org