Posted to issues@flink.apache.org by "Stephan Ewen (JIRA)" <ji...@apache.org> on 2017/01/11 20:23:16 UTC
[jira] [Resolved] (FLINK-5361) Flink shouldn't require Kerberos credentials
[ https://issues.apache.org/jira/browse/FLINK-5361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Stephan Ewen resolved FLINK-5361.
---------------------------------
Resolution: Fixed
Fix Version/s: 1.2.0, 1.3.0
Fixed in:
1.2.0 via 00193f7e238340cc18c57a44c7e6377432839373
1.3.0 via fc3a778c0cafe1adc9efbd8796a8bd64122e4ad2
> Flink shouldn't require Kerberos credentials
> --------------------------------------------
>
> Key: FLINK-5361
> URL: https://issues.apache.org/jira/browse/FLINK-5361
> Project: Flink
> Issue Type: Bug
> Components: Client
> Reporter: Eron Wright
> Assignee: Eron Wright
> Labels: kerberos, security
> Fix For: 1.2.0, 1.3.0
>
>
> The behavior since FLINK-3929 has been to fail if Hadoop security is enabled but no Kerberos credential (or delegation token) is available. It should instead proceed, logging a warning that no credential is available.
> For example, say your shell has a HADOOP_CONF_DIR variable that points to a secure Hadoop installation, and you'd like to use Flink for a completely unrelated purpose (not involving Hadoop). The Flink CLI will fail at startup with a message to the effect that a Kerberos ticket is needed. A ticket should not be needed in this scenario.
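> The behavior change described above can be sketched roughly as follows. This is an illustrative toy, not Flink's actual security module: the class and method names are hypothetical, and the real fix lives in the commits listed in this resolution.

```java
// Hypothetical sketch of the fix: when Hadoop security is enabled but no
// Kerberos credential is present, log a warning and continue instead of
// failing at startup. Names here are illustrative, not Flink's real API.
public class CredentialCheck {

    /**
     * Returns true if startup may proceed. Before the fix, the
     * (securityEnabled && !hasCredential) case threw an exception;
     * after the fix it only emits a warning.
     */
    static boolean checkCredentials(boolean securityEnabled, boolean hasCredential) {
        if (securityEnabled && !hasCredential) {
            System.err.println("WARN: Hadoop security is enabled, but no Kerberos "
                + "credential (or delegation token) was found; proceeding without "
                + "Hadoop authentication.");
        }
        return true; // never blocks startup anymore
    }

    public static void main(String[] args) {
        // Secure cluster config in the environment, but no ticket:
        // startup proceeds with a warning rather than failing.
        System.out.println(checkCredentials(true, false));
    }
}
```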
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)