Posted to dev@falcon.apache.org by Josh Clum <jo...@gmail.com> on 2014/08/27 15:08:05 UTC

Secure Cluster Kerberos Exception

Hi,

I'm trying to submit a process in a secure environment, but I'm getting the
following exception saying that the server has an invalid Kerberos
principal. Has anyone run across this before? I'm setting <property
name="dfs.namenode.kerberos.principal" value="nn/_HOST@EXAMPLE.COM"/>.

Client Command:

-bash-4.1$ falcon entity -submit -type process -file raw_cc_bp_ratio_lcms_comp_process.xml
Error: java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: nn/ip-54-40-237-210.EXAMPLE.COM@EXAMPLE.COM; Host Details : local host is: "ip-54-40-237-210.EXAMPLE.COM/54.40.237.210"; destination host is: "ip-54-40-237-210.EXAMPLE.COM":8020;

Falcon Server Logs:

Caused by: java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: nn/ip-54-40-237-210.EXAMPLE.COM@EXAMPLE.COM; Host Details : local host is: "ip-54-40-237-210.EXAMPLE.COM/54.40.237.210"; destination host is: "ip-54-40-237-210.EXAMPLE.COM":8020;
  at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
  at org.apache.hadoop.ipc.Client.call(Client.java:1414)
  at org.apache.hadoop.ipc.Client.call(Client.java:1363)
  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
  at com.sun.proxy.$Proxy67.getFileInfo(Unknown Source)
  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
  at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
  at com.sun.proxy.$Proxy68.getFileInfo(Unknown Source)
  at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
  at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
  at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
  at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
  at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
  at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
  at org.apache.falcon.entity.parser.ProcessEntityParser.validateHDFSPaths(ProcessEntityParser.java:122)
  ... 56 more
Caused by: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: nn/ip-54-40-237-210.EXAMPLE.COM@EXAMPLE.COM
  at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:677)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
  at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:640)
  at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724)
  at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
  at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
  at org.apache.hadoop.ipc.Client.call(Client.java:1381)
  ... 73 more
Caused by: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: nn/ip-54-40-237-210.EXAMPLE.COM@EXAMPLE.COM
  at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:332)
  at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:231)
  at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:159)
  at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:394)
  at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:550)
  at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:367)
  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:716)
  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:712)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
  at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:711)
  ... 76 more

Cluster Entity:

<?xml version="1.0" encoding="UTF-8"?>
<cluster colo="local" description="Cluster" name="cluster"
xmlns="uri:falcon:cluster:0.1">
  <interfaces>
    <interface type="readonly" endpoint="hdfs://nn" version="2.4.0"/>
    <interface type="write" endpoint="hdfs://nn" version="2.4.0"/>
    <interface type="execute" endpoint="ip-54-40-237-222.EXAMPLE.COM:8050" version="2.4.0"/>
    <interface type="workflow" endpoint="http://ip-54-40-237-210.EXAMPLE.COM:11000/oozie" version="4.0.0"/>
    <interface type="registry" endpoint="thrift://ip-54-40-237-222.EXAMPLE.COM:9083" version="0.13.0"/>
    <interface type="messaging" endpoint="tcp://ip-54-40-237-210.EXAMPLE.COM:61616?daemon=true" version="5.4.3"/>
  </interfaces>
  <locations>
    <location name="staging" path="/tmp"/>
    <location name="working" path="/tmp"/>
    <location name="temp" path="/tmp"/>
  </locations>
  <properties>
    <property name="dfs.namenode.kerberos.principal" value="nn/_HOST@EXAMPLE.COM"/>
    <property name="hive.metastore.kerberos.principal" value="hive/ip-54-40-237-222.EXAMPLE.COM@EXAMPLE.COM"/>
    <property name="hive.metastore.uris" value="thrift://ip-54-40-237-222.EXAMPLE.COM:9083"/>
    <property name="hive.metastore.sasl.enabled" value="true"/>
  </properties>
</cluster>

Thanks,
Josh

Re: Secure Cluster Kerberos Exception

Posted by Josh Clum <jo...@gmail.com>.
Ok, thanks.


On Wed, Aug 27, 2014 at 12:49 PM, Seetharam Venkatesh <
venkatesh@innerzeal.com> wrote:


Re: Secure Cluster Kerberos Exception

Posted by Seetharam Venkatesh <ve...@innerzeal.com>.
Josh, the interface endpoints should not have the realm information.

<interface type="execute" endpoint="ip-54-40-237-222.EXAMPLE.COM:8050" version="2.4.0"/>

Should be:

<interface type="execute" endpoint="ip-54-40-237-222:8050" version="2.4.0"/>
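
Applied to the cluster entity in your mail, that would mean stripping the domain suffix from the host-based endpoints. A sketch only, assuming the same hosts and ports, and assuming the workflow, registry, and messaging interfaces are handled the same way as execute:

```xml
<interfaces>
  <interface type="readonly" endpoint="hdfs://nn" version="2.4.0"/>
  <interface type="write" endpoint="hdfs://nn" version="2.4.0"/>
  <!-- bare hostnames, no .EXAMPLE.COM realm/domain suffix -->
  <interface type="execute" endpoint="ip-54-40-237-222:8050" version="2.4.0"/>
  <interface type="workflow" endpoint="http://ip-54-40-237-210:11000/oozie" version="4.0.0"/>
  <interface type="registry" endpoint="thrift://ip-54-40-237-222:9083" version="0.13.0"/>
  <interface type="messaging" endpoint="tcp://ip-54-40-237-210:61616?daemon=true" version="5.4.3"/>
</interfaces>
```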


On Wed, Aug 27, 2014 at 6:08 AM, Josh Clum <jo...@gmail.com> wrote:




-- 
Regards,
Venkatesh

“Perfection (in design) is achieved not when there is nothing more to add,
but rather when there is nothing more to take away.”
- Antoine de Saint-Exupéry