Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/23 23:23:40 UTC

[GitHub] [spark] xkrogen edited a comment on pull request #34084: [SPARK-35672][CORE][YARN] Handle environment variable replacement in user classpath lists

xkrogen edited a comment on pull request #34084:
URL: https://github.com/apache/spark/pull/34084#issuecomment-926230650


   @peter-toth @tgravescs @mridulm FYI
   
   It's not super clean to have to perform the variable resolution in Spark code. In particular, one aspect I left out is escape handling: you can't use e.g. `/home/\$usr/` to refer to the literal path `/home/$usr` -- `usr` will still be resolved as a variable. Similarly, on Windows, `\Home\%%usr\` wouldn't work as expected, resolving to `\Home\usr` instead of `\Home\%usr`. Handling escapes is a lot trickier than plain variable substitution, and I'm not sure it would be a good idea to attempt it with the simple regexes I'm currently using for the substitution. But I think the current logic should cover real-world use cases. Open to other ideas here, though. One option would be to spin up an external shell process to do the resolution, but that seems messy and could be quite slow for large lists. A rough sketch of the regex approach is below.
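   
   For concreteness, here is a minimal, hedged sketch of the kind of regex-based substitution described above. This is not the PR's actual code; `EnvVarSubstitution` and its regexes are made up for illustration. It deliberately does no escape handling, which is exactly the limitation discussed:
   
   ```scala
   import scala.util.matching.Regex
   
   // Minimal sketch (not the PR's code) of regex-based env-var substitution
   // for classpath entries. It intentionally ignores escapes, so a leading
   // backslash does not protect "$usr" from being substituted.
   object EnvVarSubstitution {
     private val unixPattern: Regex = """\$\{?(\w+)\}?""".r  // matches $VAR or ${VAR}
     private val windowsPattern: Regex = """%(\w+)%""".r     // matches %VAR%
   
     def substitute(path: String, env: Map[String, String], isWindows: Boolean): String = {
       val pattern = if (isWindows) windowsPattern else unixPattern
       // Unresolved variables become the empty string in this sketch;
       // quoteReplacement keeps '$' and '\' in values from being re-interpreted.
       pattern.replaceAllIn(path, m =>
         Regex.quoteReplacement(env.getOrElse(m.group(1), "")))
     }
   }
   
   // The escaping gap, with env = Map("usr" -> "erik"):
   //   substitute("/home/$usr/lib.jar", env, isWindows = false)
   //     => "/home/erik/lib.jar"
   //   substitute("""/home/\$usr/lib.jar""", env, isWindows = false)
   //     => "/home/\erik/lib.jar"   -- the backslash escape is NOT honored
   ```
   
   Supporting escapes properly would mean tokenizing the string rather than running a blind regex replace, which is where the complexity mentioned above comes from.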
   
   The other approach would be to hide this behind a feature flag, but it would be a strange one: "turn on scalable user JAR handling to bypass argument-length limits, but also turn off environment variable substitution." I think that would be confusing for users, so I'd prefer to avoid it if possible.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


