Posted to reviews@spark.apache.org by "huangxiaopingRD (via GitHub)" <gi...@apache.org> on 2024/01/23 12:21:40 UTC

[PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

huangxiaopingRD opened a new pull request, #44855:
URL: https://github.com/apache/spark/pull/44855

   
   
   ### What changes were proposed in this pull request?
   Don't set the executor id to "driver" when the `SparkContext` is created on the executor side.
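   
   A minimal sketch of the intended behaviour (not the actual diff; `createdOnExecutorSide` is a hypothetical stand-in for however the patch detects an executor-side `SparkContext`):
   
   ```scala
   import org.apache.spark.SparkConf
   
   object DriverIdGuardSketch {
     // Hypothetical helper standing in for however the patch detects that the
     // SparkContext is being created inside an executor (assumption, not real Spark code).
     def createdOnExecutorSide(conf: SparkConf): Boolean =
       conf.getOption("spark.executor.id").exists(_ != "driver")
   
     def main(args: Array[String]): Unit = {
       // An executor that already registered its own id before building a SparkContext.
       val conf = new SparkConf(false).set("spark.executor.id", "7")
   
       // Sketch of the proposed behaviour: keep the real id instead of overwriting it with "driver".
       if (!createdOnExecutorSide(conf)) {
         conf.set("spark.executor.id", "driver")
       }
       assert(conf.get("spark.executor.id") == "7")
     }
   }
   ```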
   
   
   ### Why are the changes needed?
   Fix a bug: when a `SparkContext` is created on the executor side (permitted via `spark.executor.allowSparkContext`), the executor id in `SparkConf` is unconditionally overwritten with "driver", clobbering the executor's real id.
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   
   
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on PR #44855:
URL: https://github.com/apache/spark/pull/44855#issuecomment-1907409989

   Although I am surprised that we have such an ability, the code change here looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this property.
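   
   For illustration, the kind of dependency meant here (a hedged sketch, not a quote of `NettyRpcEnv`): a component that decides whether it runs on the driver or an executor by reading `spark.executor.id`:
   
   ```scala
   import org.apache.spark.SparkConf
   
   object RpcRoleSketch {
     // Hedged illustration (not NettyRpcEnv itself): infer the role from "spark.executor.id".
     def role(conf: SparkConf): Option[String] =
       conf.getOption("spark.executor.id").map {
         case "driver" => "driver"
         case _        => "executor"
       }
   
     def main(args: Array[String]): Unit = {
       println(role(new SparkConf(false).set("spark.executor.id", "driver"))) // Some(driver)
       println(role(new SparkConf(false).set("spark.executor.id", "3")))      // Some(executor)
     }
   }
   ```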




Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

Posted by "huangxiaopingRD (via GitHub)" <gi...@apache.org>.
huangxiaopingRD commented on PR #44855:
URL: https://github.com/apache/spark/pull/44855#issuecomment-1907791631

   > Although I am surprised that we have such an ability, the code change here looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this property.
   
   You are right. But usually the `EXECUTOR_ID` that `NettyRpcEnv` reads from `SparkConf` has already been set correctly in advance; see https://github.com/apache/spark/pull/23560/files for details.
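   
   A hedged sketch of that ordering (illustrative only, not actual Spark source): on the executor side the backend has already put the real id into the conf before any `RpcEnv` is created, so later readers do not see "driver":
   
   ```scala
   import org.apache.spark.SparkConf
   
   object ExecutorSideConfSketch {
     def main(args: Array[String]): Unit = {
       // Hedged sketch, not Spark source: the executor backend has already set the
       // real executor id on the conf before any RpcEnv is built ("42" is illustrative).
       val conf = new SparkConf(false).set("spark.executor.id", "42")
   
       // Anything created afterwards that reads the key sees "42", not "driver",
       // so SparkContext does not need to overwrite it.
       assert(conf.get("spark.executor.id") == "42")
     }
   }
   ```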




Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

Posted by "github-actions[bot] (via GitHub)" <gi...@apache.org>.
github-actions[bot] commented on PR #44855:
URL: https://github.com/apache/spark/pull/44855#issuecomment-2093908111

   We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
   If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!




Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

Posted by "github-actions[bot] (via GitHub)" <gi...@apache.org>.
github-actions[bot] closed pull request #44855: [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side
URL: https://github.com/apache/spark/pull/44855




Re: [PR] [SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side [spark]

Posted by "huangxiaopingRD (via GitHub)" <gi...@apache.org>.
huangxiaopingRD commented on PR #44855:
URL: https://github.com/apache/spark/pull/44855#issuecomment-1907243544

   > May I ask your use case, @huangxiaopingRD ? It would be great if you can put that into the PR description because `spark.executor.allowSparkContext` is not recommended in the community since Apache Spark 3.1.
   
   Thank you for your review, @dongjoon-hyun. This change just makes the code more reasonable; there is no special use case.

