Posted to dev@systemds.apache.org by GitBox <gi...@apache.org> on 2022/11/16 17:39:16 UTC

[GitHub] [systemds] phaniarnab commented on pull request #1733: [SYSTEMDS-3466] Asynchronous (Future-based) execution of Spark instructions

phaniarnab commented on PR #1733:
URL: https://github.com/apache/systemds/pull/1733#issuecomment-1317405848

   > > This extension allows triggering a chain of Spark instructions asynchronously and seeking the results only when needed.
   > 
   > This is really interesting @phaniarnab . Curious: is there a concrete use case for this feature?
   
   Yes, there are several use cases. They arise because we combine Spark's lazy execution with eager local execution. Currently, a chain of Spark operations is triggered only when the result is needed locally, which leads to poor cluster utilization. @BACtaki
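
   To make the idea concrete, below is a minimal, hypothetical sketch of the general pattern (not the SystemDS implementation from this PR): a lazily built chain of Spark transformations is triggered eagerly in the background via a `CompletableFuture`, and the driver blocks on the result only when it is actually consumed. It assumes a local Spark master and `spark-core` on the classpath; the class name and data are illustrative only.

   ```java
   import org.apache.spark.SparkConf;
   import org.apache.spark.api.java.JavaRDD;
   import org.apache.spark.api.java.JavaSparkContext;

   import java.util.Arrays;
   import java.util.concurrent.CompletableFuture;
   import java.util.concurrent.ExecutionException;

   // Hypothetical sketch of Future-based triggering of a Spark instruction chain.
   public class AsyncSparkTriggerSketch {
     public static void main(String[] args) throws ExecutionException, InterruptedException {
       SparkConf conf = new SparkConf()
           .setAppName("async-trigger-sketch")
           .setMaster("local[*]");
       try (JavaSparkContext sc = new JavaSparkContext(conf)) {
         // Lazily build a chain of Spark transformations (nothing executes yet).
         JavaRDD<Integer> chain = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5))
             .map(x -> x * x)
             .filter(x -> x % 2 == 1);

         // Trigger the chain asynchronously instead of waiting for the local
         // consumer; the driver thread stays free for other (CPU) work.
         CompletableFuture<Long> pending = CompletableFuture.supplyAsync(chain::count);

         // ... interleaved local work would run here while the cluster is busy ...

         // Block on the Future only when the result is actually needed.
         long numOddSquares = pending.get();
         System.out.println("odd squares: " + numOddSquares);
       }
     }
   }
   ```

   Spark also offers built-in async actions such as `countAsync()` returning a `JavaFutureAction`; the `CompletableFuture` wrapper above is just one way to express the same "trigger now, consume later" pattern.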


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: dev-unsubscribe@systemds.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org