Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/09/02 00:57:13 UTC

[GitHub] [beam] esadler-hbo commented on issue #22813: [Feature Request]: Add a WriteBatchedParquet option for python parquet-io

esadler-hbo commented on issue #22813:
URL: https://github.com/apache/beam/issues/22813#issuecomment-1234950729

   Oh, I didn't know about the Batched DoFns API. That looks incredible! So does the RunInference API. I don't currently use Beam at work, so I am a bit out of the loop :). It seems like y'all have added a lot of what we built at my last job, and did a much better job of it, haha.
   
   I think I just need to save the results from the RunInference API to Parquet without yielding individual rows. If that is possible with the batched API, then my use case is covered.
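   
   For concreteness, below is a rough sketch of the per-row pattern I have in mind; the sklearn model handler, model path, and schema are just placeholders, not anything from the actual issue.
   
   ```python
   # Hypothetical sketch: writing RunInference results to Parquet row by row.
   # The model path, handler choice, and schema are placeholder assumptions.
   import numpy as np
   import pyarrow as pa
   
   import apache_beam as beam
   from apache_beam.io.parquetio import WriteToParquet
   from apache_beam.ml.inference.base import RunInference
   from apache_beam.ml.inference.sklearn_inference import SklearnModelHandlerNumpy
   
   schema = pa.schema([('feature', pa.float64()), ('prediction', pa.float64())])
   
   with beam.Pipeline() as p:
       (
           p
           | beam.Create([np.array([1.0]), np.array([2.0]), np.array([3.0])])
           | RunInference(SklearnModelHandlerNumpy(model_uri='gs://my-bucket/model.pkl'))
           # Each PredictionResult is unpacked into an individual row dict here;
           # this per-element step is what a batched Parquet sink would avoid.
           | beam.Map(lambda r: {'feature': float(r.example[0]),
                                 'prediction': float(r.inference)})
           | WriteToParquet('output/results', schema)
       )
   ```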
   
   I would be happy to contribute, and it would be a serious honor, but it sounds like you might already have a good solution?

