Posted to issues@iceberg.apache.org by "dependabot[bot] (via GitHub)" <gi...@apache.org> on 2023/04/30 05:00:08 UTC

[GitHub] [iceberg] dependabot[bot] opened a new pull request, #7479: Build: Bump ray from 2.3.1 to 2.4.0 in /python

dependabot[bot] opened a new pull request, #7479:
URL: https://github.com/apache/iceberg/pull/7479

   Bumps [ray](https://github.com/ray-project/ray) from 2.3.1 to 2.4.0.
   <details>
   <summary>Release notes</summary>
   <p><em>Sourced from <a href="https://github.com/ray-project/ray/releases">ray's releases</a>.</em></p>
   <blockquote>
   <h2>Ray-2.4.0</h2>
   <h1>Ray 2.4 - Generative AI and LLM support</h1>
   <p>Over the last few months, we have seen a flurry of innovative activity around <a href="https://www.mckinsey.com/featured-insights/mckinsey-explainers/what-is-generative-ai">generative AI models</a> and <a href="https://en.wikipedia.org/wiki/Large_language_model">large language models (LLM)</a>. To continue ensuring that Ray provides a pivotal compute substrate for <a href="https://www.anyscale.com/blog/ray-common-production-challenges-for-generative-ai-infrastructure">generative AI workloads</a> and addresses their challenges (as explained in our <a href="https://www.anyscale.com/blog/ray-common-production-challenges-for-generative-ai-infrastructure">blog series</a>), we have invested engineering effort in this release to make these open source LLM models and workloads accessible to the open source community and performant with Ray.</p>
   <p>This release includes new examples for training, batch inference, and serving with your own LLM.</p>
   <h2>Generative AI and LLM Examples</h2>
   <ul>
   <li><a href="https://docs.ray.io/en/releases-2.4.0/ray-air/examples/gptj_deepspeed_fine_tuning.html">GPT-J (LLM) fine-tuning with Microsoft DeepSpeed and Ray Train</a></li>
   <li><a href="https://docs.ray.io/en/releases-2.4.0/ray-air/examples/gptj_batch_prediction.html">GPT-J-6B Batch Prediction with Ray Data</a></li>
   <li><a href="https://docs.ray.io/en/releases-2.4.0/ray-air/examples/gptj_serving.html">GPT-J-6B Serving with Ray Serve</a></li>
   <li><a href="https://docs.ray.io/en/releases-2.4.0/ray-air/examples/dreambooth_finetuning.html">Stable Diffusion (Dreambooth) fine-tuning with Ray Train</a></li>
   <li><a href="https://docs.ray.io/en/releases-2.4.0/ray-air/examples/stablediffusion_batch_prediction.html">Stable Diffusion Batch Prediction with Ray Data </a></li>
   <li><a href="https://docs.ray.io/en/releases-2.4.0/serve/tutorials/stable-diffusion.html">Stable Diffusion Serving with Ray Serve</a></li>
   </ul>
   <h2>Ray Train enhancements</h2>
   <ul>
   <li>We're introducing the <a href="https://docs.ray.io/en/releases-2.4.0/train/api/doc/ray.train.lightning.LightningTrainer.html">LightningTrainer</a>, allowing you to scale your <a href="https://lightning.ai/docs/pytorch/stable//index.html">PyTorch Lightning</a> workloads on Ray. As part of our continued effort toward seamless integration and ease of use, it enhances and replaces our widely adopted ray_lightning integration and keeps pace with the latest changes in PyTorch Lightning. A minimal usage sketch follows this list.</li>
   <li>We're releasing an <a href="https://docs.ray.io/en/releases-2.4.0/train/api/doc/ray.train.huggingface.accelerate.AccelerateTrainer.html">AccelerateTrainer</a>, allowing you to run <a href="https://huggingface.co/docs/accelerate">HuggingFace Accelerate</a> and <a href="https://huggingface.co/docs/accelerate/usage_guides/deepspeed">DeepSpeed</a> on Ray with minimal code changes. This Trainer integrates with the rest of the Ray ecosystem, including the ability to run distributed <a href="https://docs.ray.io/en/latest/tune/index.html">hyperparameter tuning</a> with each trial being a distributed training job.</li>
   </ul>
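   <p>As a rough illustration (not part of the upstream notes), a <code>LightningTrainer</code> run might look like the sketch below. The toy module, random data, and worker count are hypothetical placeholders, and the builder calls follow the <code>LightningConfigBuilder</code> API linked above.</p>
   <pre><code>
   import torch
   import pytorch_lightning as pl
   from torch.utils.data import DataLoader, TensorDataset

   from ray.air.config import ScalingConfig
   from ray.train.lightning import LightningConfigBuilder, LightningTrainer


   # Toy LightningModule standing in for a real model (hypothetical placeholder).
   class TinyRegressor(pl.LightningModule):
       def __init__(self, lr=1e-3):
           super().__init__()
           self.lr = lr
           self.layer = torch.nn.Linear(8, 1)

       def training_step(self, batch, batch_idx):
           x, y = batch
           return torch.nn.functional.mse_loss(self.layer(x), y)

       def configure_optimizers(self):
           return torch.optim.Adam(self.parameters(), lr=self.lr)


   # Random data standing in for a real dataset.
   train_loader = DataLoader(
       TensorDataset(torch.randn(256, 8), torch.randn(256, 1)), batch_size=32
   )

   lightning_config = (
       LightningConfigBuilder()
       .module(cls=TinyRegressor, lr=1e-3)          # kwargs forwarded to the module
       .trainer(max_epochs=2, accelerator="cpu")    # kwargs forwarded to pl.Trainer
       .fit_params(train_dataloaders=train_loader)  # kwargs forwarded to trainer.fit()
       .build()
   )

   trainer = LightningTrainer(
       lightning_config=lightning_config,
       scaling_config=ScalingConfig(num_workers=2),
   )
   result = trainer.fit()
   print(result.metrics)
   </code></pre>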
   <h2>Ray Data highlights</h2>
   <ul>
   <li>Streaming execution is enabled by default, providing a more efficient data processing pipeline that can handle larger datasets and minimize memory consumption (<a href="https://docs.ray.io/en/releases-2.4.0/data/dataset-internals.html#streaming-execution">doc</a>).</li>
   <li>We've implemented asynchronous batch prefetching of Dataset.iter_batches (<a href="https://docs.ray.io/en/releases-2.4.0/data/api/doc/ray.data.DatasetIterator.iter_batches.html">doc</a>), improving performance by fetching data in parallel while the main thread continues processing, thus reducing waiting time.</li>
   <li>Added support for reading SQL databases (<a href="https://docs.ray.io/en/releases-2.4.0/data/creating-datasets.html#reading-from-sql-databases">doc</a>), enabling users to seamlessly integrate relational databases into their Ray Data workflows. A short usage sketch follows this list.</li>
   <li>Introduced support for reading WebDataset (<a href="https://docs.ray.io/en/releases-2.4.0/data/api/doc/ray.data.read_webdataset.html">doc</a>), a common format for high-performance deep learning training jobs.</li>
   </ul>
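   <p>As a rough illustration (not part of the upstream notes), the SQL reader and the prefetching iterator might be used as sketched below; the SQLite file and <code>users</code> table are hypothetical, and the keyword names follow the 2.4 docs linked above.</p>
   <pre><code>
   import sqlite3

   import ray

   # Read rows from a relational database via a user-supplied connection factory.
   ds = ray.data.read_sql(
       "SELECT id, name FROM users",           # hypothetical query and table
       lambda: sqlite3.connect("example.db"),  # hypothetical local SQLite file
   )

   # Iterate with asynchronous batch prefetching: upcoming batches are fetched
   # in the background while the loop body handles the current one.
   for batch in ds.iterator().iter_batches(batch_size=64, prefetch_batches=2):
       pass  # process the batch here
   </code></pre>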
   <h2>Ray Serve highlights</h2>
   <ul>
   <li>Multi-app CLI &amp; REST API support is now available, allowing users to manage multiple applications with different configurations within a single Ray Serve deployment. This simplifies deployment and scaling for users running multiple applications (<a href="https://docs.ray.io/en/releases-2.4.0/serve/multi-app.html">doc</a>). A minimal usage sketch follows this list.</li>
   <li>Enhanced logging and metrics for Serve applications, giving users better visibility into their application's performance and facilitating easier debugging and monitoring (<a href="https://docs.ray.io/en/releases-2.4.0/serve/production-guide/monitoring.html#monitoring-ray-serve">doc</a>)</li>
   </ul>
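   <p>As a rough illustration (not part of the upstream notes), the Python side of multi-app support might look like the sketch below, using <code>serve.run</code> with an application name and route prefix; the deployments themselves are hypothetical placeholders, and the config-file/CLI path is covered in the multi-app doc linked above.</p>
   <pre><code>
   from ray import serve


   @serve.deployment
   class Hello:
       async def __call__(self, request):
           return "hello"


   @serve.deployment
   class Goodbye:
       async def __call__(self, request):
           return "goodbye"


   # Two independent applications in one Serve instance, each with its own
   # name and route prefix.
   serve.run(Hello.bind(), name="hello_app", route_prefix="/hello")
   serve.run(Goodbye.bind(), name="goodbye_app", route_prefix="/goodbye")
   </code></pre>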
   <h2>Other enhancements</h2>
   <ul>
   <li><a href="https://redirect.github.com/ray-project/ray/issues/32904">Ray 2.4 is the last version that supports Python 3.6</a></li>
   <li>We've also added a brand new <a href="https://docs.ray.io/en/releases-2.4.0/">landing page</a></li>
   </ul>
   <h1>Ray Libraries</h1>
   <h2>Ray AIR</h2>
   <p>💫Enhancements:</p>
   <ul>
   <li>Add nightly test for alpa opt 30b inference. (<a href="https://redirect.github.com/ray-project/ray/issues/33419">#33419</a>)</li>
   <li>Add a sanity checking release test for Alpa and ray nightly. (<a href="https://redirect.github.com/ray-project/ray/issues/32995">#32995</a>)</li>
   <li>Add <code>TorchDetectionPredictor</code> (<a href="https://redirect.github.com/ray-project/ray/issues/32199">#32199</a>)</li>
   <li>Add <code>artifact_location</code>, <code>run_name</code> to MLFlow integration (<a href="https://redirect.github.com/ray-project/ray/issues/33641">#33641</a>)</li>
   <li>Add <code>*path</code> properties to <code>Result</code> and <code>ResultGrid</code> (<a href="https://redirect.github.com/ray-project/ray/issues/33410">#33410</a>)</li>
   <li>Make <code>Preprocessor.transform</code> lazy by default (<a href="https://redirect.github.com/ray-project/ray/issues/32872">#32872</a>)</li>
   <li>Make <code>BatchPredictor</code> lazy (<a href="https://redirect.github.com/ray-project/ray/issues/32510">#32510</a>, <a href="https://redirect.github.com/ray-project/ray/issues/32796">#32796</a>)</li>
   <li>Use a configurable ray temp directory for the <code>TempFileLock</code> util (<a href="https://redirect.github.com/ray-project/ray/issues/32862">#32862</a>)</li>
   <li>Add <code>collate_fn</code> to <code>iter_torch_batches</code> (<a href="https://redirect.github.com/ray-project/ray/issues/32412">#32412</a>); a short usage sketch follows this list</li>
   <li>Allow users to pass <code>Callable[[torch.Tensor], torch.Tensor]</code> to <code>TorchVisionTransform</code> (<a href="https://redirect.github.com/ray-project/ray/issues/32383">#32383</a>)</li>
   <li>Automatically move <code>DatasetIterator</code> torch tensors to correct device (<a href="https://redirect.github.com/ray-project/ray/issues/31753">#31753</a>)</li>
   </ul>
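   <p>As a rough illustration (not part of the upstream notes), the new <code>collate_fn</code> hook might be used as sketched below; the dataset contents are hypothetical, and the sketch assumes the batch reaches <code>collate_fn</code> as a dict of NumPy arrays.</p>
   <pre><code>
   import torch

   import ray

   # Hypothetical two-column dataset.
   ds = ray.data.from_items([{"x": float(i), "y": 2.0 * i} for i in range(100)])


   def to_xy_tensors(batch):
       # batch is assumed to arrive as a dict of NumPy arrays; return whatever
       # structure the training loop expects.
       return torch.as_tensor(batch["x"]), torch.as_tensor(batch["y"])


   for x, y in ds.iter_torch_batches(batch_size=16, collate_fn=to_xy_tensors):
       pass  # e.g. feed x and y to a model here
   </code></pre>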
   <p>🔨 Fixes:</p>
   <!-- raw HTML omitted -->
   </blockquote>
   <p>... (truncated)</p>
   </details>
   <details>
   <summary>Commits</summary>
   <ul>
   <li><a href="https://github.com/ray-project/ray/commit/cd1ba65e239360c8a7b130f991ed414eccc063ce"><code>cd1ba65</code></a> [docker] Disable docker builds for code cherry picks (<a href="https://redirect.github.com/ray-project/ray/issues/34744">#34744</a>)</li>
   <li><a href="https://github.com/ray-project/ray/commit/4479f66d4db967d3c9dd0af2572061276ba926ba"><code>4479f66</code></a> Cherry pick doc PRs <a href="https://redirect.github.com/ray-project/ray/issues/34614">#34614</a> <a href="https://redirect.github.com/ray-project/ray/issues/34615">#34615</a> <a href="https://redirect.github.com/ray-project/ray/issues/34435">#34435</a> <a href="https://redirect.github.com/ray-project/ray/issues/34505">#34505</a> <a href="https://redirect.github.com/ray-project/ray/issues/34617">#34617</a> <a href="https://redirect.github.com/ray-project/ray/issues/34623">#34623</a> <a href="https://redirect.github.com/ray-project/ray/issues/34660">#34660</a> (<a href="https://redirect.github.com/ray-project/ray/issues/34676">#34676</a>)</li>
   <li><a href="https://github.com/ray-project/ray/commit/fb34fc32fec610ae9e309493f83adc316b295d49"><code>fb34fc3</code></a> [train] Add AccelerateTrainer as valid AIR_TRAINER (<a href="https://redirect.github.com/ray-project/ray/issues/34639">#34639</a>) (<a href="https://redirect.github.com/ray-project/ray/issues/34657">#34657</a>)</li>
   <li><a href="https://github.com/ray-project/ray/commit/b0c23a912daee0d0aad9c41223bfe40bd4d81695"><code>b0c23a9</code></a> [CI] fix virtualenv version to deflake linux://python/ray/tests:test_runtime_...</li>
   <li><a href="https://github.com/ray-project/ray/commit/558b26b5dcb15f77aadb344add8c63b8ac90b49f"><code>558b26b</code></a> [Ci] fix pip version to deflake minimal install 3.10</li>
   <li><a href="https://github.com/ray-project/ray/commit/e935be9b13166f5d87cbcbf74129aad7324ee933"><code>e935be9</code></a> [docker] Enable docker builds for code cherry picks (<a href="https://redirect.github.com/ray-project/ray/issues/34649">#34649</a>)</li>
   <li><a href="https://github.com/ray-project/ray/commit/d5d34c1ea29557e9a3478101e727e23a0919d60e"><code>d5d34c1</code></a> Revert &quot;[core]Turn on light weight resource broadcasting. (<a href="https://redirect.github.com/ray-project/ray/issues/32625">#32625</a>)&quot; (<a href="https://redirect.github.com/ray-project/ray/issues/34636">#34636</a>)</li>
   <li><a href="https://github.com/ray-project/ray/commit/a8d7c9cfcf5f4c96ddf7aa3baf68a292de544b0d"><code>a8d7c9c</code></a> [Doc] Add missing links for LightningTrainer and HuggingfaceTrainer (<a href="https://redirect.github.com/ray-project/ray/issues/34612">#34612</a>)</li>
   <li><a href="https://github.com/ray-project/ray/commit/6fc9f70e801ac42edcf4de2d9ace6865ee51d85d"><code>6fc9f70</code></a> [Doc] Fix AIR benchmark configuration link failure(with pinned commit id). <a href="https://redirect.github.com/ray-project/ray/issues/3">#3</a>...</li>
   <li><a href="https://github.com/ray-project/ray/commit/d2804d953e6ebc98957074cef7d9994f329bc825"><code>d2804d9</code></a> [cherry pick][docs] for new landing page for 2.4.0 (<a href="https://redirect.github.com/ray-project/ray/issues/34546">#34546</a>)</li>
   <li>Additional commits viewable in <a href="https://github.com/ray-project/ray/compare/ray-2.3.1...ray-2.4.0">compare view</a></li>
   </ul>
   </details>
   <br />
   
   
   [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ray&package-manager=pip&previous-version=2.3.1&new-version=2.4.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
   
   Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
   
   [//]: # (dependabot-automerge-start)
   [//]: # (dependabot-automerge-end)
   
   ---
   
   <details>
   <summary>Dependabot commands and options</summary>
   <br />
   
   You can trigger Dependabot actions by commenting on this PR:
   - `@dependabot rebase` will rebase this PR
   - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
   - `@dependabot merge` will merge this PR after your CI passes on it
   - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
   - `@dependabot cancel merge` will cancel a previously requested merge and block automerging
   - `@dependabot reopen` will reopen this PR if it is closed
   - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
   - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
   - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
   - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
   
   
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org




[GitHub] [iceberg] Fokko merged pull request #7479: Build: Bump ray from 2.3.1 to 2.4.0 in /python

Posted by "Fokko (via GitHub)" <gi...@apache.org>.
Fokko merged PR #7479:
URL: https://github.com/apache/iceberg/pull/7479

