Posted to commits@iceberg.apache.org by gi...@apache.org on 2022/02/08 04:27:49 UTC

[iceberg-docs] branch asf-site updated: deploy: e217017751a2dfc8e9583084f525dcf7a43cf58c

This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/iceberg-docs.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 1957603  deploy: e217017751a2dfc8e9583084f525dcf7a43cf58c
1957603 is described below

commit 1957603e160a850c752742c2ac7a2e8cda8dd2d9
Author: jackye1995 <ja...@users.noreply.github.com>
AuthorDate: Tue Feb 8 04:25:00 2022 +0000

    deploy: e217017751a2dfc8e9583084f525dcf7a43cf58c
---
 common/index.xml                   |   5 +-
 how-to-verify-a-release/index.html |  16 +--
 index.xml                          |   8 +-
 multi-engine-support/index.html    | 206 +++++++++++++++++++++++++++++++++++++
 project/index.xml                  |   3 +-
 releases/index.html                | 138 ++++++++++++++++++++++---
 sitemap.xml                        |   2 +-
 spec/index.html                    |   4 +-
 8 files changed, 350 insertions(+), 32 deletions(-)

diff --git a/common/index.xml b/common/index.xml
index 4109a97..931cdd5 100644
--- a/common/index.xml
+++ b/common/index.xml
@@ -10,9 +10,8 @@ Author: Daniel Weeks
 Using Spark in EMR with Apache Iceberg Date: December 10th, 2021, Company: Tabular</description></item><item><title/><link>https://iceberg.apache.org/talks/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/talks/</guid><description>Iceberg Talks Here is a list of talks and other videos related to Iceberg.
 Expert Roundtable: The Future of Metadata After Hive Metastore Date: November 15, 2021, Authors: Lior Ebel, Seshu Adunuthula, Ryan Blue &amp;amp; Oz Katz
 Spark and Iceberg at Apple&amp;rsquo;s Scale - Leveraging differential files for efficient upserts and deletes Date: October 21, 2020, Author: Anton
-Apache Iceberg - A Table Format for Huge Analytic Datasets Date: October 21, 2020, Author: Ryan Blue</description></item><item><title/><link>https://iceberg.apache.org/releases/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/releases/</guid><description>Downloads The latest version of Iceberg is 0.12.1.
-0.12.1 source tar.gz &amp;ndash; signature &amp;ndash; sha512 0.12.1 Spark 3.0 runtime Jar 0.12.1 Spark 2.4 runtime Jar 0.12.1 Flink runtime Jar 0.12.1 Hive runtime Jar To use Iceberg in Spark, download the runtime JAR and add it to the jars folder of your Spark install. Use iceberg-spark3-runtime for Spark 3, and iceberg-spark-runtime for Spark 2.4.
-To use Iceberg in Hive, download the iceberg-hive-runtime JAR and add it to Hive using ADD JAR.</description></item><item><title/><link>https://iceberg.apache.org/spec/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/spec/</guid><description>Iceberg Table Spec This is a specification for the Iceberg table format that is designed to manage a large, slow-changing collection of files in a distributed file system or key-value store as a table.
+Apache Iceberg - A Table Format for Huge Analytic Datasets Date: October 21, 2020, Author: Ryan Blue</description></item><item><title/><link>https://iceberg.apache.org/releases/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/releases/</guid><description>Downloads The latest version of Iceberg is 0.13.0.
+0.13.0 source tar.gz &amp;ndash; signature &amp;ndash; sha512 0.13.0 Spark 3.2 runtime Jar 0.13.0 Spark 3.1 runtime Jar 0.13.0 Spark 3.0 runtime Jar 0.13.0 Spark 2.4 runtime Jar 0.13.0 Flink 1.14 runtime Jar 0.13.0 Flink 1.13 runtime Jar 0.13.0 Flink 1.12 runtime Jar 0.13.0 Hive runtime Jar To use Iceberg in Spark or Flink, download the runtime JAR for your engine version and add it to the jars folder of your installation.</description></item><item><title/><link>https://iceberg.apache.or [...]
 Format Versioning Versions 1 and 2 of the Iceberg spec are complete and adopted by the community.
 The format version number is incremented when new features are added that will break forward-compatibility&amp;mdash;that is, when older readers would not read newer table features correctly.</description></item><item><title/><link>https://iceberg.apache.org/terms/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/terms/</guid><description>Terms Snapshot A snapshot is the state of a table at some time.
 Each snapshot lists all of the data files that make up the table&amp;rsquo;s contents at the time of the snapshot. Data files are stored across multiple manifest files, and the manifests for a snapshot are listed in a single manifest list file.
diff --git a/how-to-verify-a-release/index.html b/how-to-verify-a-release/index.html
index a400170..1e36d27 100644
--- a/how-to-verify-a-release/index.html
+++ b/how-to-verify-a-release/index.html
@@ -95,13 +95,13 @@ verify signatures, checksums, and documentation.</p>
 <div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>curl https://dist.apache.org/repos/dist/dev/iceberg/KEYS -o KEYS
 gpg --import KEYS
 </code></pre></div><p>Next, verify the <code>.asc</code> file.</p>
-<div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>gpg --verify apache-iceberg-0.12.1.tar.gz.asc
+<div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>gpg --verify apache-iceberg-0.13.0.tar.gz.asc
 </code></pre></div><h3 id=verifying-checksums>Verifying Checksums</h3>
-<div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>shasum -a <span style=color:#ae81ff>512</span> --check apache-iceberg-0.12.1.tar.gz.sha512
+<div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>shasum -a <span style=color:#ae81ff>512</span> --check apache-iceberg-0.13.0.tar.gz.sha512
 </code></pre></div><h3 id=verifying-license-documentation>Verifying License Documentation</h3>
 <p>Untar the archive and change into the source directory.</p>
-<div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>tar xzf apache-iceberg-0.12.1.tar.gz
-cd apache-iceberg-0.12.1
+<div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>tar xzf apache-iceberg-0.13.0.tar.gz
+cd apache-iceberg-0.13.0
 </code></pre></div><p>Run RAT checks to validate license headers.</p>
 <div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>dev/check-license
 </code></pre></div><h3 id=verifying-build-and-test>Verifying Build and Test</h3>
@@ -133,7 +133,7 @@ Replace <code>${MAVEN_URL}</code> with the URL provided in the release announcem
 <p>To verify using Spark, start a <code>spark-shell</code> with a command like the following:</p>
 <div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>spark-shell <span style=color:#ae81ff>\
 </span><span style=color:#ae81ff></span>    --conf spark.jars.repositories<span style=color:#f92672>=</span><span style=color:#e6db74>${</span>MAVEN_URL<span style=color:#e6db74>}</span> <span style=color:#ae81ff>\
-</span><span style=color:#ae81ff></span>    --packages org.apache.iceberg:iceberg-spark3-runtime:0.12.1 <span style=color:#ae81ff>\
+</span><span style=color:#ae81ff></span>    --packages org.apache.iceberg:iceberg-spark3-runtime:0.13.0 <span style=color:#ae81ff>\
 </span><span style=color:#ae81ff></span>    --conf spark.sql.extensions<span style=color:#f92672>=</span>org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions <span style=color:#ae81ff>\
 </span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.local<span style=color:#f92672>=</span>org.apache.iceberg.spark.SparkCatalog <span style=color:#ae81ff>\
 </span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.local.type<span style=color:#f92672>=</span>hadoop <span style=color:#ae81ff>\
@@ -142,17 +142,17 @@ Replace <code>${MAVEN_URL}</code> with the URL provided in the release announcem
 </span><span style=color:#ae81ff></span>    --conf spark.sql.defaultCatalog<span style=color:#f92672>=</span>local
 </code></pre></div><h3 id=verifying-with-flink>Verifying with Flink</h3>
 <p>To verify using Flink, start a Flink SQL Client with the following command:</p>
-<div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>wget <span style=color:#e6db74>${</span>MAVEN_URL<span style=color:#e6db74>}</span>/iceberg-flink-runtime/0.12.1/iceberg-flink-runtime-0.12.1.jar
+<div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-bash data-lang=bash>wget <span style=color:#e6db74>${</span>MAVEN_URL<span style=color:#e6db74>}</span>/iceberg-flink-runtime/0.13.0/iceberg-flink-runtime-0.13.0.jar
 
 sql-client.sh embedded <span style=color:#ae81ff>\
-</span><span style=color:#ae81ff></span>    -j iceberg-flink-runtime-0.12.1.jar <span style=color:#ae81ff>\
+</span><span style=color:#ae81ff></span>    -j iceberg-flink-runtime-0.13.0.jar <span style=color:#ae81ff>\
 </span><span style=color:#ae81ff></span>    -j <span style=color:#e6db74>${</span>FLINK_CONNECTOR_PACKAGE<span style=color:#e6db74>}</span>-<span style=color:#e6db74>${</span>HIVE_VERSION<span style=color:#e6db74>}</span>_<span style=color:#e6db74>${</span>SCALA_VERSION<span style=color:#e6db74>}</span>-<span style=color:#e6db74>${</span>FLINK_VERSION<span style=color:#e6db74>}</span>.jar <span style=color:#ae81ff>\
 </span><span style=color:#ae81ff></span>    shell
 </code></pre></div><h2 id=voting>Voting</h2>
 <p>Votes are cast by replying to the release candidate announcement email on the dev mailing list
 with either <code>+1</code>, <code>0</code>, or <code>-1</code>.</p>
 <blockquote>
-<p>[ ] +1 Release this as Apache Iceberg 0.12.1
+<p>[ ] +1 Release this as Apache Iceberg 0.13.0
 [ ] +0
 [ ] -1 Do not release this because&mldr;</p>
 </blockquote>
diff --git a/index.xml b/index.xml
index c806753..ebda09f 100644
--- a/index.xml
+++ b/index.xml
@@ -10,16 +10,16 @@ Author: Daniel Weeks
 Using Spark in EMR with Apache Iceberg Date: December 10th, 2021, Company: Tabular</description></item><item><title/><link>https://iceberg.apache.org/talks/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/talks/</guid><description>Iceberg Talks Here is a list of talks and other videos related to Iceberg.
 Expert Roundtable: The Future of Metadata After Hive Metastore Date: November 15, 2021, Authors: Lior Ebel, Seshu Adunuthula, Ryan Blue &amp;amp; Oz Katz
 Spark and Iceberg at Apple&amp;rsquo;s Scale - Leveraging differential files for efficient upserts and deletes Date: October 21, 2020, Author: Anton
-Apache Iceberg - A Table Format for Huge Analytic Datasets Date: October 21, 2020, Author: Ryan Blue</description></item><item><title/><link>https://iceberg.apache.org/releases/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/releases/</guid><description>Downloads The latest version of Iceberg is 0.12.1.
-0.12.1 source tar.gz &amp;ndash; signature &amp;ndash; sha512 0.12.1 Spark 3.0 runtime Jar 0.12.1 Spark 2.4 runtime Jar 0.12.1 Flink runtime Jar 0.12.1 Hive runtime Jar To use Iceberg in Spark, download the runtime JAR and add it to the jars folder of your Spark install. Use iceberg-spark3-runtime for Spark 3, and iceberg-spark-runtime for Spark 2.4.
-To use Iceberg in Hive, download the iceberg-hive-runtime JAR and add it to Hive using ADD JAR.</description></item><item><title/><link>https://iceberg.apache.org/spec/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/spec/</guid><description>Iceberg Table Spec This is a specification for the Iceberg table format that is designed to manage a large, slow-changing collection of files in a distributed file system or key-value store as a table.
+Apache Iceberg - A Table Format for Huge Analytic Datasets Date: October 21, 2020, Author: Ryan Blue</description></item><item><title/><link>https://iceberg.apache.org/releases/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/releases/</guid><description>Downloads The latest version of Iceberg is 0.13.0.
+0.13.0 source tar.gz &amp;ndash; signature &amp;ndash; sha512 0.13.0 Spark 3.2 runtime Jar 0.13.0 Spark 3.1 runtime Jar 0.13.0 Spark 3.0 runtime Jar 0.13.0 Spark 2.4 runtime Jar 0.13.0 Flink 1.14 runtime Jar 0.13.0 Flink 1.13 runtime Jar 0.13.0 Flink 1.12 runtime Jar 0.13.0 Hive runtime Jar To use Iceberg in Spark or Flink, download the runtime JAR for your engine version and add it to the jars folder of your installation.</description></item><item><title/><link>https://iceberg.apache.or [...]
 Format Versioning Versions 1 and 2 of the Iceberg spec are complete and adopted by the community.
 The format version number is incremented when new features are added that will break forward-compatibility&amp;mdash;that is, when older readers would not read newer table features correctly.</description></item><item><title/><link>https://iceberg.apache.org/terms/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/terms/</guid><description>Terms Snapshot A snapshot is the state of a table at some time.
 Each snapshot lists all of the data files that make up the table&amp;rsquo;s contents at the time of the snapshot. Data files are stored across multiple manifest files, and the manifests for a snapshot are listed in a single manifest list file.
 Manifest list A manifest list is a metadata file that lists the manifests that make up a table snapshot.</description></item><item><title/><link>https://iceberg.apache.org/how-to-verify-a-release/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/how-to-verify-a-release/</guid><description>How to Verify a Release Each Apache Iceberg release is validated by the community by holding a vote. A community release manager will prepare a release candidate  [...]
 Running Benchmarks on GitHub It is possible to run one or more Benchmarks via the JMH Benchmarks GH action on your own fork of the Iceberg repo.</description></item><item><title>How To Release</title><link>https://iceberg.apache.org/how-to-release/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/how-to-release/</guid><description>Setup To create a release candidate, you will need:
 Apache LDAP credentials for Nexus and SVN A GPG key for signing, published in KEYS If you have not published your GPG key yet, you must publish it before sending the vote email by doing:
-svn co https://dist.apache.org/repos/dist/dev/iceberg icebergsvn cd icebergsvn echo &amp;#34;&amp;#34; &amp;gt;&amp;gt; KEYS # append a newline gpg --list-sigs &amp;lt;YOUR KEY ID HERE&amp;gt; &amp;gt;&amp;gt; KEYS # append signatures gpg --armor --export &amp;lt;YOUR KEY ID HERE&amp;gt; &amp;gt;&amp;gt; KEYS # append public key block svn commit -m &amp;#34;add key for &amp;lt;YOUR NAME HERE&amp;gt;&amp;#34; Nexus access Nexus credentials are configured in your personal ~/.</description> [...]
+svn co https://dist.apache.org/repos/dist/dev/iceberg icebergsvn cd icebergsvn echo &amp;#34;&amp;#34; &amp;gt;&amp;gt; KEYS # append a newline gpg --list-sigs &amp;lt;YOUR KEY ID HERE&amp;gt; &amp;gt;&amp;gt; KEYS # append signatures gpg --armor --export &amp;lt;YOUR KEY ID HERE&amp;gt; &amp;gt;&amp;gt; KEYS # append public key block svn commit -m &amp;#34;add key for &amp;lt;YOUR NAME HERE&amp;gt;&amp;#34; Nexus access Nexus credentials are configured in your personal ~/.</description> [...]
+Multi-Version Support Processing engine connectors maintained in the iceberg repository are built for multiple versions.</description></item><item><title>Roadmap</title><link>https://iceberg.apache.org/roadmap/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/roadmap/</guid><description>Roadmap Overview This roadmap outlines projects that the Iceberg community is working on, their priority, and a rough size estimate. This is based on the latest com [...]
 Priority 1 API: Iceberg 1.0.0 [medium] Spark: Merge-on-read plans [large] Maintenance: Delete file compaction [medium] Flink: Upgrade to 1.</description></item><item><title>Security</title><link>https://iceberg.apache.org/security/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/security/</guid><description>Reporting Security Issues The Apache Iceberg Project uses the standard process outlined by the Apache Security Team for reporting vulnerabilit [...]
 To report a possible security vulnerability, please email security@iceberg.apache.org.
 Verifying Signed Releases Please refer to the instructions on the Release Verification page.</description></item><item><title>Trademarks</title><link>https://iceberg.apache.org/trademarks/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/trademarks/</guid><description>Trademarks Apache Iceberg, Iceberg, Apache, the Apache feather logo, and the Apache Iceberg project logo are either registered trademarks or trademarks of The Apache Software Foundati [...]
\ No newline at end of file
diff --git a/multi-engine-support/index.html b/multi-engine-support/index.html
new file mode 100644
index 0000000..21db4d2
--- /dev/null
+++ b/multi-engine-support/index.html
@@ -0,0 +1,206 @@
+<!doctype html><html>
+<head>
+<meta charset=utf-8>
+<meta http-equiv=x-ua-compatible content="IE=edge">
+<meta name=viewport content="width=device-width,initial-scale=1">
+<meta name=description content>
+<meta name=author content>
+<base href=https://iceberg.apache.org/>
+<title>Multi-Engine Support</title>
+<link href=https://iceberg.apache.org//css/bootstrap.css rel=stylesheet>
+<link href=https://iceberg.apache.org//css/landing-page.css rel=stylesheet>
+<link href=https://iceberg.apache.org//css/markdown.css rel=stylesheet>
+<link href=https://iceberg.apache.org//css/katex.min.css rel=stylesheet>
+<link rel=stylesheet href=https://iceberg.apache.org//css/pricing.css>
+<link href=https://iceberg.apache.org//font-awesome-4.7.0/css/font-awesome.min.css rel=stylesheet type=text/css>
+<link href="//fonts.googleapis.com/css?family=Lato:300,400,700,300italic,400italic,700italic" rel=stylesheet type=text/css>
+<link href=https://iceberg.apache.org//css/termynal.css rel=stylesheet>
+</head>
+<body>
+<head>
+<link href=https://iceberg.apache.org//css/markdown.css rel=stylesheet>
+<link href=https://iceberg.apache.org//css/katex.min.css rel=stylesheet>
+</head>
+<nav class="navbar navbar-default navbar-fixed-top" role=navigation>
+<div class=container>
+<div class="navbar-header page-scroll">
+<button type=button class=navbar-toggle data-toggle=collapse data-target=#bs-example-navbar-collapse-1>
+<span class=sr-only>Toggle navigation</span>
+<span class=icon-bar></span>
+<span class=icon-bar></span>
+<span class=icon-bar></span>
+</button>
+<a class="page-scroll navbar-brand" href=#intro><img class=top-navbar-logo src=https://iceberg.apache.org//img/iceberg-logo-icon.png> Apache Iceberg</a>
+</div>
+<div class="collapse navbar-collapse" id=bs-example-navbar-collapse-1>
+<ul class="nav navbar-nav navbar-right">
+<li>
+<a class=page-scroll href=https://iceberg.apache.org/docs/latest>Docs</a>
+</li>
+<li>
+<a class=page-scroll href=https://iceberg.apache.org/docs/latest/getting-started/>Spark</a>
+</li>
+<li>
+<a class=page-scroll href=https://iceberg.apache.org/docs/latest/flink/>Flink</a>
+</li>
+<li>
+<a class=page-scroll href=https://trino.io/docs/current/connector/iceberg.html target=_blank>Trino</a>
+</li>
+<li>
+<a class=page-scroll href=https://prestodb.io/docs/current/connector/iceberg.html target=_blank>Presto</a>
+</li>
+<li>
+<a class=page-scroll href=https://iceberg.apache.org/releases>Releases</a>
+</li>
+<li>
+<a class=page-scroll href=https://iceberg.apache.org/roadmap>Roadmap</a>
+</li>
+<li>
+<a class=page-scroll href=https://iceberg.apache.org/community>Community</a>
+</li>
+<li>
+<a class=page-scroll href=https://iceberg.apache.org/blogs>Blogs</a>
+</li>
+<li>
+<a class=page-scroll href=https://iceberg.apache.org/talks>Talks</a>
+</li>
+</ul>
+</div>
+</div>
+</nav>
+<div class=pad-for-navbar></div>
+<body dir=" ltr">
+<div class=markdown-body>
+<h1 id=multi-engine-support>Multi-Engine Support</h1>
+<p>Apache Iceberg is an open standard for huge analytic tables that can be used by any processing engine.
+The community continuously improves Iceberg core library components to enable integrations with different compute engines that power analytics, business intelligence, machine learning, etc.
+Connectors for Spark, Flink and Hive are maintained in the main Iceberg repository.</p>
+<h2 id=multi-version-support>Multi-Version Support</h2>
+<p>Processing engine connectors maintained in the iceberg repository are built for multiple versions.</p>
+<p>For Spark and Flink, each new version that introduces a backwards-incompatible upgrade has its own dedicated integration codebase and release artifacts.
+For example, the code for Iceberg Spark 3.1 integration is under <code>/spark/v3.1</code> and the code for Iceberg Spark 3.2 integration is under <code>/spark/v3.2</code>.
+Different artifacts (<code>iceberg-spark-3.1_2.12</code> and <code>iceberg-spark-3.2_2.12</code>) are released for users to consume.
+By doing this, changes across versions are isolated.
+New features in Iceberg can be developed against the latest features of an engine without breaking support for older APIs in past engine versions.</p>
+<p>For Hive, Hive 2 uses the <code>iceberg-mr</code> package for Iceberg integration, and Hive 3 requires an additional dependency on the <code>iceberg-hive3</code> package.</p>
+<h3 id=runtime-jar>Runtime Jar</h3>
+<p>Iceberg provides a runtime connector Jar for each supported version of Spark, Flink and Hive.
+When using Iceberg with these engines, the runtime jar is the only Iceberg dependency that needs to be added to the classpath, alongside any vendor dependencies.
+For example, to use Iceberg with Spark 3.2 and AWS integrations, <code>iceberg-spark-runtime-3.2_2.12</code> and AWS SDK dependencies are needed for the Spark installation.</p>
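+<p>For example, a minimal sketch (assuming the artifacts are resolved from Maven Central; the AWS SDK bundle coordinates and version below are illustrative assumptions):</p>
+<pre tabindex=0><code># add the Iceberg Spark 3.2 runtime when launching spark-shell
+spark-shell \
+    --packages org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.0
+
+# for AWS integrations, append the AWS SDK dependencies as well, e.g.
+#   --packages org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.0,software.amazon.awssdk:bundle:2.17.131
+</code></pre>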
+<p>Spark and Flink provide different runtime jars for each supported engine version.
+Hive 2 and Hive 3 currently share the same runtime jar.
+The runtime jar names and latest version download links are listed in <a href=./multi-engine-support/#current-engine-version-lifecycle-status>the tables below</a>.</p>
+<h3 id=engine-version-lifecycle>Engine Version Lifecycle</h3>
+<p>Each engine version undergoes the following lifecycle stages:</p>
+<ol>
+<li><strong>Beta</strong>: a new engine version is supported, but still in the experimental stage. The engine version itself may still be in preview (e.g. Spark <code>3.0.0-preview</code>), or the integration may not yet have full feature parity with older versions. This stage allows Iceberg to release support for an engine version without waiting for feature parity, shortening the release time.</li>
+<li><strong>Maintained</strong>: an engine version is actively maintained by the community. Users can expect parity for most features across all the maintained versions. If a feature relies on new engine functionality that older versions don&rsquo;t have, feature parity across maintained versions is not guaranteed.</li>
+<li><strong>Deprecated</strong>: an engine version is no longer actively maintained. People who are still interested in the version can backport any necessary feature or bug fix from newer versions, but the community will not spend effort on achieving feature parity. Iceberg recommends that users move to a newer version. Contributions to a deprecated version are expected to diminish over time, until eventually no changes are added to it.</li>
+<li><strong>End-of-life</strong>: a vote can be initiated in the community to fully remove a deprecated version from the Iceberg repository, marking its end of life.</li>
+</ol>
+<h2 id=current-engine-version-lifecycle-status>Current Engine Version Lifecycle Status</h2>
+<h3 id=apache-spark>Apache Spark</h3>
+<p>Note that for backwards compatibility, the Spark 2.4 and 3.0 artifact names do not follow the naming convention of later versions.</p>
+<table>
+<thead>
+<tr>
+<th>Version</th>
+<th>Lifecycle Stage</th>
+<th>Runtime Artifact</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td>2.4</td>
+<td>Deprecated</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime/0.13.0/iceberg-spark-runtime-0.13.0.jar">iceberg-spark-runtime</a></td>
+</tr>
+<tr>
+<td>3.0</td>
+<td>Maintained</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark3-runtime/0.13.0/iceberg-spark3-runtime-0.13.0.jar">iceberg-spark3-runtime</a></td>
+</tr>
+<tr>
+<td>3.1</td>
+<td>Maintained</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime-3.1_2.12/0.13.0/iceberg-spark-runtime-3.1_2.12-0.13.0.jar">iceberg-spark-runtime-3.1_2.12</a></td>
+</tr>
+<tr>
+<td>3.2</td>
+<td>Maintained</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime-3.2_2.12/0.13.0/iceberg-spark-runtime-3.2_2.12-0.13.0.jar">iceberg-spark-runtime-3.2_2.12</a></td>
+</tr>
+</tbody>
+</table>
+<h3 id=apache-flink>Apache Flink</h3>
+<p>Following the Flink community&rsquo;s guideline, only the latest 2 minor versions are actively maintained.
+Users should continuously upgrade their Flink version to stay up-to-date.</p>
+<table>
+<thead>
+<tr>
+<th>Version</th>
+<th>Lifecycle Stage</th>
+<th>Runtime Artifact</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td>1.12</td>
+<td>Deprecated</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-flink-runtime-1.12/0.13.0/iceberg-flink-runtime-1.12-0.13.0.jar">iceberg-flink-runtime-1.12</a></td>
+</tr>
+<tr>
+<td>1.13</td>
+<td>Maintained</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-flink-runtime-1.13/0.13.0/iceberg-flink-runtime-1.13-0.13.0.jar">iceberg-flink-runtime-1.13</a></td>
+</tr>
+<tr>
+<td>1.14</td>
+<td>Maintained</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-flink-runtime-1.14/0.13.0/iceberg-flink-runtime-1.14-0.13.0.jar">iceberg-flink-runtime-1.14</a></td>
+</tr>
+</tbody>
+</table>
+<h3 id=apache-hive>Apache Hive</h3>
+<table>
+<thead>
+<tr>
+<th>Version</th>
+<th>Recommended minor version</th>
+<th>Lifecycle Stage</th>
+<th>Runtime Artifact</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td>2</td>
+<td>2.3.8</td>
+<td>Maintained</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-hive-runtime/0.13.0/iceberg-hive-runtime-0.13.0.jar">iceberg-hive-runtime</a></td>
+</tr>
+<tr>
+<td>3</td>
+<td>3.1.2</td>
+<td>Maintained</td>
+<td><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-hive-runtime/0.13.0/iceberg-hive-runtime-0.13.0.jar">iceberg-hive-runtime</a></td>
+</tr>
+</tbody>
+</table>
+<h2 id=developer-guide>Developer Guide</h2>
+<h3 id=maintaining-existing-engine-versions>Maintaining existing engine versions</h3>
+<p>Iceberg recommends the following for developers who are maintaining existing engine versions:</p>
+<ol>
+<li>New features should always be prioritized first in the latest version, which is either a maintained or beta version.</li>
+<li>For features that could be backported, contributors are encouraged to either perform backports to all maintained versions, or at least open issues to track the backports.</li>
+<li>If the change is small enough, updating all versions in a single PR is acceptable. Otherwise, using separate PRs for each version is recommended.</li>
+</ol>
+<h3 id=supporting-new-engines>Supporting new engines</h3>
+<p>Iceberg recommends that new engines build support by importing the Iceberg libraries into the engine&rsquo;s project.
+This allows the Iceberg support to evolve with the engine.
+Projects such as <a href=https://trino.io/docs/current/connector/iceberg.html>Trino</a> and <a href=https://prestodb.io/docs/current/connector/iceberg.html>Presto</a> are good examples of such a support strategy.</p>
+<p>In this approach, an Iceberg version upgrade is needed for an engine to consume new Iceberg features.
+To facilitate engine development against unreleased Iceberg features, a daily snapshot is published in the <a href=https://repository.apache.org/content/repositories/snapshots/org/apache/iceberg/>Apache snapshot repository</a>.</p>
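+<p>As a sketch of how such a snapshot could be consumed (the snapshot version string below is only an illustrative assumption), the snapshot repository can be added alongside the package coordinates when launching <code>spark-shell</code>:</p>
+<pre tabindex=0><code># resolve a nightly Iceberg build from the Apache snapshot repository
+spark-shell \
+    --conf spark.jars.repositories=https://repository.apache.org/content/repositories/snapshots \
+    --packages org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.14.0-SNAPSHOT
+</code></pre>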
+<p>If an engine needs to be brought directly into the main Iceberg repository, please raise a discussion thread in the <a href=../community>Iceberg community</a>.</p>
+</div>
+</body>
+</html>
\ No newline at end of file
diff --git a/project/index.xml b/project/index.xml
index 7d5d85e..aaa1080 100644
--- a/project/index.xml
+++ b/project/index.xml
@@ -1,7 +1,8 @@
 <?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Apache Iceberg</title><link>https://iceberg.apache.org/project/</link><description>Recent content on Apache Iceberg</description><generator>Hugo -- gohugo.io</generator><language>en-us</language><atom:link href="https://iceberg.apache.org/project/index.xml" rel="self" type="application/rss+xml"/><item><title>Benchmarks</title><link>https://iceberg.apache.org/ [...]
 Running Benchmarks on GitHub It is possible to run one or more Benchmarks via the JMH Benchmarks GH action on your own fork of the Iceberg repo.</description></item><item><title>How To Release</title><link>https://iceberg.apache.org/how-to-release/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/how-to-release/</guid><description>Setup To create a release candidate, you will need:
 Apache LDAP credentials for Nexus and SVN A GPG key for signing, published in KEYS If you have not published your GPG key yet, you must publish it before sending the vote email by doing:
-svn co https://dist.apache.org/repos/dist/dev/iceberg icebergsvn cd icebergsvn echo &amp;#34;&amp;#34; &amp;gt;&amp;gt; KEYS # append a newline gpg --list-sigs &amp;lt;YOUR KEY ID HERE&amp;gt; &amp;gt;&amp;gt; KEYS # append signatures gpg --armor --export &amp;lt;YOUR KEY ID HERE&amp;gt; &amp;gt;&amp;gt; KEYS # append public key block svn commit -m &amp;#34;add key for &amp;lt;YOUR NAME HERE&amp;gt;&amp;#34; Nexus access Nexus credentials are configured in your personal ~/.</description> [...]
+svn co https://dist.apache.org/repos/dist/dev/iceberg icebergsvn cd icebergsvn echo &amp;#34;&amp;#34; &amp;gt;&amp;gt; KEYS # append a newline gpg --list-sigs &amp;lt;YOUR KEY ID HERE&amp;gt; &amp;gt;&amp;gt; KEYS # append signatures gpg --armor --export &amp;lt;YOUR KEY ID HERE&amp;gt; &amp;gt;&amp;gt; KEYS # append public key block svn commit -m &amp;#34;add key for &amp;lt;YOUR NAME HERE&amp;gt;&amp;#34; Nexus access Nexus credentials are configured in your personal ~/.</description> [...]
+Multi-Version Support Processing engine connectors maintained in the iceberg repository are built for multiple versions.</description></item><item><title>Roadmap</title><link>https://iceberg.apache.org/roadmap/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/roadmap/</guid><description>Roadmap Overview This roadmap outlines projects that the Iceberg community is working on, their priority, and a rough size estimate. This is based on the latest com [...]
 Priority 1 API: Iceberg 1.0.0 [medium] Spark: Merge-on-read plans [large] Maintenance: Delete file compaction [medium] Flink: Upgrade to 1.</description></item><item><title>Security</title><link>https://iceberg.apache.org/security/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/security/</guid><description>Reporting Security Issues The Apache Iceberg Project uses the standard process outlined by the Apache Security Team for reporting vulnerabilit [...]
 To report a possible security vulnerability, please email security@iceberg.apache.org.
 Verifying Signed Releases Please refer to the instructions on the Release Verification page.</description></item><item><title>Trademarks</title><link>https://iceberg.apache.org/trademarks/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/trademarks/</guid><description>Trademarks Apache Iceberg, Iceberg, Apache, the Apache feather logo, and the Apache Iceberg project logo are either registered trademarks or trademarks of The Apache Software Foundati [...]
\ No newline at end of file
diff --git a/releases/index.html b/releases/index.html
index 35e8b9f..606f511 100644
--- a/releases/index.html
+++ b/releases/index.html
@@ -72,20 +72,24 @@
 <body dir=" ltr">
 <div class=markdown-body>
 <h2 id=downloads>Downloads</h2>
-<p>The latest version of Iceberg is <a href=https://github.com/apache/iceberg/releases/tag/apache-iceberg-0.12.1>0.12.1</a>.</p>
+<p>The latest version of Iceberg is <a href=https://github.com/apache/iceberg/releases/tag/apache-iceberg-0.13.0>0.13.0</a>.</p>
 <ul>
-<li><a href=https://www.apache.org/dyn/closer.cgi/iceberg/apache-iceberg-0.12.1/apache-iceberg-0.12.1.tar.gz>0.12.1 source tar.gz</a> &ndash; <a href=https://downloads.apache.org/iceberg/apache-iceberg-0.12.1/apache-iceberg-0.12.1.tar.gz.asc>signature</a> &ndash; <a href=https://downloads.apache.org/iceberg/apache-iceberg-0.12.1/apache-iceberg-0.12.1.tar.gz.sha512>sha512</a></li>
-<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark3-runtime/0.12.1/iceberg-spark3-runtime-0.12.1.jar">0.12.1 Spark 3.0 runtime Jar</a></li>
-<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime/0.12.1/iceberg-spark-runtime-0.12.1.jar">0.12.1 Spark 2.4 runtime Jar</a></li>
-<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-flink-runtime/0.12.1/iceberg-flink-runtime-0.12.1.jar">0.12.1 Flink runtime Jar</a></li>
-<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-hive-runtime/0.12.1/iceberg-hive-runtime-0.12.1.jar">0.12.1 Hive runtime Jar</a></li>
-</ul>
-<p>To use Iceberg in Spark, download the runtime JAR and add it to the jars folder of your Spark install. Use iceberg-spark3-runtime for Spark 3, and iceberg-spark-runtime for Spark 2.4.</p>
-<p>To use Iceberg in Hive, download the iceberg-hive-runtime JAR and add it to Hive using <code>ADD JAR</code>.</p>
+<li><a href=https://www.apache.org/dyn/closer.cgi/iceberg/apache-iceberg-0.13.0/apache-iceberg-0.13.0.tar.gz>0.13.0 source tar.gz</a> &ndash; <a href=https://downloads.apache.org/iceberg/apache-iceberg-0.13.0/apache-iceberg-0.13.0.tar.gz.asc>signature</a> &ndash; <a href=https://downloads.apache.org/iceberg/apache-iceberg-0.13.0/apache-iceberg-0.13.0.tar.gz.sha512>sha512</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime-3.2_2.12/0.13.0/iceberg-spark-runtime-3.2_2.12-0.13.0.jar">0.13.0 Spark 3.2 runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime-3.1_2.12/0.13.0/iceberg-spark-runtime-3.1_2.12-0.13.0.jar">0.13.0 Spark 3.1 runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark3-runtime/0.13.0/iceberg-spark3-runtime-0.13.0.jar">0.13.0 Spark 3.0 runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime/0.13.0/iceberg-spark-runtime-0.13.0.jar">0.13.0 Spark 2.4 runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-flink-runtime-1.14/0.13.0/iceberg-flink-runtime-1.14-0.13.0.jar">0.13.0 Flink 1.14 runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-flink-runtime-1.13/0.13.0/iceberg-flink-runtime-1.13-0.13.0.jar">0.13.0 Flink 1.13 runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-flink-runtime-1.12/0.13.0/iceberg-flink-runtime-1.12-0.13.0.jar">0.13.0 Flink 1.12 runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-hive-runtime/0.13.0/iceberg-hive-runtime-0.13.0.jar">0.13.0 Hive runtime Jar</a></li>
+</ul>
+<p>To use Iceberg in Spark or Flink, download the runtime JAR for your engine version and add it to the jars folder of your installation.</p>
+<p>To use Iceberg in Hive 2 or Hive 3, download the Hive runtime JAR and add it to Hive using <code>ADD JAR</code>.</p>
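+<p>For example, a minimal sketch using the download link above (the local path in the <code>ADD JAR</code> statement is a placeholder):</p>
+<pre tabindex=0><code># download the Hive runtime Jar
+curl -L -o iceberg-hive-runtime-0.13.0.jar \
+    "https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-hive-runtime/0.13.0/iceberg-hive-runtime-0.13.0.jar"
+
+# then register it from within a Hive CLI or Beeline session:
+#   ADD JAR /path/to/iceberg-hive-runtime-0.13.0.jar;
+</code></pre>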
 <h3 id=gradle>Gradle</h3>
 <p>To add a dependency on Iceberg in Gradle, add the following to <code>build.gradle</code>:</p>
 <pre tabindex=0><code>dependencies {
-  compile 'org.apache.iceberg:iceberg-core:0.12.1'
+  compile 'org.apache.iceberg:iceberg-core:0.13.0'
 }
 </code></pre><p>You may also want to include <code>iceberg-parquet</code> for Parquet file support.</p>
 <h3 id=maven>Maven</h3>
@@ -95,12 +99,121 @@
   &lt;dependency&gt;
     &lt;groupId&gt;org.apache.iceberg&lt;/groupId&gt;
     &lt;artifactId&gt;iceberg-core&lt;/artifactId&gt;
-    &lt;version&gt;0.12.1&lt;/version&gt;
+    &lt;version&gt;0.13.0&lt;/version&gt;
   &lt;/dependency&gt;
   ...
 &lt;/dependencies&gt;
-</code></pre><h2 id=0121-release-notes>0.12.1 Release Notes</h2>
+</code></pre><h2 id=0130-release-notes>0.13.0 Release Notes</h2>
+<p>Apache Iceberg 0.13.0 was released on February 4th, 2022.</p>
+<p><strong>High-level features:</strong></p>
+<ul>
+<li><strong>Core</strong>
+<ul>
+<li>Catalog caching now supports cache expiration through the catalog property <code>cache.expiration-interval-ms</code> [<a href=https://github.com/apache/iceberg/pull/3543>#3543</a>] (see the configuration sketch after this list)</li>
+<li>Catalog now supports registration of an Iceberg table from a given metadata file location [<a href=https://github.com/apache/iceberg/pull/3851>#3851</a>]</li>
+<li>Hadoop catalog can be used with S3 and other file systems safely by using a lock manager [<a href=https://github.com/apache/iceberg/pull/3663>#3663</a>]</li>
+</ul>
+</li>
+<li><strong>Vendor Integrations</strong>
+<ul>
+<li>Google Cloud Storage (GCS) <code>FileIO</code> is supported with optimized read and write using GCS streaming transfer [<a href=https://github.com/apache/iceberg/pull/3711>#3711</a>]</li>
+<li>Aliyun Object Storage Service (OSS) <code>FileIO</code> is supported [<a href=https://github.com/apache/iceberg/pull/3553>#3553</a>]</li>
+<li>Any S3-compatible storage (e.g. MinIO) can now be accessed through AWS <code>S3FileIO</code> with custom endpoint and credential configurations [<a href=https://github.com/apache/iceberg/pull/3656>#3656</a>] [<a href=https://github.com/apache/iceberg/pull/3658>#3658</a>]</li>
+<li>AWS <code>S3FileIO</code> now supports server-side checksum validation [<a href=https://github.com/apache/iceberg/pull/3813>#3813</a>]</li>
+<li>AWS <code>GlueCatalog</code> now displays more table information including table location, description [<a href=https://github.com/apache/iceberg/pull/3467>#3467</a>] and columns [<a href=https://github.com/apache/iceberg/pull/3888>#3888</a>]</li>
+<li>Using multiple <code>FileIO</code>s based on file path scheme is supported by configuring a <code>ResolvingFileIO</code> [<a href=https://github.com/apache/iceberg/pull/3593>#3593</a>]</li>
+</ul>
+</li>
+<li><strong>Spark</strong>
+<ul>
+<li>Spark 3.2 is supported [<a href=https://github.com/apache/iceberg/pull/3335>#3335</a>] with merge-on-read <code>DELETE</code> [<a href=https://github.com/apache/iceberg/pull/3970>#3970</a>]</li>
+<li><code>RewriteDataFiles</code> action now supports sort-based table optimization [<a href=https://github.com/apache/iceberg/pull/2829>#2829</a>] and merge-on-read delete compaction [<a href=https://github.com/apache/iceberg/pull/3454>#3454</a>]. The corresponding Spark call procedure <code>rewrite_data_files</code> is also supported [<a href=https://github.com/apache/iceberg/pull/3375>#3375</a>]</li>
+<li>Time travel queries now use the snapshot schema instead of the table&rsquo;s latest schema [<a href=https://github.com/apache/iceberg/pull/3722>#3722</a>]</li>
+<li>Spark vectorized reads now support row-level deletes [<a href=https://github.com/apache/iceberg/pull/3557>#3557</a>] [<a href=https://github.com/apache/iceberg/pull/3287>#3287</a>]</li>
+<li><code>add_files</code> procedure now skips duplicate files by default (can be turned off with the <code>check_duplicate_files</code> flag) [<a href=https://github.com/apache/iceberg/issues/2779>#2779</a>], skips folders without files [<a href=https://github.com/apache/iceberg/issues/3455>#3455</a>] and partitions with <code>null</code> values [<a href=https://github.com/apache/iceberg/issues/3778>#3778</a>] instead of throwing exceptions, and supports partition pruning for faster table [...]
+</ul>
+</li>
+<li><strong>Flink</strong>
+<ul>
+<li>Flink 1.13 and 1.14 are supported [<a href=https://github.com/apache/iceberg/pull/3116>#3116</a>] [<a href=https://github.com/apache/iceberg/pull/3434>#3434</a>]</li>
+<li>The Flink connector is now supported [<a href=https://github.com/apache/iceberg/pull/2666>#2666</a>]</li>
+<li>Upsert write option is supported [<a href=https://github.com/apache/iceberg/pull/2863>#2863</a>]</li>
+</ul>
+</li>
+<li><strong>Hive</strong>
+<ul>
+<li>Table listing in the Hive catalog can now skip non-Iceberg tables by disabling the <code>list-all-tables</code> flag [<a href=https://github.com/apache/iceberg/pull/3908>#3908</a>]</li>
+<li>Hive tables imported to Iceberg can now be read by <code>IcebergInputFormat</code> [<a href=https://github.com/apache/iceberg/pull/3312>#3312</a>]</li>
+</ul>
+</li>
+<li><strong>File Formats</strong>
+<ul>
+<li>Reading legacy Parquet files (e.g. produced by <code>ParquetHiveSerDe</code> or Spark with <code>spark.sql.parquet.writeLegacyFormat=true</code>) is now fully supported to facilitate Hive-to-Iceberg table migration [<a href=https://github.com/apache/iceberg/pull/3723>#3723</a>]</li>
+<li>ORC now supports writing delete files [<a href=https://github.com/apache/iceberg/pull/3248>#3248</a>] [<a href=https://github.com/apache/iceberg/pull/3250>#3250</a>] [<a href=https://github.com/apache/iceberg/pull/3366>#3366</a>]</li>
+</ul>
+</li>
+</ul>
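+<p>As referenced in the Core features above, here is a minimal configuration sketch for catalog cache expiration. It assumes the catalog property is passed through the <code>spark.sql.catalog.&lt;catalog-name&gt;.</code> prefix used elsewhere on this site; the warehouse path and the interval value are illustrative placeholders:</p>
+<pre tabindex=0><code># expire cached catalog entries after 10 minutes (600000 ms)
+spark-shell \
+    --packages org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.0 \
+    --conf spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog \
+    --conf spark.sql.catalog.local.type=hadoop \
+    --conf spark.sql.catalog.local.warehouse=/tmp/warehouse \
+    --conf spark.sql.catalog.local.cache.expiration-interval-ms=600000
+</code></pre>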
+<p><strong>Important bug fixes:</strong></p>
+<ul>
+<li><strong>Core</strong>
+<ul>
+<li>The root path for new Iceberg data files is now configured through <code>write.data.path</code>. <code>write.folder-storage.path</code> and <code>write.object-storage.path</code> are deprecated [<a href=https://github.com/apache/iceberg/pull/3094>#3094</a>]</li>
+<li>Catalog commit status is <code>UNKNOWN</code> instead of <code>FAILURE</code> when new metadata location cannot be found in snapshot history [<a href=https://github.com/apache/iceberg/pull/3717>#3717</a>]</li>
+<li>Dropping a table now also deletes old metadata files instead of leaving them stranded [<a href=https://github.com/apache/iceberg/pull/3622>#3622</a>]</li>
+<li><code>history</code> and <code>snapshots</code> metadata tables can now query tables with no current snapshot instead of returning empty results [<a href=https://github.com/apache/iceberg/pull/3812>#3812</a>]</li>
+</ul>
+</li>
+<li><strong>Vendor Integrations</strong>
+<ul>
+<li>Using cloud service integrations such as AWS <code>GlueCatalog</code> and <code>S3FileIO</code> no longer fails when Hadoop dependencies are missing from the execution environment [<a href=https://github.com/apache/iceberg/pull/3590>#3590</a>]</li>
+<li>AWS clients are now auto-closed when the related <code>FileIO</code> or <code>Catalog</code> is closed. There is no need to close the AWS clients separately [<a href=https://github.com/apache/iceberg/pull/2878>#2878</a>]</li>
+</ul>
+</li>
+<li><strong>Spark</strong>
+<ul>
+<li>For Spark >= 3.1, <code>REFRESH TABLE</code> can now be used with the Spark session catalog instead of throwing an exception [<a href=https://github.com/apache/iceberg/pull/3072>#3072</a>]</li>
+<li>Insert overwrite mode now skips partitions with zero records instead of failing the write operation [<a href=https://github.com/apache/iceberg/issues/2895>#2895</a>]</li>
+<li>The Spark snapshot expiration action now supports a custom <code>FileIO</code> instead of just <code>HadoopFileIO</code> [<a href=https://github.com/apache/iceberg/pull/3089>#3089</a>]</li>
+<li><code>REPLACE TABLE AS SELECT</code> can now work with tables whose columns have changed partition transforms. Each old partition field of the same column is converted to a void transform with a different name [<a href=https://github.com/apache/iceberg/issues/3421>#3421</a>]</li>
+<li>Spark SQL filters containing binary or fixed literals can now be pushed down instead of throwing an exception [<a href=https://github.com/apache/iceberg/pull/3728>#3728</a>]</li>
+</ul>
+</li>
+<li><strong>Flink</strong>
+<ul>
+<li>A <code>ValidationException</code> will be thrown if a user configures both <code>catalog-type</code> and <code>catalog-impl</code>. Previously, <code>catalog-type</code> silently took precedence. The new behavior makes Flink consistent with Spark and Hive [<a href=https://github.com/apache/iceberg/issues/3308>#3308</a>]</li>
+<li>Changelog tables can now be queried without <code>RowData</code> serialization issues [<a href=https://github.com/apache/iceberg/pull/3240>#3240</a>]</li>
+<li>The <code>java.sql.Time</code> data type can now be written without data overflow problems [<a href=https://github.com/apache/iceberg/pull/3740>#3740</a>]</li>
+<li>Avro position delete files can now be read without encountering <code>NullPointerException</code> [<a href=https://github.com/apache/iceberg/pull/3540>#3540</a>]</li>
+</ul>
+</li>
+<li><strong>Hive</strong>
+<ul>
+<li>The Hive catalog can now be initialized with a <code>null</code> Hadoop configuration instead of throwing an exception [<a href=https://github.com/apache/iceberg/pull/3252>#3252</a>]</li>
+<li>Table creation can now succeed instead of throwing an exception when some columns do not have comments [<a href=https://github.com/apache/iceberg/pull/3531>#3531</a>]</li>
+</ul>
+</li>
+<li><strong>File Formats</strong>
+<ul>
+<li>A Parquet file writing issue is fixed for string data with more than 16 unparseable characters (e.g. high/low surrogates) [<a href=https://github.com/apache/iceberg/pull/3760>#3760</a>]</li>
+<li>ORC vectorized read is now configured using <code>read.orc.vectorization.batch-size</code> instead of <code>read.parquet.vectorization.batch-size</code> [<a href=https://github.com/apache/iceberg/pull/3133>#3133</a>]</li>
+</ul>
+</li>
+</ul>
+<p><strong>Other notable changes:</strong></p>
+<ul>
+<li>The community has finalized the long-term strategy for Spark, Flink and Hive support. See the <a href=../multi-engine-support>Multi-Engine Support</a> page for more details.</li>
+</ul>
+<h2 id=past-releases>Past releases</h2>
+<h3 id=0121>0.12.1</h3>
 <p>Apache Iceberg 0.12.1 was released on November 8th, 2021.</p>
+<ul>
+<li>Git tag: <a href=https://github.com/apache/iceberg/releases/tag/apache-iceberg-0.12.1>0.12.1</a></li>
+<li><a href=https://www.apache.org/dyn/closer.cgi/iceberg/apache-iceberg-0.12.1/apache-iceberg-0.12.1.tar.gz>0.12.1 source tar.gz</a> &ndash; <a href=https://downloads.apache.org/iceberg/apache-iceberg-0.12.1/apache-iceberg-0.12.1.tar.gz.asc>signature</a> &ndash; <a href=https://downloads.apache.org/iceberg/apache-iceberg-0.12.1/apache-iceberg-0.12.1.tar.gz.sha512>sha512</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark3-runtime/0.12.1/iceberg-spark3-runtime-0.12.1.jar">0.12.1 Spark 3.x runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime/0.12.1/iceberg-spark-runtime-0.12.1.jar">0.12.1 Spark 2.4 runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-flink-runtime/0.12.1/iceberg-flink-runtime-0.12.1.jar">0.12.1 Flink runtime Jar</a></li>
+<li><a href="https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-hive-runtime/0.12.1/iceberg-hive-runtime-0.12.1.jar">0.12.1 Hive runtime Jar</a></li>
+</ul>
 <p>Important bug fixes and changes:</p>
 <ul>
 <li><a href=https://github.com/apache/iceberg/pull/3258>#3264</a> fixes validation failures that occurred after snapshot expiration when writing Flink CDC streams to Iceberg tables.</li>
@@ -114,7 +227,6 @@
 <li><a href=https://github.com/apache/iceberg/pull/3332>#3332</a> fixes importing ORC files with float or double columns in <code>add_files</code>.</li>
 </ul>
 <p>A more exhaustive list of changes is available under the <a href="https://github.com/apache/iceberg/milestone/15?closed=1">0.12.1 release milestone</a>.</p>
-<h2 id=past-releases>Past releases</h2>
 <h3 id=0120>0.12.0</h3>
 <p>Apache Iceberg 0.12.0 was released on August 15, 2021. It consists of 395 commits authored by 74 contributors over a 139 day period.</p>
 <ul>
diff --git a/sitemap.xml b/sitemap.xml
index 7619089..705e416 100644
--- a/sitemap.xml
+++ b/sitemap.xml
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>https://iceberg.apache.org/services/expressive-sql/</loc></url><url><loc>https://iceberg.apache.org/services/schema-evolution/</loc></url><url><loc>https://iceberg.apache.org/services/hidden-partitioning/</loc></url><url><loc>https://iceberg.apache.org/services/time-travel/</loc></url><url><loc>https://iceberg.apache.org/s [...]
\ No newline at end of file
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>https://iceberg.apache.org/services/expressive-sql/</loc></url><url><loc>https://iceberg.apache.org/services/schema-evolution/</loc></url><url><loc>https://iceberg.apache.org/services/hidden-partitioning/</loc></url><url><loc>https://iceberg.apache.org/services/time-travel/</loc></url><url><loc>https://iceberg.apache.org/s [...]
\ No newline at end of file
diff --git a/spec/index.html b/spec/index.html
index 2ef6103..ef357da 100644
--- a/spec/index.html
+++ b/spec/index.html
@@ -1347,7 +1347,7 @@ Retention policy can be configured both globally and on snapshot reference throu
 </ol>
 <p>Notes:</p>
 <ol>
-<li>The file system table scheme is implemented in <a href=../../../javadoc/0.12.1/index.html?org/apache/iceberg/hadoop/HadoopTableOperations.html>HadoopTableOperations</a>.</li>
+<li>The file system table scheme is implemented in <a href=../../../javadoc/0.13.0/index.html?org/apache/iceberg/hadoop/HadoopTableOperations.html>HadoopTableOperations</a>.</li>
 </ol>
 <h4 id=metastore-tables>Metastore Tables</h4>
 <p>The atomic swap needed to commit new versions of table metadata can be implemented by storing a pointer in a metastore or database that is updated with a check-and-put operation [1]. The check-and-put validates that the version of the table that a write is based on is still current and then makes the new metadata from the write the current version.</p>
@@ -1364,7 +1364,7 @@ Retention policy can be configured both globally and on snapshot reference throu
 </ol>
 <p>Notes:</p>
 <ol>
-<li>The metastore table scheme is partly implemented in <a href=../../../javadoc/0.12.1/index.html?org/apache/iceberg/BaseMetastoreTableOperations.html>BaseMetastoreTableOperations</a>.</li>
+<li>The metastore table scheme is partly implemented in <a href=../../../javadoc/0.13.0/index.html?org/apache/iceberg/BaseMetastoreTableOperations.html>BaseMetastoreTableOperations</a>.</li>
 </ol>
 <h3 id=delete-formats>Delete Formats</h3>
 <p>This section details how to encode row-level deletes in Iceberg delete files. Row-level deletes are not supported in v1.</p>