Posted to commits@iceberg.apache.org by gi...@apache.org on 2022/10/18 18:06:15 UTC

[iceberg-docs] branch asf-site updated: deploy: 0b0e3d887717ef8f258482dd4fbf785b80c08fed

This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/iceberg-docs.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new bd16bf4f deploy: 0b0e3d887717ef8f258482dd4fbf785b80c08fed
bd16bf4f is described below

commit bd16bf4f176e9652e86f3fa9c028f476b3737c6d
Author: Fokko <Fo...@users.noreply.github.com>
AuthorDate: Tue Oct 18 18:06:10 2022 +0000

    deploy: 0b0e3d887717ef8f258482dd4fbf785b80c08fed
---
 docs/latest/api/index.html                        | 12 ++---
 docs/latest/aws/index.html                        | 30 +++++++++---
 docs/latest/configuration/index.html              |  8 +--
 docs/latest/custom-catalog/index.html             |  4 +-
 docs/latest/dell/index.html                       |  4 +-
 docs/latest/docssearch.json                       |  2 +-
 docs/latest/evolution/index.html                  |  4 +-
 docs/latest/flink-connector/index.html            |  6 +--
 docs/latest/flink/index.html                      | 14 +++---
 docs/latest/getting-started/index.html            | 10 ++--
 docs/latest/hive/index.html                       |  8 +--
 docs/latest/index.html                            |  4 +-
 docs/latest/index.xml                             | 10 ++--
 docs/latest/java-api-quickstart/index.html        |  6 +--
 docs/latest/jdbc/index.html                       |  6 +--
 docs/latest/maintenance/index.html                | 14 +++---
 docs/latest/nessie/index.html                     | 10 ++--
 docs/latest/partitioning/index.html               |  4 +-
 docs/latest/performance/index.html                |  4 +-
 docs/latest/python-api-intro/index.html           |  4 +-
 docs/latest/python-feature-support/index.html     |  4 +-
 docs/latest/python-quickstart/index.html          |  4 +-
 docs/latest/reliability/index.html                |  4 +-
 docs/latest/schemas/index.html                    |  4 +-
 docs/latest/sitemap.xml                           |  2 +-
 docs/latest/spark-configuration/index.html        |  8 +--
 docs/latest/spark-ddl/index.html                  | 10 ++--
 docs/latest/spark-procedures/index.html           | 12 ++---
 docs/latest/spark-queries/index.html              | 59 +++++++++++++----------
 docs/latest/spark-structured-streaming/index.html |  4 +-
 docs/latest/spark-writes/index.html               |  4 +-
 31 files changed, 151 insertions(+), 128 deletions(-)

diff --git a/docs/latest/api/index.html b/docs/latest/api/index.html
index 73a4ee45..106f954b 100644
--- a/docs/latest/api/index.html
+++ b/docs/latest/api/index.html
@@ -3,17 +3,17 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Integrations class=collapse><ul class=sub-menu><li><a href=../aws/>AWS</a></li><li><a href=../dell/>Dell</a></li><li><a href=../jdbc/>JDBC</a></li><li><a href=../nessie/>Nessie</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-parent=full href=#API><span>API</span>
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=API class="collapse in"><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a id=active href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/ [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=API class="collapse in"><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a id=active href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/ [...]
 </span></span></code></pre></div><p>To configure a scan, call <code>filter</code> and <code>select</code> on the <code>TableScan</code> to get a new <code>TableScan</code> with those changes.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-java data-lang=java><span style=display:flex><span>TableScan filteredScan <span style=color:#f92672>=</span> scan<span style=color:#f92672>.</span><span  [...]
 </span></span></code></pre></div><p>Calls to configuration methods create a new <code>TableScan</code> so that each <code>TableScan</code> is immutable and won&rsquo;t change unexpectedly if shared across threads.</p><p>When a scan is configured, <code>planFiles</code>, <code>planTasks</code>, and <code>schema</code> are used to return files, tasks, and the read projection.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4; [...]
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>filter</span><span style=color:#f92672>(</span>Expressions<span style=color:#f92672>.</span><span style=color:#a6e22e>equal</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;id&#34;</span><span style=color:#f92672>,</span> 5<span style=color:#f92672>))</span>
@@ -26,7 +26,7 @@
 </span></span></code></pre></div><p>When a scan is configured, call the <code>build</code> method to execute the scan. <code>build</code> returns a <code>CloseableIterable&lt;Record></code>.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-java data-lang=java><span style=display:flex><span>CloseableIterable<span style=color:#f92672>&lt;</span>Record<span style=color:#f92672>&gt;</span> result <span styl [...]
 </span></span><span style=display:flex><span>        <span style=color:#f92672>.</span><span style=color:#a6e22e>where</span><span style=color:#f92672>(</span>Expressions<span style=color:#f92672>.</span><span style=color:#a6e22e>lessThan</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;id&#34;</span><span style=color:#f92672>,</span> 5<span style=color:#f92672>))</span>
 </span></span><span style=display:flex><span>        <span style=color:#f92672>.</span><span style=color:#a6e22e>build</span><span style=color:#f92672>();</span>
-</span></span></code></pre></div><p>where <code>Record</code> is Iceberg record for iceberg-data module <code>org.apache.iceberg.data.Record</code>.</p><h3 id=update-operations>Update operations</h3><p><code>Table</code> also exposes operations that update the table. These operations use a builder pattern, <a href=../../../javadoc/0.14.1/index.html?org/apache/iceberg/PendingUpdate.html><code>PendingUpdate</code></a>, that commits when <code>PendingUpdate#commit</code> is called.</p><p>Fo [...]
+</span></span></code></pre></div><p>where <code>Record</code> is Iceberg record for iceberg-data module <code>org.apache.iceberg.data.Record</code>.</p><h3 id=update-operations>Update operations</h3><p><code>Table</code> also exposes operations that update the table. These operations use a builder pattern, <a href=../../../javadoc/1.0.0/index.html?org/apache/iceberg/PendingUpdate.html><code>PendingUpdate</code></a>, that commits when <code>PendingUpdate#commit</code> is called.</p><p>For [...]
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>addColumn</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;count&#34;</span><span style=color:#f92672>,</span> Types<span style=color:#f92672>.</span><span style=color:#a6e22e>LongType</span><span style=color:#f92672>.</span><span style=color:#a6e22e>get</span><span style=color:#f92672>())</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>commit</span><span style=color:#f92672>();</span>
 </span></span></code></pre></div><p>Available operations to update a table are:</p><ul><li><code>updateSchema</code> &ndash; update the table schema</li><li><code>updateProperties</code> &ndash; update table properties</li><li><code>updateLocation</code> &ndash; update the table&rsquo;s base location</li><li><code>newAppend</code> &ndash; used to append data files</li><li><code>newFastAppend</code> &ndash; used to append data files, will not compact metadata</li><li><code>newOverwrite</c [...]
@@ -37,7 +37,7 @@
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span><span style=color:#75715e>// commit all the changes to the table
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>t<span style=color:#f92672>.</span><span style=color:#a6e22e>commitTransaction</span><span style=color:#f92672>();</span>
-</span></span></code></pre></div><h2 id=types>Types</h2><p>Iceberg data types are located in the <a href=../../../javadoc/0.14.1/index.html?org/apache/iceberg/types/package-summary.html><code>org.apache.iceberg.types</code> package</a>.</p><h3 id=primitives>Primitives</h3><p>Primitive type instances are available from static methods in each type class. Types without parameters use <code>get</code>, and types like <code>decimal</code> use factory methods:</p><div class=highlight><pre tabi [...]
+</span></span></code></pre></div><h2 id=types>Types</h2><p>Iceberg data types are located in the <a href=../../../javadoc/1.0.0/index.html?org/apache/iceberg/types/package-summary.html><code>org.apache.iceberg.types</code> package</a>.</p><h3 id=primitives>Primitives</h3><p>Primitive type instances are available from static methods in each type class. Types without parameters use <code>get</code>, and types like <code>decimal</code> use factory methods:</p><div class=highlight><pre tabin [...]
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>Types<span style=color:#f92672>.</span><span style=color:#a6e22e>DoubleType</span><span style=color:#f92672>.</span><span style=color:#a6e22e>get</span><span style=color:#f92672>()</span>     <span style=color:#75715e>// double
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>Types<span style=color:#f92672>.</span><span style=color:#a6e22e>DecimalType</span><span style=color:#f92672>.</span><span style=color:#a6e22e>of</span><span style=color:#f92672>(</span>9<span style=color:#f92672>,</span> 2<span style=color:#f92672>)</span> <span style=color:#75715e>// decimal(9, 2)
 </span></span></span></code></pre></div><h3 id=nested-types>Nested types</h3><p>Structs, maps, and lists are created using factory methods in type classes.</p><p>Like struct fields, map keys or values and list elements are tracked as nested fields. Nested fields track <a href=../evolution#correctness>field IDs</a> and nullability.</p><p>Struct fields are created using <code>NestedField.optional</code> or <code>NestedField.required</code>. Map value and list element nullability is set in  [...]
@@ -53,7 +53,7 @@
 </span></span><span style=display:flex><span>  <span style=color:#f92672>)</span>
 </span></span></code></pre></div><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-java data-lang=java><span style=display:flex><span><span style=color:#75715e>// array&lt;1 element: int&gt;
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>ListType list <span style=color:#f92672>=</span> ListType<span style=color:#f92672>.</span><span style=color:#a6e22e>ofRequired</span><span style=color:#f92672>(</span>1<span style=color:#f92672>,</span> IntegerType<span style=color:#f92672>.</span><span style=color:#a6e22e>get</span><span style=color:#f92672>());</span>
-</span></span></code></pre></div><h2 id=expressions>Expressions</h2><p>Iceberg&rsquo;s expressions are used to configure table scans. To create expressions, use the factory methods in <a href=../../../javadoc/0.14.1/index.html?org/apache/iceberg/expressions/Expressions.html><code>Expressions</code></a>.</p><p>Supported predicate expressions are:</p><ul><li><code>isNull</code></li><li><code>notNull</code></li><li><code>equal</code></li><li><code>notEqual</code></li><li><code>lessThan</cod [...]
+</span></span></code></pre></div><h2 id=expressions>Expressions</h2><p>Iceberg&rsquo;s expressions are used to configure table scans. To create expressions, use the factory methods in <a href=../../../javadoc/1.0.0/index.html?org/apache/iceberg/expressions/Expressions.html><code>Expressions</code></a>.</p><p>Supported predicate expressions are:</p><ul><li><code>isNull</code></li><li><code>notNull</code></li><li><code>equal</code></li><li><code>notEqual</code></li><li><code>lessThan</code [...]
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>filter</span><span style=color:#f92672>(</span>Expressions<span style=color:#f92672>.</span><span style=color:#a6e22e>greaterThanOrEqual</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;x&#34;</span><span style=color:#f92672>,</span> 5<span style=color:#f92672>))</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>filter</span><span style=color:#f92672>(</span>Expressions<span style=color:#f92672>.</span><span style=color:#a6e22e>lessThan</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;x&#34;</span><span style=color:#f92672>,</span> 10<span style=color:#f92672>))</span>
 </span></span></code></pre></div><h2 id=modules>Modules</h2><p>Iceberg table support is organized in library modules:</p><ul><li><code>iceberg-common</code> contains utility classes used in other modules</li><li><code>iceberg-api</code> contains the public Iceberg API, including expressions, types, tables, and operations</li><li><code>iceberg-arrow</code> is an implementation of the Iceberg type system for reading and writing data stored in Iceberg tables using Apache Arrow as the in-mem [...]
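
The api page updated above walks through TableScan configuration, IcebergGenerics reads, and the PendingUpdate builders. The following is a minimal Java sketch of those calls taken together; loading the table through HadoopTables and the "events" table location are assumptions made only for illustration, since any catalog that returns an org.apache.iceberg.Table works the same way.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.TableScan;
    import org.apache.iceberg.data.IcebergGenerics;
    import org.apache.iceberg.data.Record;
    import org.apache.iceberg.expressions.Expressions;
    import org.apache.iceberg.hadoop.HadoopTables;
    import org.apache.iceberg.io.CloseableIterable;
    import org.apache.iceberg.types.Types;

    public class TableScanSketch {
      public static void main(String[] args) throws Exception {
        // Placeholder location; any Catalog implementation could supply the Table instead.
        Table table = new HadoopTables(new Configuration())
            .load("hdfs://nn:8020/warehouse/db/events");

        // Each filter/select call returns a new, immutable TableScan.
        TableScan filteredScan = table.newScan()
            .filter(Expressions.equal("id", 5))
            .select("id", "data");
        System.out.println(filteredScan.schema().asStruct());  // read projection

        // IcebergGenerics builds a CloseableIterable<Record> over the table.
        try (CloseableIterable<Record> result = IcebergGenerics.read(table)
            .where(Expressions.lessThan("id", 5))
            .build()) {
          result.forEach(System.out::println);
        }

        // Update operations use PendingUpdate builders; commit() applies the change.
        table.updateSchema()
            .addColumn("count", Types.LongType.get())
            .commit();
      }
    }
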
diff --git a/docs/latest/aws/index.html b/docs/latest/aws/index.html
index c1e32833..354ab9cc 100644
--- a/docs/latest/aws/index.html
+++ b/docs/latest/aws/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
@@ -25,7 +25,7 @@ To choose a different HTTP client library such as <a href=https://mvnrepository.
 see the section <a href=#aws-client-customization>client customization</a> for more details.</p><p>All the AWS module features can be loaded through custom catalog properties,
 you can go to the documentation of each engine to see how to load a custom catalog.
 Here are some examples.</p><h3 id=spark>Spark</h3><p>For example, to use AWS features with Spark 3.0 and AWS clients version 2.17.257, you can start the Spark SQL shell with:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sh data-lang=sh><span style=display:flex><span><span style=color:#75715e># add Iceberg dependency</span>
-</span></span><span style=display:flex><span>ICEBERG_VERSION<span style=color:#f92672>=</span>0.14.1
+</span></span><span style=display:flex><span>ICEBERG_VERSION<span style=color:#f92672>=</span>1.0.0
 </span></span><span style=display:flex><span>DEPENDENCIES<span style=color:#f92672>=</span><span style=color:#e6db74>&#34;org.apache.iceberg:iceberg-spark3-runtime:</span>$ICEBERG_VERSION<span style=color:#e6db74>&#34;</span>
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span><span style=color:#75715e># add AWS dependency</span>
@@ -46,7 +46,7 @@ Here are some examples.</p><h3 id=spark>Spark</h3><p>For example, to use AWS fea
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog.catalog-impl<span style=color:#f92672>=</span>org.apache.iceberg.aws.glue.GlueCatalog <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog.io-impl<span style=color:#f92672>=</span>org.apache.iceberg.aws.s3.S3FileIO
 </span></span></code></pre></div><p>As you can see, in the shell command we use <code>--packages</code> to specify the additional AWS bundle and HTTP client dependencies, with their version set to <code>2.17.257</code>.</p><h3 id=flink>Flink</h3><p>To use the AWS module with Flink, you can download the necessary dependencies and specify them when starting the Flink SQL client:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-s [...]
-</span></span><span style=display:flex><span>ICEBERG_VERSION<span style=color:#f92672>=</span>0.14.1
+</span></span><span style=display:flex><span>ICEBERG_VERSION<span style=color:#f92672>=</span>1.0.0
 </span></span><span style=display:flex><span>MAVEN_URL<span style=color:#f92672>=</span>https://repo1.maven.org/maven2
 </span></span><span style=display:flex><span>ICEBERG_MAVEN_URL<span style=color:#f92672>=</span>$MAVEN_URL/org/apache/iceberg
 </span></span><span style=display:flex><span>wget $ICEBERG_MAVEN_URL/iceberg-flink-runtime/$ICEBERG_VERSION/iceberg-flink-runtime-$ICEBERG_VERSION.jar
@@ -196,7 +196,15 @@ For example, to add S3 delete tags with Spark 3.0, you can start the Spark SQL s
     --conf spark.sql.catalog.my_catalog.s3.delete.tags.my_key3=my_val3 \
     --conf spark.sql.catalog.my_catalog.s3.delete-enabled=false
 </code></pre><p>For the above example, the objects in S3 will be saved with tags: <code>my_key3=my_val3</code> before deletion.
-Users can also use the catalog property <code>s3.delete.num-threads</code> to mention the number of threads to be used for adding delete tags to the S3 objects.</p><p>For more details on tag restrictions, please refer <a href=https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/allocation-tag-restrictions.html>User-Defined Tag Restrictions</a>.</p><h3 id=s3-access-points>S3 Access Points</h3><p><a href=https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-access-points.html [...]
+Users can also use the catalog property <code>s3.delete.num-threads</code> to specify the number of threads used for adding delete tags to the S3 objects.</p><p>When the catalog properties <code>s3.write.table-tag-enabled</code> and <code>s3.write.namespace-tag-enabled</code> are set to <code>true</code>, the objects in S3 will be saved with the tags <code>iceberg.table=&lt;table-name></code> and <code>iceberg.namespace=&lt;namespace-name></code>.
+Users can define access and data retention policies per namespace or table based on these tags.
+For example, to write table and namespace name as S3 tags with Spark 3.0, you can start the Spark SQL shell with:</p><pre tabindex=0><code>sh spark-sql --conf spark.sql.catalog.my_catalog=org.apache.iceberg.spark.SparkCatalog \
+    --conf spark.sql.catalog.my_catalog.warehouse=s3://iceberg-warehouse/s3-tagging \
+    --conf spark.sql.catalog.my_catalog.catalog-impl=org.apache.iceberg.aws.glue.GlueCatalog \
+    --conf spark.sql.catalog.my_catalog.io-impl=org.apache.iceberg.aws.s3.S3FileIO \
+    --conf spark.sql.catalog.my_catalog.s3.write.table-tag-enabled=true \
+    --conf spark.sql.catalog.my_catalog.s3.write.namespace-tag-enabled=true
+</code></pre><p>For more details on tag restrictions, please refer to <a href=https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/allocation-tag-restrictions.html>User-Defined Tag Restrictions</a>.</p><h3 id=s3-access-points>S3 Access Points</h3><p><a href=https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-access-points.html>Access Points</a> can be used to perform
 S3 operations by specifying a mapping of bucket to access points. This is useful for multi-region access, cross-region access,
 disaster recovery, etc.</p><p>To use cross-region access points, we additionally need to set the <code>use-arn-region-enabled</code> catalog property to
 <code>true</code> so that <code>S3FileIO</code> can make cross-region calls; it is not required for same-region or multi-region access points.</p><p>For example, to use an S3 access point with Spark 3.0, you can start the Spark SQL shell with:</p><pre tabindex=0><code>spark-sql --conf spark.sql.catalog.my_catalog=org.apache.iceberg.spark.SparkCatalog \
@@ -224,13 +232,19 @@ In this case, a <a href=https://docs.aws.amazon.com/IAM/latest/UserGuide/id_role
 Iceberg provides an AWS client factory <code>AssumeRoleAwsClientFactory</code> to support this common use case.
 This also serves as an example for users who would like to implement their own AWS client factory.</p><p>This client factory has the following configurable catalog properties:</p><table><thead><tr><th>Property</th><th>Default</th><th>Description</th></tr></thead><tbody><tr><td>client.assume-role.arn</td><td>null, requires user input</td><td>ARN of the role to assume, e.g. arn:aws:iam::123456789:role/myRoleToAssume</td></tr><tr><td>client.assume-role.region</td><td>null, requires user inp [...]
 The Glue, S3 and DynamoDB clients are then initialized with the assume-role credential and region to access resources.
-Here is an example to start Spark shell with this client factory:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-shell data-lang=shell><span style=display:flex><span>spark-sql --packages org.apache.iceberg:iceberg-spark3-runtime:0.14.1,software.amazon.awssdk:bundle:2.17.257 <span style=color:#ae81ff>\
+Here is an example to start Spark shell with this client factory:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-shell data-lang=shell><span style=display:flex><span>spark-sql --packages org.apache.iceberg:iceberg-spark3-runtime:1.0.0,software.amazon.awssdk:bundle:2.17.257 <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog<span style=color:#f92672>=</span>org.apache.iceberg.spark.SparkCatalog <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog.warehouse<span style=color:#f92672>=</span>s3://my-bucket/my/key/prefix <span style=color:#ae81ff>\ </span>   
 </span></span><span style=display:flex><span>    --conf spark.sql.catalog.my_catalog.catalog-impl<span style=color:#f92672>=</span>org.apache.iceberg.aws.glue.GlueCatalog <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog.client.factory<span style=color:#f92672>=</span>org.apache.iceberg.aws.AssumeRoleAwsClientFactory <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog.client.assume-role.arn<span style=color:#f92672>=</span>arn:aws:iam::123456789:role/myRoleToAssume <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog.client.assume-role.region<span style=color:#f92672>=</span>ap-northeast-1
+</span></span></code></pre></div><h3 id=http-client-configurations>HTTP Client Configurations</h3><p>AWS clients support two types of HTTP Client, <a href=https://mvnrepository.com/artifact/software.amazon.awssdk/url-connection-client>URL Connection HTTP Client</a>
+and <a href=https://mvnrepository.com/artifact/software.amazon.awssdk/apache-client>Apache HTTP Client</a>.
+By default, AWS clients use <strong>URL Connection</strong> HTTP Client to communicate with the service.
+This HTTP client optimizes for minimum dependencies and startup latency but supports less functionality than other implementations.
+In contrast, Apache HTTP Client supports more functionality and more customizable settings, such as the expect-continue handshake and TCP KeepAlive, at the cost of an extra dependency and additional startup latency.</p><p>For more details on configuration, see the sections <a href=#url-connection-http-client-configurations>URL Connection HTTP Client Configurations</a> and <a href=#apache-http-client-configurations>Apache HTTP Client Configurations</a>.</p><p>Configure the following property to set the  [...]
+</span></span></code></pre></div><h4 id=apache-http-client-configurations>Apache HTTP Client Configurations</h4><p>Apache HTTP Client has the following configurable properties:</p><table><thead><tr><th>Property</th><th>Default</th><th>Description</th></tr></thead><tbody><tr><td>http-client.apache.socket-timeout-ms</td><td>null</td><td>An optional <a href=https://sdk.amazonaws.com/java/api/latest/software/amazon/awssdk/http/apache/ApacheHttpClient.Builder.html#socketTimeout(java.time.Dura [...]
 </span></span></code></pre></div><h2 id=run-iceberg-on-aws>Run Iceberg on AWS</h2><h3 id=amazon-athena>Amazon Athena</h3><p><a href=https://aws.amazon.com/athena/>Amazon Athena</a> provides a serverless query engine that could be used to perform read, write, update and optimization tasks against Iceberg tables.
 More details could be found <a href=https://docs.aws.amazon.com/athena/latest/ug/querying-iceberg.html>here</a>.</p><h3 id=amazon-emr>Amazon EMR</h3><p><a href=https://aws.amazon.com/emr/>Amazon EMR</a> can provision clusters with <a href=https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-spark.html>Spark</a> (EMR 6 for Spark 3, EMR 5 for Spark 2),
 <a href=https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-hive.html>Hive</a>, <a href=https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-flink.html>Flink</a>,
@@ -238,7 +252,7 @@ More details could be found <a href=https://docs.aws.amazon.com/athena/latest/ug
 Please refer to the <a href=https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-iceberg-use-cluster.html>official documentation</a> on how to create a cluster with Iceberg installed.</p><p>For versions before 6.5.0, you can use a <a href=https://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-plan-bootstrap.html>bootstrap action</a> similar to the following to pre-install all necessary dependencies:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#2728 [...]
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>
 </span></span><span style=display:flex><span>AWS_SDK_VERSION<span style=color:#f92672>=</span>2.17.257
-</span></span><span style=display:flex><span>ICEBERG_VERSION<span style=color:#f92672>=</span>0.14.1
+</span></span><span style=display:flex><span>ICEBERG_VERSION<span style=color:#f92672>=</span>1.0.0
 </span></span><span style=display:flex><span>MAVEN_URL<span style=color:#f92672>=</span>https://repo1.maven.org/maven2
 </span></span><span style=display:flex><span>ICEBERG_MAVEN_URL<span style=color:#f92672>=</span>$MAVEN_URL/org/apache/iceberg
 </span></span><span style=display:flex><span>AWS_MAVEN_URL<span style=color:#f92672>=</span>$MAVEN_URL/software/amazon/awssdk
@@ -271,7 +285,7 @@ Please refer to the <a href=https://docs.aws.amazon.com/emr/latest/ReleaseGuide/
 </span></span><span style=display:flex><span>install_dependencies $LIB_PATH $AWS_MAVEN_URL $AWS_SDK_VERSION <span style=color:#e6db74>&#34;</span><span style=color:#e6db74>${</span>AWS_PACKAGES[@]<span style=color:#e6db74>}</span><span style=color:#e6db74>&#34;</span>
 </span></span></code></pre></div><h3 id=aws-eks>AWS EKS</h3><p><a href=https://aws.amazon.com/eks/>AWS Elastic Kubernetes Service (EKS)</a> can be used to start any Spark, Flink, Hive, Presto or Trino clusters to work with Iceberg.
 Search the <a href=../../../blogs>Iceberg blogs</a> page for tutorials around running Iceberg with Docker and Kubernetes.</p><h3 id=amazon-kinesis>Amazon Kinesis</h3><p><a href=https://aws.amazon.com/about-aws/whats-new/2019/11/you-can-now-run-fully-managed-apache-flink-applications-with-apache-kafka/>Amazon Kinesis Data Analytics</a> provides a platform
-to run fully managed Apache Flink applications. You can include Iceberg in your application Jar and run it in the platform.</p></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#enabling-aws-integration>Enabling AWS Integration</a><ul><li><a href=#spark>Spark</a></li><li><a href=#flink>Flink</a></li><li><a href=#hive>Hive</a></li></ul></li><li><a href=#catalogs>Catalogs</a><ul><li><a href=#glue-catalog>Glue Catalog</a></li><li><a href=#dynamodb-ca [...]
+to run fully managed Apache Flink applications. You can include Iceberg in your application Jar and run it in the platform.</p></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#enabling-aws-integration>Enabling AWS Integration</a><ul><li><a href=#spark>Spark</a></li><li><a href=#flink>Flink</a></li><li><a href=#hive>Hive</a></li></ul></li><li><a href=#catalogs>Catalogs</a><ul><li><a href=#glue-catalog>Glue Catalog</a></li><li><a href=#dynamodb-ca [...]
 <script src=https://iceberg.apache.org/docs/latest//js/jquery.easing.min.js></script>
 <script type=text/javascript src=https://iceberg.apache.org/docs/latest//js/search.js></script>
 <script src=https://iceberg.apache.org/docs/latest//js/bootstrap.min.js></script>
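
The Glue and S3 catalog properties that the spark-sql --conf examples above set can also be passed programmatically through Catalog.initialize. The sketch below is a rough Java illustration rather than anything from this commit; the warehouse path, role ARN, and region are placeholder values, and iceberg-aws plus the AWS SDK bundle are assumed to be on the classpath.

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.iceberg.aws.glue.GlueCatalog;

    public class GlueCatalogSketch {
      public static void main(String[] args) {
        Map<String, String> properties = new HashMap<>();
        properties.put("warehouse", "s3://iceberg-warehouse/s3-tagging");  // placeholder bucket/prefix
        properties.put("io-impl", "org.apache.iceberg.aws.s3.S3FileIO");
        // S3 object tagging properties discussed above
        properties.put("s3.write.table-tag-enabled", "true");
        properties.put("s3.write.namespace-tag-enabled", "true");
        // Assume-role client factory discussed above (placeholder ARN and region)
        properties.put("client.factory", "org.apache.iceberg.aws.AssumeRoleAwsClientFactory");
        properties.put("client.assume-role.arn", "arn:aws:iam::123456789:role/myRoleToAssume");
        properties.put("client.assume-role.region", "ap-northeast-1");

        // Catalog.initialize(catalogName, catalogProperties)
        GlueCatalog catalog = new GlueCatalog();
        catalog.initialize("my_catalog", properties);
      }
    }
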
diff --git a/docs/latest/configuration/index.html b/docs/latest/configuration/index.html
index 497b31a7..a30cc99b 100644
--- a/docs/latest/configuration/index.html
+++ b/docs/latest/configuration/index.html
@@ -3,22 +3,22 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class="collapse in"><ul class=sub-menu><li><a id=active href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Integrations class=collapse><ul class=sub-menu><li><a href=../aws/>AWS</a></li><li><a href=../dell/>Dell</a></li><li><a href=../jdbc/>JDBC</a></li><li><a href=../nessie/>Nessie</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href=#API><span>API</span>
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
 The value of these properties are not persisted as a part of the table metadata.</p><table><thead><tr><th>Property</th><th>Default</th><th>Description</th></tr></thead><tbody><tr><td>format-version</td><td>1</td><td>Table&rsquo;s format version (can be 1 or 2) as defined in the <a href=../../../spec/#format-versioning>Spec</a>.</td></tr></tbody></table><h3 id=compatibility-flags>Compatibility flags</h3><table><thead><tr><th>Property</th><th>Default</th><th>Description</th></tr></thead><t [...]
 Any other custom catalog can access the properties by implementing <code>Catalog.initialize(catalogName, catalogProperties)</code>.
 The properties can be manually constructed or passed in from a compute engine like Spark or Flink.
 Spark uses its session properties as catalog properties, see more details in the <a href=../spark-configuration#catalog-configuration>Spark configuration</a> section.
-Flink passes in catalog properties through <code>CREATE CATALOG</code> statement, see more details in the <a href=../flink/#creating-catalogs-and-using-catalogs>Flink</a> section.</p><h3 id=lock-catalog-properties>Lock catalog properties</h3><p>Here are the catalog properties related to locking. They are used by some catalog implementations to control the locking behavior during commits.</p><table><thead><tr><th>Property</th><th>Default</th><th>Description</th></tr></thead><tbody><tr><td [...]
+Flink passes in catalog properties through <code>CREATE CATALOG</code> statement, see more details in the <a href=../flink/#creating-catalogs-and-using-catalogs>Flink</a> section.</p><h3 id=lock-catalog-properties>Lock catalog properties</h3><p>Here are the catalog properties related to locking. They are used by some catalog implementations to control the locking behavior during commits.</p><table><thead><tr><th>Property</th><th>Default</th><th>Description</th></tr></thead><tbody><tr><td [...]
 of the Hive Metastore (<code>hive.txn.timeout</code> or <code>metastore.txn.timeout</code> in the newer versions). Otherwise, the heartbeats on the lock (which happens during the lock checks) would end up expiring in the
 Hive Metastore before the lock is retried from Iceberg.</p></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#table-properties>Table properties</a><ul><li><a href=#read-properties>Read properties</a></li><li><a href=#write-properties>Write properties</a></li><li><a href=#table-behavior-properties>Table behavior properties</a></li><li><a href=#reserved-table-properties>Reserved table properties</a></li><li><a href=#compatibility-flags>Compatibility [...]
 <script src=https://iceberg.apache.org/docs/latest//js/jquery.easing.min.js></script>
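
Table properties like the ones listed on the configuration page are changed through the updateProperties() builder. A small Java sketch follows; the table location and the commit.retry.num-retries key are chosen only for illustration.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.hadoop.HadoopTables;

    public class TablePropertiesSketch {
      public static void main(String[] args) {
        // Placeholder location; any catalog that can load the table works the same way.
        Table table = new HadoopTables(new Configuration())
            .load("hdfs://nn:8020/warehouse/db/events");

        // Property changes take effect when commit() is called on the builder.
        table.updateProperties()
            .set("commit.retry.num-retries", "5")  // illustrative property key
            .commit();
      }
    }
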
diff --git a/docs/latest/custom-catalog/index.html b/docs/latest/custom-catalog/index.html
index bd11324e..8c756afe 100644
--- a/docs/latest/custom-catalog/index.html
+++ b/docs/latest/custom-catalog/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/dell/index.html b/docs/latest/dell/index.html
index ec58fd25..2463f114 100644
--- a/docs/latest/dell/index.html
+++ b/docs/latest/dell/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/docssearch.json b/docs/latest/docssearch.json
index 8a0cdb53..82ff3b4c 100644
--- a/docs/latest/docssearch.json
+++ b/docs/latest/docssearch.json
@@ -1 +1 @@
-[{"categories":null,"content":" Getting Started The latest version of Iceberg is 0.14.1.\nSpark is currently the most feature-rich compute engine for Iceberg operations. We recommend you to get started with Spark to understand Iceberg concepts and features with examples. You can also view documentations of using Iceberg with other compute engine under the Engines tab.\nUsing Iceberg in Spark 3 To use Iceberg in a Spark shell, use the --packages option:\nspark-shell --packages org.apache. [...]
\ No newline at end of file
+[{"categories":null,"content":" Getting Started The latest version of Iceberg is 1.0.0.\nSpark is currently the most feature-rich compute engine for Iceberg operations. We recommend you to get started with Spark to understand Iceberg concepts and features with examples. You can also view documentations of using Iceberg with other compute engine under the Engines tab.\nUsing Iceberg in Spark 3 To use Iceberg in a Spark shell, use the --packages option:\nspark-shell --packages org.apache.i [...]
\ No newline at end of file
diff --git a/docs/latest/evolution/index.html b/docs/latest/evolution/index.html
index 90a56202..b5b9fe3e 100644
--- a/docs/latest/evolution/index.html
+++ b/docs/latest/evolution/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class="collapse in"><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a id=active href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/flink-connector/index.html b/docs/latest/flink-connector/index.html
index 9b30f97d..b206a183 100644
--- a/docs/latest/flink-connector/index.html
+++ b/docs/latest/flink-connector/index.html
@@ -3,18 +3,18 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class=chevr [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-parent=full href=#Flink><spa [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class="collapse in"><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a id=active href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li> [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Integrations class=collapse><ul class=sub-menu><li><a href=../aws/>AWS</a></li><li><a href=../dell/>Dell</a></li><li><a href=../jdbc/>JDBC</a></li><li><a href=../nessie/>Nessie</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href=#API><span>API</span>
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
-which is just mapping to the underlying iceberg table instead of maintaining iceberg table directly in current Flink catalog.</p><p>To create the table in Flink SQL by using SQL syntax <code>CREATE TABLE test (..) WITH ('connector'='iceberg', ...)</code>, Flink iceberg connector provides the following table properties:</p><ul><li><code>connector</code>: Use the constant <code>iceberg</code>.</li><li><code>catalog-name</code>: User-specified catalog name. It&rsquo;s required because the c [...]
+which is just mapping to the underlying iceberg table instead of maintaining iceberg table directly in current Flink catalog.</p><p>To create the table in Flink SQL by using SQL syntax <code>CREATE TABLE test (..) WITH ('connector'='iceberg', ...)</code>, Flink iceberg connector provides the following table properties:</p><ul><li><code>connector</code>: Use the constant <code>iceberg</code>.</li><li><code>catalog-name</code>: User-specified catalog name. It&rsquo;s required because the c [...]
 </span></span><span style=display:flex><span>    id   BIGINT,
 </span></span><span style=display:flex><span>    <span style=color:#66d9ef>data</span> STRING
 </span></span><span style=display:flex><span>) <span style=color:#66d9ef>WITH</span> (
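For reference, the rendered page documents creating an Iceberg-backed table through the Flink connector with the 'connector' and 'catalog-name' properties listed above. A minimal sketch of such a statement, with placeholder catalog name, metastore URI, and warehouse path, is:

    CREATE TABLE flink_table (
        id   BIGINT,
        data STRING
    ) WITH (
        'connector'='iceberg',
        'catalog-name'='hive_prod',                   -- user-specified catalog name
        'uri'='thrift://localhost:9083',              -- placeholder Hive metastore URI
        'warehouse'='hdfs://nn:8020/warehouse/path'   -- placeholder warehouse location
    );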
diff --git a/docs/latest/flink/index.html b/docs/latest/flink/index.html
index b90c8cdc..619c4167 100644
--- a/docs/latest/flink/index.html
+++ b/docs/latest/flink/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class=chevr [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-parent=full href=#Flink><spa [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class="collapse in"><ul class=sub-menu><li><a id=active href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li> [...]
 <i class="fa fa-chevron-right"></i>
@@ -99,7 +99,7 @@ In our example we&rsquo;re using <code>env.add_jars(..)</code> as shown below:</
 </span></span><span style=display:flex><span><span style=color:#f92672>from</span> pyflink.datastream <span style=color:#f92672>import</span> StreamExecutionEnvironment
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span>env <span style=color:#f92672>=</span> StreamExecutionEnvironment<span style=color:#f92672>.</span>get_execution_environment()
-</span></span><span style=display:flex><span>iceberg_flink_runtime_jar <span style=color:#f92672>=</span> os<span style=color:#f92672>.</span>path<span style=color:#f92672>.</span>join(os<span style=color:#f92672>.</span>getcwd(), <span style=color:#e6db74>&#34;iceberg-flink-runtime-0.14.1.jar&#34;</span>)
+</span></span><span style=display:flex><span>iceberg_flink_runtime_jar <span style=color:#f92672>=</span> os<span style=color:#f92672>.</span>path<span style=color:#f92672>.</span>join(os<span style=color:#f92672>.</span>getcwd(), <span style=color:#e6db74>&#34;iceberg-flink-runtime-1.0.0.jar&#34;</span>)
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span>env<span style=color:#f92672>.</span>add_jars(<span style=color:#e6db74>&#34;file://</span><span style=color:#e6db74>{}</span><span style=color:#e6db74>&#34;</span><span style=color:#f92672>.</span>format(iceberg_flink_runtime_jar))
 </span></span></code></pre></div><p>Once we have reached this point, we can create a <code>StreamTableEnvironment</code> and execute Flink SQL statements.
@@ -122,7 +122,7 @@ The below example shows how to create a custom catalog via the Python Table API:
 </span></span><span style=display:flex><span>  <span style=color:#e6db74>&#39;property-version&#39;</span><span style=color:#f92672>=</span><span style=color:#e6db74>&#39;1&#39;</span>,
 </span></span><span style=display:flex><span>  <span style=color:#e6db74>&#39;warehouse&#39;</span><span style=color:#f92672>=</span><span style=color:#e6db74>&#39;hdfs://nn:8020/warehouse/path&#39;</span>
 </span></span><span style=display:flex><span>);
-</span></span></code></pre></div><p>The following properties can be set if using the Hive catalog:</p><ul><li><code>uri</code>: The Hive metastore&rsquo;s thrift URI. (Required)</li><li><code>clients</code>: The Hive metastore client pool size, default value is 2. (Optional)</li><li><code>warehouse</code>: The Hive warehouse location, users should specify this path if neither set the <code>hive-conf-dir</code> to specify a location containing a <code>hive-site.xml</code> configuration fi [...]
+</span></span></code></pre></div><p>The following properties can be set if using the Hive catalog:</p><ul><li><code>uri</code>: The Hive metastore&rsquo;s thrift URI. (Required)</li><li><code>clients</code>: The Hive metastore client pool size, default value is 2. (Optional)</li><li><code>warehouse</code>: The Hive warehouse location, users should specify this path if neither set the <code>hive-conf-dir</code> to specify a location containing a <code>hive-site.xml</code> configuration fi [...]
 </span></span><span style=display:flex><span>  <span style=color:#e6db74>&#39;type&#39;</span><span style=color:#f92672>=</span><span style=color:#e6db74>&#39;iceberg&#39;</span>,
 </span></span><span style=display:flex><span>  <span style=color:#e6db74>&#39;catalog-type&#39;</span><span style=color:#f92672>=</span><span style=color:#e6db74>&#39;hadoop&#39;</span>,
 </span></span><span style=display:flex><span>  <span style=color:#e6db74>&#39;warehouse&#39;</span><span style=color:#f92672>=</span><span style=color:#e6db74>&#39;hdfs://nn:8020/warehouse/path&#39;</span>,
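Putting the Hive catalog properties listed above (uri, clients, warehouse) together, a complete CREATE CATALOG statement would look roughly like the following; host names and paths are placeholders:

    CREATE CATALOG hive_catalog WITH (
      'type'='iceberg',
      'catalog-type'='hive',
      'uri'='thrift://localhost:9083',       -- Hive metastore thrift URI (required)
      'clients'='5',                         -- metastore client pool size (optional, default 2)
      'property-version'='1',
      'warehouse'='hdfs://nn:8020/warehouse/path'
    );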
@@ -230,7 +230,7 @@ For an unpartitioned iceberg table, its data will be completely overwritten by <
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span><span style=color:#75715e>// Submit and execute this streaming read job.
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>env<span style=color:#f92672>.</span><span style=color:#a6e22e>execute</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;Test Iceberg Streaming Read&#34;</span><span style=color:#f92672>);</span>
-</span></span></code></pre></div><p>There are other options that we could set by Java API, please see the <a href=../../../javadoc/0.14.1/org/apache/iceberg/flink/source/FlinkSource.html>FlinkSource#Builder</a>.</p><h2 id=reading-with-datastream-flip-27-source>Reading with DataStream (FLIP-27 source)</h2><p><a href=https://cwiki.apache.org/confluence/display/FLINK/FLIP-27%3A+Refactor+Source+Interface>FLIP-27 source interface</a>
+</span></span></code></pre></div><p>There are other options that we could set by Java API, please see the <a href=../../../javadoc/1.0.0/org/apache/iceberg/flink/source/FlinkSource.html>FlinkSource#Builder</a>.</p><h2 id=reading-with-datastream-flip-27-source>Reading with DataStream (FLIP-27 source)</h2><p><a href=https://cwiki.apache.org/confluence/display/FLINK/FLIP-27%3A+Refactor+Source+Interface>FLIP-27 source interface</a>
 was introduced in Flink 1.12. It aims to solve several shortcomings of the old <code>SourceFunction</code>
 streaming source interface. It also unifies the source interfaces for both batch and streaming executions.
 Most source connectors (like Kafka, file) in the Flink repo have migrated to the FLIP-27 interface.
@@ -279,7 +279,7 @@ CDC read is not supported yet.</p><div class=highlight><pre tabindex=0 style=col
 </span></span><span style=display:flex><span><span style=color:#75715e>// Submit and execute this streaming read job.
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>env<span style=color:#f92672>.</span><span style=color:#a6e22e>execute</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;Test Iceberg Streaming Read&#34;</span><span style=color:#f92672>);</span>
 </span></span></code></pre></div><p>There are other options that can be set via the Java API; please see the
-<a href=../../../javadoc/0.14.1/org/apache/iceberg/flink/source/IcebergSource.html>IcebergSource#Builder</a>.</p><h2 id=writing-with-datastream>Writing with DataStream</h2><p>Iceberg support writing to iceberg table from different DataStream input.</p><h3 id=appending-data>Appending data.</h3><p>we have supported writing <code>DataStream&lt;RowData></code> and <code>DataStream&lt;Row></code> to the sink iceberg table natively.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;b [...]
+<a href=../../../javadoc/1.0.0/org/apache/iceberg/flink/source/IcebergSource.html>IcebergSource#Builder</a>.</p><h2 id=writing-with-datastream>Writing with DataStream</h2><p>Iceberg support writing to iceberg table from different DataStream input.</p><h3 id=appending-data>Appending data.</h3><p>we have supported writing <code>DataStream&lt;RowData></code> and <code>DataStream&lt;Row></code> to the sink iceberg table natively.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;ba [...]
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span>DataStream<span style=color:#f92672>&lt;</span>RowData<span style=color:#f92672>&gt;</span> input <span style=color:#f92672>=</span> <span style=color:#f92672>...</span> <span style=color:#f92672>;</span>
 </span></span><span style=display:flex><span>Configuration hadoopConf <span style=color:#f92672>=</span> <span style=color:#66d9ef>new</span> Configuration<span style=color:#f92672>();</span>
@@ -328,7 +328,7 @@ CDC read is not supported yet.</p><div class=highlight><pre tabindex=0 style=col
 </span></span><span style=display:flex><span>RewriteDataFilesActionResult result <span style=color:#f92672>=</span> Actions<span style=color:#f92672>.</span><span style=color:#a6e22e>forTable</span><span style=color:#f92672>(</span>table<span style=color:#f92672>)</span>
 </span></span><span style=display:flex><span>        <span style=color:#f92672>.</span><span style=color:#a6e22e>rewriteDataFiles</span><span style=color:#f92672>()</span>
 </span></span><span style=display:flex><span>        <span style=color:#f92672>.</span><span style=color:#a6e22e>execute</span><span style=color:#f92672>();</span>
-</span></span></code></pre></div><p>For more doc about options of the rewrite files action, please see <a href=../../../javadoc/0.14.1/org/apache/iceberg/flink/actions/RewriteDataFilesAction.html>RewriteDataFilesAction</a></p><h2 id=type-conversion>Type conversion</h2><p>Iceberg&rsquo;s integration for Flink automatically converts between Flink and Iceberg types. When writing to a table with types that are not supported by Flink, like UUID, Iceberg will accept and convert values from the [...]
+</span></span></code></pre></div><p>For more doc about options of the rewrite files action, please see <a href=../../../javadoc/1.0.0/org/apache/iceberg/flink/actions/RewriteDataFilesAction.html>RewriteDataFilesAction</a></p><h2 id=type-conversion>Type conversion</h2><p>Iceberg&rsquo;s integration for Flink automatically converts between Flink and Iceberg types. When writing to a table with types that are not supported by Flink, like UUID, Iceberg will accept and convert values from the  [...]
 <script src=https://iceberg.apache.org/docs/latest//js/jquery.easing.min.js></script>
 <script type=text/javascript src=https://iceberg.apache.org/docs/latest//js/search.js></script>
 <script src=https://iceberg.apache.org/docs/latest//js/bootstrap.min.js></script>
diff --git a/docs/latest/getting-started/index.html b/docs/latest/getting-started/index.html
index a9f1aa44..956c9a4b 100644
--- a/docs/latest/getting-started/index.html
+++ b/docs/latest/getting-started/index.html
@@ -3,20 +3,20 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-paren [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a id=active href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a id=active href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-p [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Integrations class=collapse><ul class=sub-menu><li><a href=../aws/>AWS</a></li><li><a href=../dell/>Dell</a></li><li><a href=../jdbc/>JDBC</a></li><li><a href=../nessie/>Nessie</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href=#API><span>API</span>
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
 We recommend getting started with Spark to understand Iceberg concepts and features with examples.
-You can also view documentations of using Iceberg with other compute engine under the <strong>Engines</strong> tab.</p><h2 id=using-iceberg-in-spark-3>Using Iceberg in Spark 3</h2><p>To use Iceberg in a Spark shell, use the <code>--packages</code> option:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sh data-lang=sh><span style=display:flex><span>spark-shell --packages org.apache.iceberg: [...]
-</span></span></code></pre></div><div class=info>If you want to include Iceberg in your Spark installation, add the <a href=spark-runtime-jar><code>iceberg-spark-runtime-3.2_2.12</code> Jar</a> to Spark&rsquo;s <code>jars</code> folder.</div><h3 id=adding-catalogs>Adding catalogs</h3><p>Iceberg comes with <a href=../spark-configuration#catalogs>catalogs</a> that enable SQL commands to manage tables and load them by name. Catalogs are configured using properties under <code>spark.sql.cata [...]
+You can also view documentations of using Iceberg with other compute engine under the <strong>Engines</strong> tab.</p><h2 id=using-iceberg-in-spark-3>Using Iceberg in Spark 3</h2><p>To use Iceberg in a Spark shell, use the <code>--packages</code> option:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sh data-lang=sh><span style=display:flex><span>spark-shell --packages org.apache.iceberg: [...]
+</span></span></code></pre></div><div class=info>If you want to include Iceberg in your Spark installation, add the <a href=spark-runtime-jar><code>iceberg-spark-runtime-3.2_2.12</code> Jar</a> to Spark&rsquo;s <code>jars</code> folder.</div><h3 id=adding-catalogs>Adding catalogs</h3><p>Iceberg comes with <a href=../spark-configuration#catalogs>catalogs</a> that enable SQL commands to manage tables and load them by name. Catalogs are configured using properties under <code>spark.sql.cata [...]
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.extensions<span style=color:#f92672>=</span>org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.spark_catalog<span style=color:#f92672>=</span>org.apache.iceberg.spark.SparkSessionCatalog <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.spark_catalog.type<span style=color:#f92672>=</span>hive <span style=color:#ae81ff>\
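With a catalog configured through these --conf settings, the Getting Started page continues with plain SQL against the catalog. A minimal sketch, assuming the spark_catalog session catalog configured above and placeholder namespace and table names:

    CREATE TABLE spark_catalog.default.sample (
        id   bigint,
        data string
    ) USING iceberg;

    INSERT INTO spark_catalog.default.sample VALUES (1, 'a'), (2, 'b');
    SELECT * FROM spark_catalog.default.sample;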
diff --git a/docs/latest/hive/index.html b/docs/latest/hive/index.html
index ef3117d0..05f887bc 100644
--- a/docs/latest/hive/index.html
+++ b/docs/latest/hive/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a id=active href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li>< [...]
 <i class="fa fa-chevron-right"></i>
@@ -48,7 +48,7 @@ SET iceberg.catalog.another_hive.clients=10;
 SET iceberg.catalog.another_hive.warehouse=hdfs://example.com:8020/warehouse;
 </code></pre><p>Register a <code>HadoopCatalog</code> called <code>hadoop</code>:</p><pre tabindex=0><code>SET iceberg.catalog.hadoop.type=hadoop;
 SET iceberg.catalog.hadoop.warehouse=hdfs://example.com:8020/warehouse;
-</code></pre><p>Register an AWS <code>GlueCatalog</code> called <code>glue</code>:</p><pre tabindex=0><code>SET iceberg.catalog.glue.catalog-impl=org.apache.iceberg.aws.GlueCatalog;
+</code></pre><p>Register an AWS <code>GlueCatalog</code> called <code>glue</code>:</p><pre tabindex=0><code>SET iceberg.catalog.glue.catalog-impl=org.apache.iceberg.aws.glue.GlueCatalog;
 SET iceberg.catalog.glue.warehouse=s3://my-bucket/my/key/prefix;
 SET iceberg.catalog.glue.lock.table=myGlueLockTable;
 </code></pre><h2 id=ddl-commands>DDL Commands</h2><p>Not all the features below are supported with Hive 2.3.x and Hive 3.1.x. Please refer to the
@@ -68,7 +68,7 @@ The Iceberg table and the corresponding Hive table are created at the beginning
 The data is inserted / committed when the query finishes. So for a transient period the table already exists but contains no data.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>CREATE</span> <span style=color:#66d9ef>TABLE</span> target PARTITIONED <span style=color:#66d9ef>BY</span> SPEC (<span style=color:#66d9ef [...]
 </span></span><span style=display:flex><span>    <span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> <span style=color:#66d9ef>source</span>;
 </span></span></code></pre></div><h3 id=create-external-table-overlaying-an-existing-iceberg-table>CREATE EXTERNAL TABLE overlaying an existing Iceberg table</h3><p>The <code>CREATE EXTERNAL TABLE</code> command is used to overlay a Hive table &ldquo;on top of&rdquo; an existing Iceberg table. Iceberg
-tables are created using either a <a href=../../../javadoc/0.14.1/index.html?org/apache/iceberg/catalog/Catalog.html><code>Catalog</code></a>, or an implementation of the <a href=../../../javadoc/0.14.1/index.html?org/apache/iceberg/Tables.html><code>Tables</code></a> interface, and Hive needs to be configured accordingly to
+tables are created using either a <a href=../../../javadoc/1.0.0/index.html?org/apache/iceberg/catalog/Catalog.html><code>Catalog</code></a>, or an implementation of the <a href=../../../javadoc/1.0.0/index.html?org/apache/iceberg/Tables.html><code>Tables</code></a> interface, and Hive needs to be configured accordingly to
 operate on these different types of tables.</p><h4 id=hive-catalog-tables>Hive catalog tables</h4><p>As described before, tables created by the <code>HiveCatalog</code> with the Hive engine feature enabled are directly visible to the
 Hive engine, so there is no need to create an overlay.</p><h4 id=custom-catalog-tables>Custom catalog tables</h4><p>For a table in a registered catalog, specify the catalog name in the statement using the table property <code>iceberg.catalog</code>.
 For example, the SQL below creates an overlay for a table in a <code>hadoop</code> type catalog named <code>hadoop_cat</code>:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>SET</span>
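A sketch of the overlay pattern that sentence describes, with placeholder catalog, database, and table names, is:

    SET iceberg.catalog.hadoop_cat.type=hadoop;
    SET iceberg.catalog.hadoop_cat.warehouse=hdfs://example.com:8020/hadoop_cat;

    CREATE EXTERNAL TABLE database_a.table_a
    STORED BY 'org.apache.iceberg.mr.hive.HiveIcebergStorageHandler'
    TBLPROPERTIES ('iceberg.catalog'='hadoop_cat');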
diff --git a/docs/latest/index.html b/docs/latest/index.html
index 7aaea2fe..f6f4a340 100644
--- a/docs/latest/index.html
+++ b/docs/latest/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=./configuration/>Configuration</a></li><li><a href=./evolution/>Evolution</a></li><li><a href=./maintenance/>Maintenance</a></li><li><a href=./partitioning/>Partitioning</a></li><li><a href=./performance/>Performance</a></li><li><a href=./reliability/>Reliability</a></li><li><a href=./schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data- [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=./spark-ddl/>DDL</a></li><li><a href=./getting-started/>Getting Started</a></li><li><a href=./spark-procedures/>Procedures</a></li><li><a href=./spark-queries/>Queries</a></li><li><a href=./spark-configuration/>Spark Configuration</a></li><li><a href=./spark-structured-streaming/>Structured Streaming</a></li><li><a href=./spark-writes/>Writes</a></li></ul></div><li><a class="chevron-tog [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=./spark-ddl/>DDL</a></li><li><a href=./getting-started/>Getting Started</a></li><li><a href=./spark-procedures/>Procedures</a></li><li><a href=./spark-queries/>Queries</a></li><li><a href=./spark-structured-streaming/>Structured Streaming</a></li><li><a href=./spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href=#Flin [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=./flink/>Enabling Iceberg in Flink</a></li><li><a href=./flink-connector/>Flink Connector</a></li></ul></div><li><a href=./hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_bla [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/index.xml b/docs/latest/index.xml
index ead16c21..44963acf 100644
--- a/docs/latest/index.xml
+++ b/docs/latest/index.xml
@@ -1,4 +1,4 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Introduction on Apache Iceberg</title><link>https://iceberg.apache.org/docs/latest/</link><description>Recent content in Introduction on Apache Iceberg</description><generator>Hugo -- gohugo.io</generator><language>en-us</language><atom:link href="https://iceberg.apache.org/docs/latest/index.xml" rel="self" type="application/rss+xml"/><item><title>Getting Sta [...]
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Introduction on Apache Iceberg</title><link>https://iceberg.apache.org/docs/latest/</link><description>Recent content in Introduction on Apache Iceberg</description><generator>Hugo -- gohugo.io</generator><language>en-us</language><atom:link href="https://iceberg.apache.org/docs/latest/index.xml" rel="self" type="application/rss+xml"/><item><title>Getting Sta [...]
 Spark is currently the most feature-rich compute engine for Iceberg operations. We recommend getting started with Spark to understand Iceberg concepts and features with examples. You can also view documentation on using Iceberg with other compute engines under the Engines tab.
 Using Iceberg in Spark 3 To use Iceberg in a Spark shell, use the --packages option:
 spark-shell --packages org.</description></item><item><title>Hive</title><link>https://iceberg.apache.org/docs/latest/hive/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/docs/latest/hive/</guid><description>Hive Iceberg supports reading and writing Iceberg tables through Hive by using a StorageHandler.
@@ -6,7 +6,9 @@ Feature support Iceberg compatibility with Hive 2.x and Hive 3.1.2/3 supports th
 Creating a table Dropping a table Reading a table Inserting into a table (INSERT INTO) DML operations work only with MapReduce execution engine. With Hive version 4.0.0-alpha-1 and above, the Iceberg integration when using HiveCatalog supports the following additional features:
 Creating an Iceberg identity-partitioned table Creating an Iceberg table with any partition spec, including the various transforms supported by Iceberg Creating a table from an existing table (CTAS table) Altering a table while keeping Iceberg and Hive schemas in sync Altering the partition schema (updating columns) Altering the partition schema by specifying partition transforms Truncating a table Migrating tables in Avro, Parquet, or ORC (Non-ACID) format to Iceberg Reading the schema  [...]
 Enabling AWS Integration The iceberg-aws module is bundled with Spark and Flink engine runtimes for all versions from 0.11.0 onwards. However, the AWS clients are not bundled so that you can use the same client version as your application. You will need to provide the AWS v2 SDK because that is what Iceberg depends on.</description></item><item><title>Configuration</title><link>https://iceberg.apache.org/docs/latest/configuration/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>< [...]
-Read properties Property Default Description read.split.target-size 134217728 (128 MB) Target size when combining data input splits read.split.metadata-target-size 33554432 (32 MB) Target size when combining metadata input splits read.split.planning-lookback 10 Number of bins to consider when combining input splits read.split.open-file-cost 4194304 (4 MB) The estimated cost to open a file, used as a minimum weight when combining splits.</description></item><item><title>DDL</title><link>h [...]
+Read properties Property Default Description read.split.target-size 134217728 (128 MB) Target size when combining data input splits read.split.metadata-target-size 33554432 (32 MB) Target size when combining metadata input splits read.split.planning-lookback 10 Number of bins to consider when combining input splits read.split.open-file-cost 4194304 (4 MB) The estimated cost to open a file, used as a minimum weight when combining splits.</description></item><item><title>Configuration</tit [...]
+This creates an Iceberg catalog named hive_prod that loads tables from a Hive metastore:
+spark.sql.catalog.hive_prod = org.apache.iceberg.spark.SparkCatalog spark.sql.catalog.hive_prod.type = hive spark.sql.catalog.hive_prod.uri = thrift://metastore-host:port # omit uri to use the same URI as Spark: hive.metastore.uris in hive-site.xml Iceberg also supports a directory-based catalog in HDFS that can be configured using type=hadoop:</description></item><item><title>DDL</title><link>https://iceberg.apache.org/docs/latest/spark-ddl/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +000 [...]
 Iceberg uses Apache Spark&amp;rsquo;s DataSourceV2 API for data source and catalog implementations. Spark DSv2 is an evolving API with different levels of support in Spark versions. Spark 2.4 does not support SQL DDL.
 Spark 2.4 can&amp;rsquo;t create Iceberg tables with DDL, instead use Spark 3.x or the Iceberg API. CREATE TABLE Spark 3.0 can create tables in any Iceberg catalog with the clause USING iceberg:</description></item><item><title>Dell</title><link>https://iceberg.apache.org/docs/latest/dell/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/docs/latest/dell/</guid><description>Iceberg Dell Integration Dell ECS Integration Iceberg can be used with Dell [...]
 See Dell ECS for more information on Dell ECS.
@@ -44,9 +46,7 @@ git clone https://github.com/apache/iceberg.git cd iceberg/python pip install -e
 Iceberg uses Apache Spark&amp;rsquo;s DataSourceV2 API for data source and catalog implementations. Spark DSv2 is an evolving API with different levels of support in Spark versions:
 Feature support Spark 3.0 Spark 2.4 Notes SELECT ✔️ DataFrame reads ✔️ ✔️ Metadata table SELECT ✔️ History metadata table ✔️ ✔️ Snapshots metadata table ✔️ ✔️ Files metadata table ✔️ ✔️ Manifests metadata table ✔️ ✔️ Partitions metadata table ✔️ ✔️ All metadata tables ✔️ ✔️ Querying with SQL In Spark 3, tables use identifiers that include a catalog name.</description></item><item><title>Reliability</title><link>https://iceberg.apache.org/docs/latest/reliability/</link><pubDate>Mon, 01 Ja [...]
 Hive tables track data files using both a central metastore for partitions and a file system for individual files. This makes atomic changes to a table&amp;rsquo;s contents impossible, and eventually consistent stores like S3 may return incorrect results due to the use of listing files to reconstruct the state of a table. It also requires job planning to make many slow listing calls: O(n) with the number of partitions.</description></item><item><title>Schemas</title><link>https://iceberg [...]
-Type Description Notes boolean True or false int 32-bit signed integers Can promote to long long 64-bit signed integers float 32-bit IEEE 754 floating point Can promote to double double 64-bit IEEE 754 floating point decimal(P,S) Fixed-point decimal; precision P, scale S Scale is fixed and precision must be 38 or less date Calendar date without timezone or time time Time of day without date, timezone Stored as microseconds timestamp Timestamp without timezone Stored as microseconds times [...]
-This creates an Iceberg catalog named hive_prod that loads tables from a Hive metastore:
-spark.sql.catalog.hive_prod = org.apache.iceberg.spark.SparkCatalog spark.sql.catalog.hive_prod.type = hive spark.sql.catalog.hive_prod.uri = thrift://metastore-host:port # omit uri to use the same URI as Spark: hive.metastore.uris in hive-site.xml Iceberg also supports a directory-based catalog in HDFS that can be configured using type=hadoop:</description></item><item><title>Structured Streaming</title><link>https://iceberg.apache.org/docs/latest/spark-structured-streaming/</link><pubD [...]
+Type Description Notes boolean True or false int 32-bit signed integers Can promote to long long 64-bit signed integers float 32-bit IEEE 754 floating point Can promote to double double 64-bit IEEE 754 floating point decimal(P,S) Fixed-point decimal; precision P, scale S Scale is fixed and precision must be 38 or less date Calendar date without timezone or time time Time of day without date, timezone Stored as microseconds timestamp Timestamp without timezone Stored as microseconds times [...]
 As of Spark 3.0, DataFrame reads and writes are supported.
 Feature support Spark 3.0 Spark 2.4 Notes DataFrame write ✔ ✔ Streaming Reads Iceberg supports processing incremental data in Spark Structured Streaming jobs, starting from a historical timestamp:
 val df = spark.</description></item><item><title>Writes</title><link>https://iceberg.apache.org/docs/latest/spark-writes/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://iceberg.apache.org/docs/latest/spark-writes/</guid><description>Spark Writes To use Iceberg in Spark, first configure Spark catalogs.
diff --git a/docs/latest/java-api-quickstart/index.html b/docs/latest/java-api-quickstart/index.html
index 0edcdfae..b361f060 100644
--- a/docs/latest/java-api-quickstart/index.html
+++ b/docs/latest/java-api-quickstart/index.html
@@ -3,17 +3,17 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Integrations class=collapse><ul class=sub-menu><li><a href=../aws/>AWS</a></li><li><a href=../dell/>Dell</a></li><li><a href=../jdbc/>JDBC</a></li><li><a href=../nessie/>Nessie</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-parent=full href=#API><span>API</span>
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=API class="collapse in"><ul class=sub-menu><li><a id=active href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/ [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=API class="collapse in"><ul class=sub-menu><li><a id=active href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/ [...]
 You can initialize a Hive catalog with a name and some properties.
 (see: <a href=../configuration/#catalog-properties>Catalog properties</a>)</p><p><strong>Note:</strong> Currently, <code>setConf</code> is always required for hive catalogs, but this will change in the future.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-java data-lang=java><span style=display:flex><span><span style=color:#f92672>import</span> org.apache.iceberg.hive.HiveCatalog<span sty [...]
 </span></span><span style=display:flex><span>
diff --git a/docs/latest/jdbc/index.html b/docs/latest/jdbc/index.html
index e6eec41f..6c6e1412 100644
--- a/docs/latest/jdbc/index.html
+++ b/docs/latest/jdbc/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
@@ -16,7 +16,7 @@
 <i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
 The database that JDBC connects to must support atomic transactions to allow the JDBC catalog implementation to
 properly support atomic Iceberg table commits and read serializable isolation.</p><h3 id=configurations>Configurations</h3><p>Because each database and database service provider might require different configurations,
-the JDBC catalog allows arbitrary configurations through:</p><table><thead><tr><th>Property</th><th>Default</th><th>Description</th></tr></thead><tbody><tr><td>uri</td><td></td><td>the JDBC connection string</td></tr><tr><td>jdbc.&lt;property_key></td><td></td><td>any key value pairs to configure the JDBC connection</td></tr></tbody></table><h3 id=examples>Examples</h3><h4 id=spark>Spark</h4><p>You can start a Spark session with a MySQL JDBC connection using the following configurations: [...]
+the JDBC catalog allows arbitrary configurations through:</p><table><thead><tr><th>Property</th><th>Default</th><th>Description</th></tr></thead><tbody><tr><td>uri</td><td></td><td>the JDBC connection string</td></tr><tr><td>jdbc.&lt;property_key></td><td></td><td>any key value pairs to configure the JDBC connection</td></tr></tbody></table><h3 id=examples>Examples</h3><h4 id=spark>Spark</h4><p>You can start a Spark session with a MySQL JDBC connection using the following configurations: [...]
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog<span style=color:#f92672>=</span>org.apache.iceberg.spark.SparkCatalog <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog.warehouse<span style=color:#f92672>=</span>s3://my-bucket/my/key/prefix <span style=color:#ae81ff>\
 </span></span></span><span style=display:flex><span><span style=color:#ae81ff></span>    --conf spark.sql.catalog.my_catalog.catalog-impl<span style=color:#f92672>=</span>org.apache.iceberg.jdbc.JdbcCatalog <span style=color:#ae81ff>\
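Once such a session is up, the JDBC-backed catalog behaves like any other Iceberg catalog from SQL, with table metadata pointers tracked in the configured database. A minimal sketch, assuming the catalog name my_catalog from the configuration above and placeholder namespace and column names:

    CREATE TABLE my_catalog.db.events (
        id    bigint,
        ts    timestamp,
        data  string
    ) USING iceberg
    PARTITIONED BY (days(ts));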
diff --git a/docs/latest/maintenance/index.html b/docs/latest/maintenance/index.html
index 022bf0c1..01de283d 100644
--- a/docs/latest/maintenance/index.html
+++ b/docs/latest/maintenance/index.html
@@ -3,22 +3,22 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class="collapse in"><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a id=active href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Integrations class=collapse><ul class=sub-menu><li><a href=../aws/>AWS</a></li><li><a href=../dell/>Dell</a></li><li><a href=../jdbc/>JDBC</a></li><li><a href=../nessie/>Nessie</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href=#API><span>API</span>
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
 </span></span><span style=display:flex><span><span style=color:#66d9ef>long</span> tsToExpire <span style=color:#f92672>=</span> System<span style=color:#f92672>.</span><span style=color:#a6e22e>currentTimeMillis</span><span style=color:#f92672>()</span> <span style=color:#f92672>-</span> <span style=color:#f92672>(</span>1000 <span style=color:#f92672>*</span> 60 <span style=color:#f92672>*</span> 60 <span style=color:#f92672>*</span> 24<span style=color:#f92672>);</span> <span style=co [...]
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>table<span style=color:#f92672>.</span><span style=color:#a6e22e>expireSnapshots</span><span style=color:#f92672>()</span>
 </span></span><span style=display:flex><span>     <span style=color:#f92672>.</span><span style=color:#a6e22e>expireOlderThan</span><span style=color:#f92672>(</span>tsToExpire<span style=color:#f92672>)</span>
 </span></span><span style=display:flex><span>     <span style=color:#f92672>.</span><span style=color:#a6e22e>commit</span><span style=color:#f92672>();</span>
-</span></span></code></pre></div><p>See the <a href=../../../javadoc/0.14.1/org/apache/iceberg/ExpireSnapshots.html><code>ExpireSnapshots</code> Javadoc</a> to see more configuration options.</p><p>There is also a Spark action that can run table expiration in parallel for large tables:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-java data-lang=java><span style=display:flex><span>Table t [...]
+</span></span></code></pre></div><p>See the <a href=../../../javadoc/1.0.0/org/apache/iceberg/ExpireSnapshots.html><code>ExpireSnapshots</code> Javadoc</a> to see more configuration options.</p><p>There is also a Spark action that can run table expiration in parallel for large tables:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-java data-lang=java><span style=display:flex><span>Table ta [...]
 </span></span><span style=display:flex><span>SparkActions
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>get</span><span style=color:#f92672>()</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>expireSnapshots</span><span style=color:#f92672>(</span>table<span style=color:#f92672>)</span>
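The remaining lines of this example are elided by the diff context. As a hedged sketch, the Spark action accepts the same expiration settings as the table API shown earlier, plus a minimum number of snapshots to retain; here `table` and `tsToExpire` are the variables from the snippets above, and the retention count is an arbitrary example.

    // Sketch only: expire snapshots older than the cutoff while always
    // retaining the 100 most recent snapshots (an example value).
    SparkActions
        .get()
        .expireSnapshots(table)
        .expireOlderThan(tsToExpire)
        .retainLast(100)
        .execute();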
@@ -30,7 +30,7 @@ Regularly expiring snapshots deletes unused data files.</div><h3 id=remove-old-m
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>get</span><span style=color:#f92672>()</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>deleteOrphanFiles</span><span style=color:#f92672>(</span>table<span style=color:#f92672>)</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>execute</span><span style=color:#f92672>();</span>
-</span></span></code></pre></div><p>See the <a href=../../../javadoc/0.14.1/org/apache/iceberg/actions/DeleteOrphanFiles.html>DeleteOrphanFiles Javadoc</a> to see more configuration options.</p><p>This action may take a long time to finish if you have lots of files in data and metadata directories. It is recommended to execute this periodically, but you may not need to execute this often.</p><div class=info>It is dangerous to remove orphan files with a retention interval shorter than the [...]
+</span></span></code></pre></div><p>See the <a href=../../../javadoc/1.0.0/org/apache/iceberg/actions/DeleteOrphanFiles.html>DeleteOrphanFiles Javadoc</a> to see more configuration options.</p><p>This action may take a long time to finish if you have lots of files in data and metadata directories. It is recommended to run this periodically, but it may not need to run often.</p><div class=info>It is dangerous to remove orphan files with a retention interval shorter than the  [...]
 might corrupt the table if in-progress files are considered orphaned and are deleted. The default interval is 3 days.</div><div class=info>Iceberg uses the string representations of paths when determining which files need to be removed. On some file systems,
 the path can change over time, but it still represents the same file. For example, if you change authorities for an HDFS cluster,
 none of the old path URLs used during creation will match those that appear in a current listing. <em>This will lead to data loss when
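Relating to the retention warning above, a minimal sketch of passing an explicit cutoff to the orphan-file removal action; the 5-day value is an example only, and anything shorter than the 3-day default increases the risk of deleting in-progress files.

    import java.util.concurrent.TimeUnit;

    // Sketch only: remove orphan files older than an explicit cutoff.
    long olderThanTs = System.currentTimeMillis() - TimeUnit.DAYS.toMillis(5);
    SparkActions
        .get()
        .deleteOrphanFiles(table)
        .olderThan(olderThanTs)
        .execute();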
@@ -42,13 +42,13 @@ FileSystem API to avoid unintentional deletion.</div><h2 id=optional-maintenance
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>filter</span><span style=color:#f92672>(</span>Expressions<span style=color:#f92672>.</span><span style=color:#a6e22e>equal</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;date&#34;</span><span style=color:#f92672>,</span> <span style=color:#e6db74>&#34;2020-08-18&#34;</span><span style=color:#f92672>))</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>option</span><span style=color:#f92672>(</span><span style=color:#e6db74>&#34;target-file-size-bytes&#34;</span><span style=color:#f92672>,</span> Long<span style=color:#f92672>.</span><span style=color:#a6e22e>toString</span><span style=color:#f92672>(</span>500 <span style=color:#f92672>*</span> 1024 <span style=color:#f92672>*</span> 1024<span style=color:#f92672>))</span> <spa [...]
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>execute</span><span style=color:#f92672>();</span>
-</span></span></code></pre></div><p>The <code>files</code> metadata table is useful for inspecting data file sizes and determining when to compact partitions.</p><p>See the <a href=../../../javadoc/0.14.1/org/apache/iceberg/actions/RewriteDataFiles.html><code>RewriteDataFiles</code> Javadoc</a> to see more configuration options.</p><h3 id=rewrite-manifests>Rewrite manifests</h3><p>Iceberg uses metadata in its manifest list and manifest files speed up query planning and to prune unnecessa [...]
+</span></span></code></pre></div><p>The <code>files</code> metadata table is useful for inspecting data file sizes and determining when to compact partitions.</p><p>See the <a href=../../../javadoc/1.0.0/org/apache/iceberg/actions/RewriteDataFiles.html><code>RewriteDataFiles</code> Javadoc</a> to see more configuration options.</p><h3 id=rewrite-manifests>Rewrite manifests</h3><p>Iceberg uses metadata in its manifest list and manifest files to speed up query planning and to prune unnecessar [...]
 </span></span><span style=display:flex><span>SparkActions
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>get</span><span style=color:#f92672>()</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>rewriteManifests</span><span style=color:#f92672>(</span>table<span style=color:#f92672>)</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>rewriteIf</span><span style=color:#f92672>(</span>file <span style=color:#f92672>-&gt;</span> file<span style=color:#f92672>.</span><span style=color:#a6e22e>length</span><span style=color:#f92672>()</span> <span style=color:#f92672>&lt;</span> 10 <span style=color:#f92672>*</span> 1024 <span style=color:#f92672>*</span> 1024<span style=color:#f92672>)</span> <span style=color:#75 [...]
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>    <span style=color:#f92672>.</span><span style=color:#a6e22e>execute</span><span style=color:#f92672>();</span>
-</span></span></code></pre></div><p>See the <a href=../../../javadoc/0.14.1/org/apache/iceberg/actions/RewriteManifests.html><code>RewriteManifests</code> Javadoc</a> to see more configuration options.</p></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#recommended-maintenance>Recommended Maintenance</a><ul><li><a href=#expire-snapshots>Expire Snapshots</a></li><li><a href=#remove-old-metadata-files>Remove old metadata files</a></li><li><a href= [...]
+</span></span></code></pre></div><p>See the <a href=../../../javadoc/1.0.0/org/apache/iceberg/actions/RewriteManifests.html><code>RewriteManifests</code> Javadoc</a> to see more configuration options.</p></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#recommended-maintenance>Recommended Maintenance</a><ul><li><a href=#expire-snapshots>Expire Snapshots</a></li><li><a href=#remove-old-metadata-files>Remove old metadata files</a></li><li><a href=# [...]
 <script src=https://iceberg.apache.org/docs/latest//js/jquery.easing.min.js></script>
 <script type=text/javascript src=https://iceberg.apache.org/docs/latest//js/search.js></script>
 <script src=https://iceberg.apache.org/docs/latest//js/bootstrap.min.js></script>
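The maintenance page notes above that the files metadata table helps decide when to compact. A minimal sketch of such an inspection in Spark (Java), assuming a SparkSession named spark and an example table prod.db.sample:

    // Sketch only: per-partition file counts and average file size from the
    // `files` metadata table, to spot partitions with many small files.
    spark.sql(
        "SELECT partition, COUNT(*) AS file_count, " +
        "       AVG(file_size_in_bytes) AS avg_file_size " +
        "FROM prod.db.sample.files " +
        "GROUP BY partition " +
        "ORDER BY avg_file_size")
      .show(false);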
diff --git a/docs/latest/nessie/index.html b/docs/latest/nessie/index.html
index f5796f9e..adc09cd2 100644
--- a/docs/latest/nessie/index.html
+++ b/docs/latest/nessie/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
@@ -16,8 +16,8 @@
 <i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
 This section describes how to use Iceberg with Nessie. Nessie provides several key features on top of Iceberg:</p><ul><li>multi-table transactions</li><li>git-like operations (e.g. branches, tags, commits)</li><li>hive-like metastore capabilities</li></ul><p>See <a href=https://projectnessie.org>Project Nessie</a> for more information on Nessie. Nessie requires a server to run; see
 <a href=https://projectnessie.org/try/>Getting Started</a> to start a Nessie server.</p><h2 id=enabling-nessie-catalog>Enabling Nessie Catalog</h2><p>The <code>iceberg-nessie</code> module is bundled with Spark and Flink runtimes for all versions from <code>0.11.0</code>. To get started
-with Nessie and Iceberg simply add the Iceberg runtime to your process. Eg: <code>spark-sql --packages org.apache.iceberg:iceberg-spark3-runtime:0.14.1</code>.</p><h2 id=spark-sql-extensions>Spark SQL Extensions</h2><p>From Spark 3.0, Nessie SQL extensions can be used to manage the Nessie repo as shown below.</p><pre tabindex=0><code>bin/spark-sql 
-  --packages &#34;org.apache.iceberg:iceberg-spark3-runtime:0.14.1,org.projectnessie:nessie-spark-extensions:0.20.0&#34;
+with Nessie and Iceberg, simply add the Iceberg runtime to your process, e.g. <code>spark-sql --packages org.apache.iceberg:iceberg-spark3-runtime:1.0.0</code>.</p><h2 id=spark-sql-extensions>Spark SQL Extensions</h2><p>From Spark 3.0, Nessie SQL extensions can be used to manage the Nessie repository as shown below.</p><pre tabindex=0><code>bin/spark-sql 
+  --packages &#34;org.apache.iceberg:iceberg-spark3-runtime:1.0.0,org.projectnessie:nessie-spark-extensions:0.20.0&#34;
   --conf spark.sql.extensions=&#34;org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,org.projectnessie.spark.extensions.NessieSparkSessionExtensions&#34;
   --conf &lt;other settings&gt;
 </code></pre><p>Please refer to the <a href=https://projectnessie.org/tools/sql/>Nessie SQL extension document</a> to learn more about it.</p><h2 id=nessie-catalog>Nessie Catalog</h2><p>One major feature introduced in release <code>0.11.0</code> is the ability to easily interact with a <a href=../custom-catalog>Custom
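The catalog configuration itself falls outside this diff hunk. As a hedged sketch (the catalog name, Nessie URI, branch, and warehouse path are all placeholders for a local Nessie server), a Spark session can register a Nessie-backed Iceberg catalog like this:

    import org.apache.spark.sql.SparkSession;

    // Sketch only: all configuration values below are placeholders.
    SparkSession spark = SparkSession.builder()
        .config("spark.sql.catalog.nessie", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.nessie.catalog-impl", "org.apache.iceberg.nessie.NessieCatalog")
        .config("spark.sql.catalog.nessie.uri", "http://localhost:19120/api/v1")
        .config("spark.sql.catalog.nessie.ref", "main")
        .config("spark.sql.catalog.nessie.warehouse", "s3://my-bucket/warehouse")
        .getOrCreate();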
@@ -38,7 +38,7 @@ and <a href=../flink#custom-catalog>Flink Configuration</a> for instructions for
 </span></span><span style=display:flex><span><span style=color:#f92672>from</span> pyflink.table <span style=color:#f92672>import</span> StreamTableEnvironment
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span>env <span style=color:#f92672>=</span> StreamExecutionEnvironment<span style=color:#f92672>.</span>get_execution_environment()
-</span></span><span style=display:flex><span>iceberg_flink_runtime_jar <span style=color:#f92672>=</span> os<span style=color:#f92672>.</span>path<span style=color:#f92672>.</span>join(os<span style=color:#f92672>.</span>getcwd(), <span style=color:#e6db74>&#34;iceberg-flink-runtime-0.14.1.jar&#34;</span>)
+</span></span><span style=display:flex><span>iceberg_flink_runtime_jar <span style=color:#f92672>=</span> os<span style=color:#f92672>.</span>path<span style=color:#f92672>.</span>join(os<span style=color:#f92672>.</span>getcwd(), <span style=color:#e6db74>&#34;iceberg-flink-runtime-1.0.0.jar&#34;</span>)
 </span></span><span style=display:flex><span>env<span style=color:#f92672>.</span>add_jars(<span style=color:#e6db74>&#34;file://</span><span style=color:#e6db74>{}</span><span style=color:#e6db74>&#34;</span><span style=color:#f92672>.</span>format(iceberg_flink_runtime_jar))
 </span></span><span style=display:flex><span>table_env <span style=color:#f92672>=</span> StreamTableEnvironment<span style=color:#f92672>.</span>create(env)
 </span></span><span style=display:flex><span>
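Continuing the Flink setup above (shown on the page in Python), a hedged Java equivalent of registering the Nessie catalog through Flink SQL; the URI, branch, and warehouse values are placeholders.

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

    // Sketch only: register a Nessie-backed Iceberg catalog via Flink SQL.
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);
    tableEnv.executeSql(
        "CREATE CATALOG nessie_catalog WITH ("
            + "'type'='iceberg',"
            + "'catalog-impl'='org.apache.iceberg.nessie.NessieCatalog',"
            + "'uri'='http://localhost:19120/api/v1',"
            + "'ref'='main',"
            + "'warehouse'='s3://my-bucket/flink-warehouse')");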
diff --git a/docs/latest/partitioning/index.html b/docs/latest/partitioning/index.html
index 80ac0174..ce576e11 100644
--- a/docs/latest/partitioning/index.html
+++ b/docs/latest/partitioning/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class="collapse in"><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a id=active href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/performance/index.html b/docs/latest/performance/index.html
index 676c3cbe..0d59f7b9 100644
--- a/docs/latest/performance/index.html
+++ b/docs/latest/performance/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class="collapse in"><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a id=active href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/python-api-intro/index.html b/docs/latest/python-api-intro/index.html
index 98c224b4..0dc42bae 100644
--- a/docs/latest/python-api-intro/index.html
+++ b/docs/latest/python-api-intro/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/python-feature-support/index.html b/docs/latest/python-feature-support/index.html
index a3883b07..5476d103 100644
--- a/docs/latest/python-feature-support/index.html
+++ b/docs/latest/python-feature-support/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/python-quickstart/index.html b/docs/latest/python-quickstart/index.html
index d46118af..577d5cbb 100644
--- a/docs/latest/python-quickstart/index.html
+++ b/docs/latest/python-quickstart/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/reliability/index.html b/docs/latest/reliability/index.html
index 1ac2762b..890fc813 100644
--- a/docs/latest/reliability/index.html
+++ b/docs/latest/reliability/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class="collapse in"><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a id=active href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/schemas/index.html b/docs/latest/schemas/index.html
index 47d98a9c..806de005 100644
--- a/docs/latest/schemas/index.html
+++ b/docs/latest/schemas/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class="collapse in"><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a id=active href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chev [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/sitemap.xml b/docs/latest/sitemap.xml
index 1c90a27b..b378dab3 100644
--- a/docs/latest/sitemap.xml
+++ b/docs/latest/sitemap.xml
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>https://iceberg.apache.org/docs/latest/getting-started/</loc></url><url><loc>https://iceberg.apache.org/docs/latest/hive/</loc></url><url><loc>https://iceberg.apache.org/docs/latest/aws/</loc></url><url><loc>https://iceberg.apache.org/docs/latest/categories/</loc></url><url><loc>https://iceberg.apache.org/docs/latest/confi [...]
\ No newline at end of file
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>https://iceberg.apache.org/docs/latest/getting-started/</loc></url><url><loc>https://iceberg.apache.org/docs/latest/hive/</loc></url><url><loc>https://iceberg.apache.org/docs/latest/aws/</loc></url><url><loc>https://iceberg.apache.org/docs/latest/categories/</loc></url><url><loc>https://iceberg.apache.org/docs/latest/confi [...]
\ No newline at end of file
diff --git a/docs/latest/spark-configuration/index.html b/docs/latest/spark-configuration/index.html
index 13dbe0c1..d845f189 100644
--- a/docs/latest/spark-configuration/index.html
+++ b/docs/latest/spark-configuration/index.html
@@ -1,13 +1,13 @@
-<!doctype html><html><head><meta charset=utf-8><meta http-equiv=x-ua-compatible content="IE=edge"><meta name=viewport content="width=device-width,initial-scale=1"><meta name=description content><meta name=author content><title>Spark Configuration</title><link href=../css/bootstrap.css rel=stylesheet><link href=../css/markdown.css rel=stylesheet><link href=../css/katex.min.css rel=stylesheet><link href=../css/iceberg-theme.css rel=stylesheet><link href=../font-awesome-4.7.0/css/font-aweso [...]
+<!doctype html><html><head><meta charset=utf-8><meta http-equiv=x-ua-compatible content="IE=edge"><meta name=viewport content="width=device-width,initial-scale=1"><meta name=description content><meta name=author content><title>Configuration</title><link href=../css/bootstrap.css rel=stylesheet><link href=../css/markdown.css rel=stylesheet><link href=../css/katex.min.css rel=stylesheet><link href=../css/iceberg-theme.css rel=stylesheet><link href=../font-awesome-4.7.0/css/font-awesome.min [...]
 <span class=sr-only>Toggle navigation</span>
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-paren [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collaps [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a id=active href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class=collapse><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/spark-ddl/index.html b/docs/latest/spark-ddl/index.html
index ec3c5276..45d9f1e0 100644
--- a/docs/latest/spark-ddl/index.html
+++ b/docs/latest/spark-ddl/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-paren [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a id=active href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a id=active href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-p [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
@@ -45,7 +45,9 @@
 </span></span><span style=display:flex><span><span style=color:#66d9ef>USING</span> iceberg
 </span></span><span style=display:flex><span><span style=color:#66d9ef>AS</span> <span style=color:#66d9ef>SELECT</span> ...
 </span></span></code></pre></div><p>The schema and partition spec will be replaced if changed. To avoid modifying the table&rsquo;s schema and partitioning, use <code>INSERT OVERWRITE</code> instead of <code>REPLACE TABLE</code>.
-The new table properties in the <code>REPLACE TABLE</code> command will be merged with any existing table properties. The existing table properties will be updated if changed else they are preserved.</p><h2 id=drop-table><code>DROP TABLE</code></h2><p>To delete a table, run:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#6 [...]
+The new table properties in the <code>REPLACE TABLE</code> command will be merged with any existing table properties. Existing table properties are updated if changed; otherwise they are preserved.</p><h2 id=drop-table><code>DROP TABLE</code></h2><p>The <code>DROP TABLE</code> behavior changed in 0.14.</p><p>Prior to 0.14, running <code>DROP TABLE</code> would remove the table from the catalog and delete the table contents as well.</p><p>From 0.14 onwards, <code>DROP TABLE</code> only removes  [...]
+To delete the table contents, use <code>DROP TABLE PURGE</code>.</p><h3 id=drop-table-1><code>DROP TABLE</code></h3><p>To drop the table from the catalog, run:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>DROP</span> <span style=color:#66d9ef>TABLE</span> prod.db.sample
+</span></span></code></pre></div><h3 id=drop-table-purge><code>DROP TABLE PURGE</code></h3><p>To drop the table from the catalog and delete the table&rsquo;s contents, run:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>DROP</span> <span style=color:#66d9ef>TABLE</span> prod.db.sample PURGE
 </span></span></code></pre></div><h2 id=alter-table><code>ALTER TABLE</code></h2><p>Iceberg has full <code>ALTER TABLE</code> support in Spark 3, including:</p><ul><li>Renaming a table</li><li>Setting or removing table properties</li><li>Adding, deleting, and renaming columns</li><li>Adding, deleting, and renaming nested fields</li><li>Reordering top-level columns and nested struct fields</li><li>Widening the type of <code>int</code>, <code>float</code>, and <code>decimal</code> fields</ [...]
 </span></span></code></pre></div><h3 id=alter-table--set-tblproperties><code>ALTER TABLE ... SET TBLPROPERTIES</code></h3><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>ALTER</span> <span style=color:#66d9ef>TABLE</span> prod.db.sample <span style=color:#66d9ef>SET</span> TBLPROPERTIES (
 </span></span><span style=display:flex><span>    <span style=color:#e6db74>&#39;read.split.target-size&#39;</span><span style=color:#f92672>=</span><span style=color:#e6db74>&#39;268435456&#39;</span>
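The REPLACE TABLE note above points to INSERT OVERWRITE as the way to rewrite data without touching the schema or partition spec, but that alternative is not shown here. A minimal sketch, reusing the prod.db.sample table from the surrounding examples; staged_data is a hypothetical source table with a matching schema:

    -- keeps the existing schema and partition spec; only data is replaced
    -- (which partitions are overwritten depends on Spark's partition overwrite mode)
    INSERT OVERWRITE prod.db.sample
    SELECT * FROM staged_data  -- staged_data is a hypothetical source table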
@@ -120,7 +122,7 @@ For example, if you partition by days and move to partitioning by hours, overwri
 </span></span></code></pre></div><p>To order within each task, not across tasks, use <code>LOCALLY ORDERED BY</code>:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>ALTER</span> <span style=color:#66d9ef>TABLE</span> prod.db.sample <span style=color:#66d9ef>WRITE</span> LOCALLY ORDERED <span style=color:#66d9ef>BY</ [...]
 </span></span></code></pre></div><h3 id=alter-table--write-distributed-by-partition><code>ALTER TABLE ... WRITE DISTRIBUTED BY PARTITION</code></h3><p><code>WRITE DISTRIBUTED BY PARTITION</code> will request that each partition is handled by one writer; the default implementation is hash distribution.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex [...]
 </span></span></code></pre></div><p><code>DISTRIBUTED BY PARTITION</code> and <code>LOCALLY ORDERED BY</code> may be used together, to distribute by partition and locally order rows within each task.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>ALTER</span> <span style=color:#66d9ef>TABLE</span> prod.db.sample <sp [...]
-</span></span></code></pre></div></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#create-table><code>CREATE TABLE</code></a><ul><li><a href=#partitioned-by><code>PARTITIONED BY</code></a></li></ul></li><li><a href=#create-table--as-select><code>CREATE TABLE ... AS SELECT</code></a></li><li><a href=#replace-table--as-select><code>REPLACE TABLE ... AS SELECT</code></a></li><li><a href=#drop-table><code>DROP TABLE</code></a></li><li><a href=#alter- [...]
+</span></span></code></pre></div></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#create-table><code>CREATE TABLE</code></a><ul><li><a href=#partitioned-by><code>PARTITIONED BY</code></a></li></ul></li><li><a href=#create-table--as-select><code>CREATE TABLE ... AS SELECT</code></a></li><li><a href=#replace-table--as-select><code>REPLACE TABLE ... AS SELECT</code></a></li><li><a href=#drop-table><code>DROP TABLE</code></a><ul><li><a href=#drop-ta [...]
 <script src=https://iceberg.apache.org/docs/latest//js/jquery.easing.min.js></script>
 <script type=text/javascript src=https://iceberg.apache.org/docs/latest//js/search.js></script>
 <script src=https://iceberg.apache.org/docs/latest//js/bootstrap.min.js></script>
diff --git a/docs/latest/spark-procedures/index.html b/docs/latest/spark-procedures/index.html
index 0528ba78..fa8399cf 100644
--- a/docs/latest/spark-procedures/index.html
+++ b/docs/latest/spark-procedures/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-paren [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a id=active href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a id=active href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-p [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
@@ -23,11 +23,11 @@
 </span></span></code></pre></div><h2 id=metadata-management>Metadata management</h2><p>Many <a href=../maintenance>maintenance actions</a> can be performed using Iceberg stored procedures.</p><h3 id=expire_snapshots><code>expire_snapshots</code></h3><p>Each write/update/delete/upsert/compaction in Iceberg produces a new snapshot while keeping the old data and metadata
 around for snapshot isolation and time travel. The <code>expire_snapshots</code> procedure can be used to remove older snapshots
 and their files which are no longer needed.</p><p>This procedure will remove old snapshots and data files which are uniquely required by those old snapshots. This means
-the <code>expire_snapshots</code> procedure will never remove files which are still required by a non-expired snapshot.</p><h4 id=usage-5>Usage</h4><table><thead><tr><th>Argument Name</th><th>Required?</th><th>Type</th><th>Description</th></tr></thead><tbody><tr><td><code>table</code></td><td>✔️</td><td>string</td><td>Name of the table to update</td></tr><tr><td><code>older_than</code></td><td>️</td><td>timestamp</td><td>Timestamp before which snapshots will be removed (Default: 5 days a [...]
+the <code>expire_snapshots</code> procedure will never remove files which are still required by a non-expired snapshot.</p><h4 id=usage-5>Usage</h4><table><thead><tr><th>Argument Name</th><th>Required?</th><th>Type</th><th>Description</th></tr></thead><tbody><tr><td><code>table</code></td><td>✔️</td><td>string</td><td>Name of the table to update</td></tr><tr><td><code>older_than</code></td><td>️</td><td>timestamp</td><td>Timestamp before which snapshots will be removed (Default: 5 days a [...]
 </span></span></code></pre></div><h3 id=remove_orphan_files><code>remove_orphan_files</code></h3><p>Used to remove files which are not referenced in any metadata files of an Iceberg table and can thus be considered &ldquo;orphaned&rdquo;.</p><h4 id=usage-6>Usage</h4><table><thead><tr><th>Argument Name</th><th>Required?</th><th>Type</th><th>Description</th></tr></thead><tbody><tr><td><code>table</code></td><td>✔️</td><td>string</td><td>Name of the table to clean</td></tr><tr><td><code>old [...]
 </span></span></code></pre></div><p>Remove any files in the <code>tablelocation/data</code> folder which are not known to the table <code>db.sample</code>.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>CALL</span> <span style=color:#66d9ef>catalog_name</span>.<span style=color:#66d9ef>system</span>.remove_orphan_fi [...]
-</span></span></code></pre></div><h3 id=rewrite_data_files><code>rewrite_data_files</code></h3><p>Iceberg tracks each data file in a table. More data files leads to more metadata stored in manifest files, and small data files causes an unnecessary amount of metadata and less efficient queries from file open costs.</p><p>Iceberg can compact data files in parallel using Spark with the <code>rewriteDataFiles</code> action. This will combine small files into larger files to reduce metadata o [...]
-and <a href=../../../javadoc/0.14.1/org/apache/iceberg/actions/SortStrategy.html#field.summary><code>SortStrategy</code> Javadoc</a>
+</span></span></code></pre></div><h3 id=rewrite_data_files><code>rewrite_data_files</code></h3><p>Iceberg tracks each data file in a table. More data files lead to more metadata stored in manifest files, and small data files cause an unnecessary amount of metadata and less efficient queries from file open costs.</p><p>Iceberg can compact data files in parallel using Spark with the <code>rewriteDataFiles</code> action. This will combine small files into larger files to reduce metadata o [...]
+and <a href=../../../javadoc/1.0.0/org/apache/iceberg/actions/SortStrategy.html#field.summary><code>SortStrategy</code> Javadoc</a>
 for a list of all the supported options for this action.</p><h4 id=output-6>Output</h4><table><thead><tr><th>Output Name</th><th>Type</th><th>Description</th></tr></thead><tbody><tr><td><code>rewritten_data_files_count</code></td><td>int</td><td>Number of data files which were re-written by this command</td></tr><tr><td><code>added_data_files_count</code></td><td>int</td><td>Number of new data files which were written by this command</td></tr></tbody></table><h4 id=examples-3>Examples</h4><p>Rew [...]
 and also split large files according to the default write size of the table.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>CALL</span> <span style=color:#66d9ef>catalog_name</span>.<span style=color:#66d9ef>system</span>.rewrite_data_files(<span style=color:#e6db74>&#39;db.sample&#39;</span>)
 </span></span></code></pre></div><p>Rewrite the data files in table <code>db.sample</code> by sorting all the data on id and name
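To make the expire_snapshots usage described above concrete, here is a minimal sketch that reuses the catalog_name catalog and db.sample table from the neighboring examples; the cutoff timestamp and retain_last value are illustrative only:

    CALL catalog_name.system.expire_snapshots(
      table => 'db.sample',
      older_than => TIMESTAMP '2021-06-30 00:00:00.000',  -- expire snapshots made before this point
      retain_last => 10                                    -- but always keep the 10 most recent snapshots
    )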
@@ -36,7 +36,7 @@ using the same defaults as bin-pack to determine which files to rewrite.</p><div
 Using the same defaults as bin-pack to determine which files to rewrite.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>CALL</span> <span style=color:#66d9ef>catalog_name</span>.<span style=color:#66d9ef>system</span>.rewrite_data_files(<span style=color:#66d9ef>table</span> <span style=color:#f92672>=&gt;</span> <s [...]
 </span></span></code></pre></div><p>Rewrite the data files in table <code>db.sample</code> using the bin-pack strategy in any partition where 2 or more files need to be rewritten.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>CALL</span> <span style=color:#66d9ef>catalog_name</span>.<span style=color:#66d9ef> [...]
 </span></span></code></pre></div><p>Rewrite the data files in table <code>db.sample</code> and select the files that may contain data matching the filter (id = 3 and name = &ldquo;foo&rdquo;) to be rewritten.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>CALL</span> <span style=color:#66d9ef>catalog_name</span>.<sp [...]
-</span></span></code></pre></div><h3 id=rewrite_manifests><code>rewrite_manifests</code></h3><p>Rewrite manifests for a table to optimize scan planning.</p><p>Data files in manifests are sorted by fields in the partition spec. This procedure runs in parallel using a Spark job.</p><p>See the <a href=../../../javadoc/0.14.1/org/apache/iceberg/actions/RewriteManifests.html><code>RewriteManifests</code> Javadoc</a>
+</span></span></code></pre></div><h3 id=rewrite_manifests><code>rewrite_manifests</code></h3><p>Rewrite manifests for a table to optimize scan planning.</p><p>Data files in manifests are sorted by fields in the partition spec. This procedure runs in parallel using a Spark job.</p><p>See the <a href=../../../javadoc/1.0.0/org/apache/iceberg/actions/RewriteManifests.html><code>RewriteManifests</code> Javadoc</a>
 to see more configuration options.</p><div class=info>This procedure invalidates all cached Spark plans that reference the affected table.</div><h4 id=usage-8>Usage</h4><table><thead><tr><th>Argument Name</th><th>Required?</th><th>Type</th><th>Description</th></tr></thead><tbody><tr><td><code>table</code></td><td>✔️</td><td>string</td><td>Name of the table to update</td></tr><tr><td><code>use_caching</code></td><td>️</td><td>boolean</td><td>Use Spark caching during operation (defaults to [...]
 </span></span></code></pre></div><p>Rewrite the manifests in table <code>db.sample</code> and disable the use of Spark caching. This could be done to avoid memory issues on executors.</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>CALL</span> <span style=color:#66d9ef>catalog_name</span>.<span style=color:#66d9ef>sy [...]
 </span></span></code></pre></div><h2 id=table-migration>Table migration</h2><p>The <code>snapshot</code> and <code>migrate</code> procedures help test and migrate existing Hive or Spark tables to Iceberg.</p><h3 id=snapshot><code>snapshot</code></h3><p>Create a light-weight temporary copy of a table for testing, without changing the source table.</p><p>The newly created table can be changed or written to without affecting the source table, but the snapshot uses the original table&rsquo;s [...]
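The snapshot and migrate procedures described above both operate on an existing Hive or Spark table. A minimal sketch under the same catalog_name convention as the other examples; db.source_table and db.iceberg_snapshot are hypothetical names:

    -- create a temporary Iceberg table over the source table's files, leaving the source unchanged
    CALL catalog_name.system.snapshot('db.source_table', 'db.iceberg_snapshot')

    -- replace the source table itself with an Iceberg table backed by the same data files
    CALL catalog_name.system.migrate('db.source_table')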
diff --git a/docs/latest/spark-queries/index.html b/docs/latest/spark-queries/index.html
index 7318a075..3d97b175 100644
--- a/docs/latest/spark-queries/index.html
+++ b/docs/latest/spark-queries/index.html
@@ -3,32 +3,32 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-paren [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a id=active href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a id=active href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-p [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Integrations class=collapse><ul class=sub-menu><li><a href=../aws/>AWS</a></li><li><a href=../dell/>Dell</a></li><li><a href=../jdbc/>JDBC</a></li><li><a href=../nessie/>Nessie</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-parent=full href=#API><span>API</span>
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
-</span></span></span></code></pre></div><p>Metadata tables, like <code>history</code> and <code>snapshots</code>, can use the Iceberg table name as a namespace.</p><p>For example, to read from the <code>files</code> metadata table for <code>prod.db.table</code>:</p><pre tabindex=0><code>SELECT * FROM prod.db.table.files
-</code></pre><table><thead><tr><th>content</th><th>file_path</th><th>file_format</th><th>spec_id</th><th>partition</th><th>record_count</th><th>file_size_in_bytes</th><th>column_sizes</th><th>value_counts</th><th>null_value_counts</th><th>nan_value_counts</th><th>lower_bounds</th><th>upper_bounds</th><th>key_metadata</th><th>split_offsets</th><th>equality_ids</th><th>sort_order_id</th></tr></thead><tbody><tr><td>0</td><td>s3:/&mldr;/table/data/00000-3-8d6d60e8-d427-4809-bcf0-f5d45a4aad96 [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=API class=collapse><ul class=sub-menu><li><a href=../java-api-quickstart/>Java Quickstart</a></li><li><a href=../api/>Java API</a></li><li><a href=../custom-catalog/>Java Custom Catalog</a></li><li><a href=../python-quickstart/>Python Quickstart</a></li><li><a href=../python-api-intro/>Python API</a></li><li><a href=../python-feature-support/>Python Feature Support</a></li></ul></div><li><a href=https://iceberg.apache.org/docs/latest/../ [...]
+</span></span></span></code></pre></div><p>Metadata tables, like <code>history</code> and <code>snapshots</code>, can use the Iceberg table name as a namespace.</p><p>For example, to read from the <code>files</code> metadata table for <code>prod.db.table</code>:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>SELECT< [...]
+</span></span></code></pre></div><table><thead><tr><th>content</th><th>file_path</th><th>file_format</th><th>spec_id</th><th>partition</th><th>record_count</th><th>file_size_in_bytes</th><th>column_sizes</th><th>value_counts</th><th>null_value_counts</th><th>nan_value_counts</th><th>lower_bounds</th><th>upper_bounds</th><th>key_metadata</th><th>split_offsets</th><th>equality_ids</th><th>sort_order_id</th></tr></thead><tbody><tr><td>0</td><td>s3:/&mldr;/table/data/00000-3-8d6d60e8-d427-48 [...]
 </span></span></code></pre></div><h3 id=catalogs-with-dataframereader>Catalogs with DataFrameReader</h3><p>Iceberg 0.11.0 adds multi-catalog support to <code>DataFrameReader</code> in both Spark 3.x and 2.4.</p><p>Paths and table names can be loaded with Spark&rsquo;s <code>DataFrameReader</code> interface. How tables are loaded depends on how
-the identifier is specified. When using <code>spark.read.format("iceberg").path(table)</code> or <code>spark.table(table)</code> the <code>table</code>
-variable can take a number of forms as listed below:</p><ul><li><code>file:/path/to/table</code>: loads a HadoopTable at given path</li><li><code>tablename</code>: loads <code>currentCatalog.currentNamespace.tablename</code></li><li><code>catalog.tablename</code>: loads <code>tablename</code> from the specified catalog.</li><li><code>namespace.tablename</code>: loads <code>namespace.tablename</code> from current catalog</li><li><code>catalog.namespace.tablename</code>: loads <code>namesp [...]
+the identifier is specified. When using <code>spark.read.format("iceberg").load(table)</code> or <code>spark.table(table)</code> the <code>table</code>
+variable can take a number of forms as listed below:</p><ul><li><code>file:///path/to/table</code>: loads a HadoopTable at given path</li><li><code>tablename</code>: loads <code>currentCatalog.currentNamespace.tablename</code></li><li><code>catalog.tablename</code>: loads <code>tablename</code> from the specified catalog.</li><li><code>namespace.tablename</code>: loads <code>namespace.tablename</code> from current catalog</li><li><code>catalog.namespace.tablename</code>: loads <code>name [...]
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> prod.db.<span style=color:#66d9ef>table</span> <span style=color:#66d9ef>TIMESTAMP</span> <span style=color:#66d9ef>AS</span> <span style=color:#66d9ef>OF</span> <span style=color:#e6db74>&#39;1986-10-26 01:21:00&#39;</span>;
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span><span style=color:#75715e>-- time travel to snapshot with id 10963874102873L
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> prod.db.<span style=color:#66d9ef>table</span> <span style=color:#66d9ef>VERSION</span> <span style=color:#66d9ef>AS</span> <span style=color:#66d9ef>OF</span> <span style=color:#ae81ff>10963874102873</span>;
-</span></span></code></pre></div><p>In addition, <code>FOR SYSTEM_TIME AS OF</code> and <code>FOR SYSTEM_VERSION AS OF</code> clauses are also supported:</p><pre tabindex=0><code>SELECT * FROM prod.db.table FOR SYSTEM_TIME AS OF &#39;1986-10-26 01:21:00&#39;;
-SELECT * FROM prod.db.table FOR SYSTEM_VERSION AS OF 10963874102873;
-</code></pre><p>Timestamps may also be supplied as a Unix timestamp, in seconds:</p><pre tabindex=0><code>-- timestamp in seconds
-SELECT * FROM prod.db.table TIMESTAMP AS OF 499162860;
-SELECT * FROM prod.db.table FOR SYSTEM_TIME AS OF 499162860;
-</code></pre><h4 id=dataframe>DataFrame</h4><p>To select a specific table snapshot or the snapshot at some time in the DataFrame API, Iceberg supports two Spark read options:</p><ul><li><code>snapshot-id</code> selects a specific table snapshot</li><li><code>as-of-timestamp</code> selects the current snapshot at a timestamp, in milliseconds</li></ul><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=langu [...]
+</span></span></code></pre></div><p>In addition, <code>FOR SYSTEM_TIME AS OF</code> and <code>FOR SYSTEM_VERSION AS OF</code> clauses are also supported:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> prod.db.<span style=color:#66 [...]
+</span></span><span style=display:flex><span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> prod.db.<span style=color:#66d9ef>table</span> <span style=color:#66d9ef>FOR</span> SYSTEM_VERSION <span style=color:#66d9ef>AS</span> <span style=color:#66d9ef>OF</span> <span style=color:#ae81ff>10963874102873</span>;
+</span></span></code></pre></div><p>Timestamps may also be supplied as a Unix timestamp, in seconds:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#75715e>-- timestamp in seconds
+</span></span></span><span style=display:flex><span><span style=color:#75715e></span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> prod.db.<span style=color:#66d9ef>table</span> <span style=color:#66d9ef>TIMESTAMP</span> <span style=color:#66d9ef>AS</span> <span style=color:#66d9ef>OF</span> <span style=color:#ae81ff>499162860</span>;
+</span></span><span style=display:flex><span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> prod.db.<span style=color:#66d9ef>table</span> <span style=color:#66d9ef>FOR</span> SYSTEM_TIME <span style=color:#66d9ef>AS</span> <span style=color:#66d9ef>OF</span> <span style=color:#ae81ff>499162860</span>;
+</span></span></code></pre></div><h4 id=dataframe>DataFrame</h4><p>To select a specific table snapshot or the snapshot at some time in the DataFrame API, Iceberg supports two Spark read options:</p><ul><li><code>snapshot-id</code> selects a specific table snapshot</li><li><code>as-of-timestamp</code> selects the current snapshot at a timestamp, in milliseconds</li></ul><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size [...]
 </span></span></span><span style=display:flex><span><span style=color:#75715e></span>spark<span style=color:#f92672>.</span>read
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span>option<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;as-of-timestamp&#34;</span><span style=color:#f92672>,</span> <span style=color:#e6db74>&#34;499162860000&#34;</span><span style=color:#f92672>)</span>
 </span></span><span style=display:flex><span>    <span style=color:#f92672>.</span>format<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;iceberg&#34;</span><span style=color:#f92672>)</span>
@@ -55,9 +55,9 @@ Incremental read is not supported by Spark&rsquo;s SQL syntax.</div><h3 id=spark
 </span></span><span style=display:flex><span>df<span style=color:#f92672>.</span>createOrReplaceTempView<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;table&#34;</span><span style=color:#f92672>)</span>
 </span></span><span style=display:flex><span>
 </span></span><span style=display:flex><span>spark<span style=color:#f92672>.</span>sql<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;&#34;&#34;select count(1) from table&#34;&#34;&#34;</span><span style=color:#f92672>).</span>show<span style=color:#f92672>()</span>
-</span></span></code></pre></div><h2 id=inspecting-tables>Inspecting tables</h2><p>To inspect a table&rsquo;s history, snapshots, and other metadata, Iceberg supports metadata tables.</p><p>Metadata tables are identified by adding the metadata table name after the original table name. For example, history for <code>db.table</code> is read using <code>db.table.history</code>.</p><div class=info><p>For Spark 2.4, use the <code>DataFrameReader</code> API to <a href=#inspecting-with-datafram [...]
-</span></span></code></pre></div><table><thead><tr><th>made_current_at</th><th>snapshot_id</th><th>parent_id</th><th>is_current_ancestor</th></tr></thead><tbody><tr><td>2019-02-08 03:29:51.215</td><td>5781947118336215154</td><td>NULL</td><td>true</td></tr><tr><td>2019-02-08 03:47:55.948</td><td>5179299526185056830</td><td>5781947118336215154</td><td>true</td></tr><tr><td>2019-02-09 16:24:30.13</td><td>296410040247533544</td><td>5179299526185056830</td><td>false</td></tr><tr><td>2019-02-0 [...]
-</span></span></code></pre></div><table><thead><tr><th>timestamp</th><th>file</th><th>latest_snapshot_id</th><th>latest_schema_id</th><th>latest_sequence_number</th></tr></thead><tbody><tr><td>2022-07-28 10:43:52.93</td><td>s3://&mldr;/table/metadata/00000-9441e604-b3c2-498a-a45a-6320e8ab9006.metadata.json</td><td>null</td><td>null</td><td>null</td></tr><tr><td>2022-07-28 10:43:57.487</td><td>s3://&mldr;/table/metadata/00001-f30823df-b745-4a0a-b293-7532e0c99986.metadata.json</td><td>1702 [...]
+</span></span></code></pre></div><h2 id=inspecting-tables>Inspecting tables</h2><p>To inspect a table&rsquo;s history, snapshots, and other metadata, Iceberg supports metadata tables.</p><p>Metadata tables are identified by adding the metadata table name after the original table name. For example, history for <code>db.table</code> is read using <code>db.table.history</code>.</p><div class=info><p>For Spark 2.4, use the <code>DataFrameReader</code> API to <a href=#inspecting-with-datafram [...]
+</span></span></code></pre></div><table><thead><tr><th>made_current_at</th><th>snapshot_id</th><th>parent_id</th><th>is_current_ancestor</th></tr></thead><tbody><tr><td>2019-02-08 03:29:51.215</td><td>5781947118336215154</td><td>NULL</td><td>true</td></tr><tr><td>2019-02-08 03:47:55.948</td><td>5179299526185056830</td><td>5781947118336215154</td><td>true</td></tr><tr><td>2019-02-09 16:24:30.13</td><td>296410040247533544</td><td>5179299526185056830</td><td>false</td></tr><tr><td>2019-02-0 [...]
+</span></span></code></pre></div><table><thead><tr><th>timestamp</th><th>file</th><th>latest_snapshot_id</th><th>latest_schema_id</th><th>latest_sequence_number</th></tr></thead><tbody><tr><td>2022-07-28 10:43:52.93</td><td>s3://&mldr;/table/metadata/00000-9441e604-b3c2-498a-a45a-6320e8ab9006.metadata.json</td><td>null</td><td>null</td><td>null</td></tr><tr><td>2022-07-28 10:43:57.487</td><td>s3://&mldr;/table/metadata/00001-f30823df-b745-4a0a-b293-7532e0c99986.metadata.json</td><td>1702 [...]
 </span></span></code></pre></div><table><thead><tr><th>committed_at</th><th>snapshot_id</th><th>parent_id</th><th>operation</th><th>manifest_list</th><th>summary</th></tr></thead><tbody><tr><td>2019-02-08 03:29:51.215</td><td>57897183625154</td><td>null</td><td>append</td><td>s3://&mldr;/table/metadata/snap-57897183625154-1.avro</td><td>{ added-records -> 2478404, total-records -> 2478404, added-data-files -> 438, total-data-files -> 438, spark.app.id -> application_1520379288616_155055  [...]
 </span></span><span style=display:flex><span>    h.made_current_at,
 </span></span><span style=display:flex><span>    s.<span style=color:#66d9ef>operation</span>,
@@ -68,20 +68,27 @@ Incremental read is not supported by Spark&rsquo;s SQL syntax.</div><h3 id=spark
 </span></span><span style=display:flex><span><span style=color:#66d9ef>join</span> prod.db.<span style=color:#66d9ef>table</span>.snapshots s
 </span></span><span style=display:flex><span>  <span style=color:#66d9ef>on</span> h.snapshot_id <span style=color:#f92672>=</span> s.snapshot_id
 </span></span><span style=display:flex><span><span style=color:#66d9ef>order</span> <span style=color:#66d9ef>by</span> made_current_at
-</span></span></code></pre></div><table><thead><tr><th>made_current_at</th><th>operation</th><th>snapshot_id</th><th>is_current_ancestor</th><th>summary[spark.app.id]</th></tr></thead><tbody><tr><td>2019-02-08 03:29:51.215</td><td>append</td><td>57897183625154</td><td>true</td><td>application_1520379288616_155055</td></tr><tr><td>2019-02-09 16:24:30.13</td><td>delete</td><td>29641004024753</td><td>false</td><td>application_1520379288616_151109</td></tr><tr><td>2019-02-09 16:32:47.336</td [...]
-</span></span></code></pre></div><table><thead><tr><th>content</th><th>file_path</th><th>file_format</th><th>spec_id</th><th>partition</th><th>record_count</th><th>file_size_in_bytes</th><th>column_sizes</th><th>value_counts</th><th>null_value_counts</th><th>nan_value_counts</th><th>lower_bounds</th><th>upper_bounds</th><th>key_metadata</th><th>split_offsets</th><th>equality_ids</th><th>sort_order_id</th></tr></thead><tbody><tr><td>0</td><td>s3:/&mldr;/table/data/00000-3-8d6d60e8-d427-48 [...]
+</span></span></code></pre></div><table><thead><tr><th>made_current_at</th><th>operation</th><th>snapshot_id</th><th>is_current_ancestor</th><th>summary[spark.app.id]</th></tr></thead><tbody><tr><td>2019-02-08 03:29:51.215</td><td>append</td><td>57897183625154</td><td>true</td><td>application_1520379288616_155055</td></tr><tr><td>2019-02-09 16:24:30.13</td><td>delete</td><td>29641004024753</td><td>false</td><td>application_1520379288616_151109</td></tr><tr><td>2019-02-09 16:32:47.336</td [...]
+</span></span></code></pre></div><table><thead><tr><th>content</th><th>file_path</th><th>file_format</th><th>spec_id</th><th>partition</th><th>record_count</th><th>file_size_in_bytes</th><th>column_sizes</th><th>value_counts</th><th>null_value_counts</th><th>nan_value_counts</th><th>lower_bounds</th><th>upper_bounds</th><th>key_metadata</th><th>split_offsets</th><th>equality_ids</th><th>sort_order_id</th></tr></thead><tbody><tr><td>0</td><td>s3:/&mldr;/table/data/00000-3-8d6d60e8-d427-48 [...]
 </span></span></code></pre></div><table><thead><tr><th>path</th><th>length</th><th>partition_spec_id</th><th>added_snapshot_id</th><th>added_data_files_count</th><th>existing_data_files_count</th><th>deleted_data_files_count</th><th>partition_summaries</th></tr></thead><tbody><tr><td>s3://&mldr;/table/metadata/45b5290b-ee61-4788-b324-b1e2735c0e10-m0.avro</td><td>4479</td><td>0</td><td>6668963634911763636</td><td>8</td><td>0</td><td>0</td><td>[[false,null,2019-05-13,2019-05-15]]</td></tr> [...]
-This usually occurs when reading from V1 table, where <code>contains_nan</code> is not populated.</li></ol><h3 id=partitions>Partitions</h3><p>To show a table&rsquo;s current partitions:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</sp [...]
+This usually occurs when reading from a V1 table, where <code>contains_nan</code> is not populated.</li></ol><h3 id=partitions>Partitions</h3><p>To show a table&rsquo;s current partitions:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</sp [...]
 </span></span></code></pre></div><table><thead><tr><th>partition</th><th>record_count</th><th>file_count</th><th>spec_id</th></tr></thead><tbody><tr><td>{20211001, 11}</td><td>1</td><td>1</td><td>0</td></tr><tr><td>{20211002, 11}</td><td>1</td><td>1</td><td>0</td></tr><tr><td>{20211001, 10}</td><td>1</td><td>1</td><td>0</td></tr><tr><td>{20211002, 10}</td><td>1</td><td>1</td><td>0</td></tr></tbody></table><p>Note:
-For unpartitioned tables, the partitions table will contain only the record_count and file_count columns.</p><h3 id=all-metadata-tables>All Metadata Tables</h3><p>These tables are unions of the metadata tables specific to the current snapshot, and return metadata across all snapshots.</p><div class=danger>The &ldquo;all&rdquo; metadata tables may produce more than one row per data file or manifest file because metadata files may be part of more than one table snapshot.</div><h4 id=all-da [...]
-</span></span></code></pre></div><table><thead><tr><th>content</th><th>file_path</th><th>file_format</th><th>partition</th><th>record_count</th><th>file_size_in_bytes</th><th>column_sizes</th><th>value_counts</th><th>null_value_counts</th><th>nan_value_counts</th><th>lower_bounds</th><th>upper_bounds</th><th>key_metadata</th><th>split_offsets</th><th>equality_ids</th><th>sort_order_id</th></tr></thead><tbody><tr><td>0</td><td>s3://&mldr;/dt=20210102/00000-0-756e2512-49ae-45bb-aae3-c0ca47 [...]
+For unpartitioned tables, the partitions table will contain only the record_count and file_count columns.</p><h3 id=all-metadata-tables>All Metadata Tables</h3><p>These tables are unions of the metadata tables specific to the current snapshot, and return metadata across all snapshots.</p><div class=danger>The &ldquo;all&rdquo; metadata tables may produce more than one row per data file or manifest file because metadata files may be part of more than one table snapshot.</div><h4 id=all-da [...]
+</span></span></code></pre></div><table><thead><tr><th>content</th><th>file_path</th><th>file_format</th><th>partition</th><th>record_count</th><th>file_size_in_bytes</th><th>column_sizes</th><th>value_counts</th><th>null_value_counts</th><th>nan_value_counts</th><th>lower_bounds</th><th>upper_bounds</th><th>key_metadata</th><th>split_offsets</th><th>equality_ids</th><th>sort_order_id</th></tr></thead><tbody><tr><td>0</td><td>s3://&mldr;/dt=20210102/00000-0-756e2512-49ae-45bb-aae3-c0ca47 [...]
 </span></span></code></pre></div><table><thead><tr><th>path</th><th>length</th><th>partition_spec_id</th><th>added_snapshot_id</th><th>added_data_files_count</th><th>existing_data_files_count</th><th>deleted_data_files_count</th><th>partition_summaries</th></tr></thead><tbody><tr><td>s3://&mldr;/metadata/a85f78c5-3222-4b37-b7e4-faf944425d48-m0.avro</td><td>6376</td><td>0</td><td>6272782676904868561</td><td>2</td><td>0</td><td>0</td><td>[{false, false, 20210101, 20210101}]</td></tr></tbod [...]
-This usually occurs when reading from V1 table, where <code>contains_nan</code> is not populated.</li></ol><h3 id=references>References</h3><p>To show a table&rsquo;s known snapshot references:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>F [...]
-</span></span></code></pre></div><table><thead><tr><th>name</th><th>type</th><th>snapshot_id</th><th>max_reference_age_in_ms</th><th>min_snapshots_to_keep</th><th>max_snapshot_age_in_ms</th></tr></thead><tbody><tr><td>main</td><td>BRANCH</td><td>4686954189838128572</td><td>10</td><td>20</td><td>30</td></tr><tr><td>testTag</td><td>TAG</td><td>4686954189838128572</td><td>10</td><td>null</td><td>null</td></tr></tbody></table><h2 id=inspecting-with-dataframes>Inspecting with DataFrames</h2>< [...]
-</span></span></span><span style=display:flex><span><span style=color:#75715e></span>spark<span style=color:#f92672>.</span>read<span style=color:#f92672>.</span>format<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;iceberg&#34;</span><span style=color:#f92672>).</span>load<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;db.table.files&#34;</span><span style=color:#f92672>).</span>show<span style=color:#f92672>(</span>truncate <span style=color:#66d9ef>=< [...]
+This usually occurs when reading from a V1 table, where <code>contains_nan</code> is not populated.</li></ol><h3 id=references>References</h3><p>To show a table&rsquo;s known snapshot references:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>F [...]
+</span></span></code></pre></div><table><thead><tr><th>name</th><th>type</th><th>snapshot_id</th><th>max_reference_age_in_ms</th><th>min_snapshots_to_keep</th><th>max_snapshot_age_in_ms</th></tr></thead><tbody><tr><td>main</td><td>BRANCH</td><td>4686954189838128572</td><td>10</td><td>20</td><td>30</td></tr><tr><td>testTag</td><td>TAG</td><td>4686954189838128572</td><td>10</td><td>null</td><td>null</td></tr></tbody></table><h3 id=inspecting-with-dataframes>Inspecting with DataFrames</h3>< [...]
+</span></span></span><span style=display:flex><span><span style=color:#75715e></span>spark<span style=color:#f92672>.</span>read<span style=color:#f92672>.</span>format<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;iceberg&#34;</span><span style=color:#f92672>).</span>load<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;db.table.files&#34;</span><span style=color:#f92672>)</span>
 </span></span><span style=display:flex><span><span style=color:#75715e>// Hadoop path table
-</span></span></span><span style=display:flex><span><span style=color:#75715e></span>spark<span style=color:#f92672>.</span>read<span style=color:#f92672>.</span>format<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;iceberg&#34;</span><span style=color:#f92672>).</span>load<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;hdfs://nn:8020/path/to/table#files&#34;</span><span style=color:#f92672>).</span>show<span style=color:#f92672>(</span>truncate <span st [...]
-</span></span></code></pre></div></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#querying-with-sql>Querying with SQL</a></li><li><a href=#querying-with-dataframes>Querying with DataFrames</a><ul><li><a href=#catalogs-with-dataframereader>Catalogs with DataFrameReader</a></li><li><a href=#time-travel>Time travel</a></li><li><a href=#incremental-read>Incremental read</a></li><li><a href=#spark-24>Spark 2.4</a></li></ul></li><li><a href=#inspectin [...]
+</span></span></span><span style=display:flex><span><span style=color:#75715e></span>spark<span style=color:#f92672>.</span>read<span style=color:#f92672>.</span>format<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;iceberg&#34;</span><span style=color:#f92672>).</span>load<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;hdfs://nn:8020/path/to/table#files&#34;</span><span style=color:#f92672>)</span>
+</span></span></code></pre></div><h3 id=time-travel-with-metadata-tables>Time Travel with Metadata Tables</h3><p>To inspect a table&rsquo;s metadata with the time travel feature:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-sql data-lang=sql><span style=display:flex><span><span style=color:#75715e>-- get the table&#39;s file manifests at timestamp Sep 20, 2021 08:00:00
+</span></span></span><span style=display:flex><span><span style=color:#75715e></span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> prod.db.<span style=color:#66d9ef>table</span>.manifests <span style=color:#66d9ef>TIMESTAMP</span> <span style=color:#66d9ef>AS</span> <span style=color:#66d9ef>OF</span> <span style=color:#e6db74>&#39;2021-09-20 08:00:00&#39;</span>;
+</span></span><span style=display:flex><span>
+</span></span><span style=display:flex><span><span style=color:#75715e>-- get the table&#39;s partitions with snapshot id 10963874102873L
+</span></span></span><span style=display:flex><span><span style=color:#75715e></span><span style=color:#66d9ef>SELECT</span> <span style=color:#f92672>*</span> <span style=color:#66d9ef>FROM</span> prod.db.<span style=color:#66d9ef>table</span>.partitions <span style=color:#66d9ef>VERSION</span> <span style=color:#66d9ef>AS</span> <span style=color:#66d9ef>OF</span> <span style=color:#ae81ff>10963874102873</span>;
+</span></span></code></pre></div><p>Metadata tables can also be inspected with time travel using the DataFrameReader API:</p><div class=highlight><pre tabindex=0 style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code class=language-scala data-lang=scala><span style=display:flex><span><span style=color:#75715e>// load the table&#39;s file metadata at snapshot-id 10963874102873 as DataFrame
+</span></span></span><span style=display:flex><span><span style=color:#75715e></span>spark<span style=color:#f92672>.</span>read<span style=color:#f92672>.</span>format<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;iceberg&#34;</span><span style=color:#f92672>).</span>option<span style=color:#f92672>(</span><span style=color:#e6db74>&#34;snapshot-id&#34;</span><span style=color:#f92672>,</span> <span style=color:#ae81ff>10963874102873L</span><span style=color:#f92672>). [...]
+</span></span></code></pre></div></div><div id=toc class=markdown-body><div id=full><nav id=TableOfContents><ul><li><a href=#querying-with-sql>Querying with SQL</a></li><li><a href=#querying-with-dataframes>Querying with DataFrames</a><ul><li><a href=#catalogs-with-dataframereader>Catalogs with DataFrameReader</a></li><li><a href=#time-travel>Time travel</a></li><li><a href=#incremental-read>Incremental read</a></li><li><a href=#spark-24>Spark 2.4</a></li></ul></li><li><a href=#inspectin [...]
 <script src=https://iceberg.apache.org/docs/latest//js/jquery.easing.min.js></script>
 <script type=text/javascript src=https://iceberg.apache.org/docs/latest//js/search.js></script>
 <script src=https://iceberg.apache.org/docs/latest//js/bootstrap.min.js></script>
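Beyond the SELECT * listings shown above, metadata tables can be filtered and ordered like any other table. A small sketch, assuming the prod.db.table examples used throughout this page, that surfaces the partitions with the most files (a common target for the rewrite_data_files procedure):

    SELECT partition, record_count, file_count
    FROM prod.db.table.partitions
    ORDER BY file_count DESC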
diff --git a/docs/latest/spark-structured-streaming/index.html b/docs/latest/spark-structured-streaming/index.html
index 62f74e52..65d9b457 100644
--- a/docs/latest/spark-structured-streaming/index.html
+++ b/docs/latest/spark-structured-streaming/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-paren [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a id=active href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a id=active href=../spark-structured-streaming/>Structured Streaming</a></li><li><a href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-p [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>
diff --git a/docs/latest/spark-writes/index.html b/docs/latest/spark-writes/index.html
index d807e01e..b2a273ca 100644
--- a/docs/latest/spark-writes/index.html
+++ b/docs/latest/spark-writes/index.html
@@ -3,11 +3,11 @@
 <span class=icon-bar></span>
 <span class=icon-bar></span>
 <span class=icon-bar></span></button>
-<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>0.14.1</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg [...]
+<a class="page-scroll navbar-brand" href=https://iceberg.apache.org/><img class=top-navbar-logo src=https://iceberg.apache.org/docs/latest//img/iceberg-logo-icon.png> Apache Iceberg</a></div><div><input type=search class=form-control id=search-input placeholder=Search... maxlength=64 data-hotkeys=s/></div><div class=versions-dropdown><span>1.0.0</span> <i class="fa fa-chevron-down"></i><div class=versions-dropdown-content><ul><li class=versions-dropdown-selection><a href=https://iceberg. [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Tables class=collapse><ul class=sub-menu><li><a href=../configuration/>Configuration</a></li><li><a href=../evolution/>Evolution</a></li><li><a href=../maintenance/>Maintenance</a></li><li><a href=../partitioning/>Partitioning</a></li><li><a href=../performance/>Performance</a></li><li><a href=../reliability/>Reliability</a></li><li><a href=../schemas/>Schemas</a></li></ul></div><li><a class=chevron-toggle data-toggle=collapse data-paren [...]
 <i class="fa fa-chevron-right"></i>
-<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-configuration/>Spark Configuration</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a id=active href=../spark-writes/>Writes</a></li></ul></div><li [...]
+<i class="fa fa-chevron-down"></i></a></li><div id=Spark class="collapse in"><ul class=sub-menu><li><a href=../spark-ddl/>DDL</a></li><li><a href=../getting-started/>Getting Started</a></li><li><a href=../spark-procedures/>Procedures</a></li><li><a href=../spark-queries/>Queries</a></li><li><a href=../spark-structured-streaming/>Structured Streaming</a></li><li><a id=active href=../spark-writes/>Writes</a></li></ul></div><li><a class="chevron-toggle collapsed" data-toggle=collapse data-p [...]
 <i class="fa fa-chevron-right"></i>
 <i class="fa fa-chevron-down"></i></a></li><div id=Flink class=collapse><ul class=sub-menu><li><a href=../flink/>Enabling Iceberg in Flink</a></li><li><a href=../flink-connector/>Flink Connector</a></li></ul></div><li><a href=../hive/><span>Hive</span></a></li><li><a target=_blank href=https://trino.io/docs/current/connector/iceberg.html><span>Trino</span></a></li><li><a target=_blank href=https://prestodb.io/docs/current/connector/iceberg.html><span>Presto</span></a></li><li><a target=_ [...]
 <i class="fa fa-chevron-right"></i>