Posted to commits@avro.apache.org by mg...@apache.org on 2022/02/01 11:51:45 UTC
[avro] 30/30: AVRO-3339 Rust: Rename crate from avro-rs to apache-avro (#1488)
This is an automated email from the ASF dual-hosted git repository.
mgrigorov pushed a commit to branch branch-1.11
in repository https://gitbox.apache.org/repos/asf/avro.git
commit ee2953e0aabc8bf67872e5006e898c502c16e4e0
Author: Martin Grigorov <ma...@users.noreply.github.com>
AuthorDate: Mon Jan 31 16:06:05 2022 +0200
AVRO-3339 Rust: Rename crate from avro-rs to apache-avro (#1488)
* AVRO-3339 Rust: Rename crate from avro-rs to apache-avro
Signed-off-by: Martin Tzvetanov Grigorov <mg...@apache.org>
* AVRO-3339 Rust: Rename crate from avro-rs to apache-avro
Signed-off-by: Martin Tzvetanov Grigorov <mg...@apache.org>
(cherry picked from commit 7e5bdeec02d90ac63413fad09ce5e54adc087621)
---
lang/rust/Cargo.toml | 4 +-
lang/rust/README.md | 68 ++++++++++++------------
lang/rust/README.tpl | 4 +-
lang/rust/benches/serde.rs | 2 +-
lang/rust/benches/single.rs | 2 +-
lang/rust/build.sh | 4 +-
lang/rust/examples/benchmark.rs | 2 +-
lang/rust/examples/generate_interop_data.rs | 2 +-
lang/rust/examples/test_interop_data.rs | 2 +-
lang/rust/examples/to_value.rs | 2 +-
lang/rust/src/lib.rs | 80 ++++++++++++++---------------
lang/rust/src/rabin.rs | 4 +-
lang/rust/src/reader.rs | 2 +-
lang/rust/tests/io.rs | 2 +-
lang/rust/tests/schema.rs | 2 +-
15 files changed, 91 insertions(+), 91 deletions(-)
diff --git a/lang/rust/Cargo.toml b/lang/rust/Cargo.toml
index 0f230a4..dba7266 100644
--- a/lang/rust/Cargo.toml
+++ b/lang/rust/Cargo.toml
@@ -16,7 +16,7 @@
# under the License.
[package]
-name = "avro-rs"
+name = "apache-avro"
version = "0.14.0"
authors = ["Apache Avro team <de...@avro.apache.org>"]
description = "A library for working with Apache Avro in Rust"
@@ -26,7 +26,7 @@ repository = "https://github.com/apache/avro"
edition = "2018"
keywords = ["avro", "data", "serialization"]
categories = ["encoding"]
-documentation = "https://docs.rs/avro-rs"
+documentation = "https://docs.rs/apache-avro"
[features]
snappy = ["crc32fast", "snap"]
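For downstream users, the rename in this Cargo.toml hunk means updating both the dependency entry and every import: Cargo package names may use a hyphen, while the corresponding Rust path uses an underscore. A migration sketch (the version number is illustrative):

```toml
[dependencies]
# Before the rename:
# avro-rs = "x.y"
# After the rename:
apache-avro = "x.y"
```

In source files, `use avro_rs::Schema;` correspondingly becomes `use apache_avro::Schema;`, which is what the README and `src/` hunks in the rest of this commit do.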
diff --git a/lang/rust/README.md b/lang/rust/README.md
index 0282d51..4c562f7 100644
--- a/lang/rust/README.md
+++ b/lang/rust/README.md
@@ -17,16 +17,16 @@
under the License.
-->
-# avro-rs
+# apache-avro
-[![Latest Version](https://img.shields.io/crates/v/avro-rs.svg)](https://crates.io/crates/avro-rs)
+[![Latest Version](https://img.shields.io/crates/v/apache-avro.svg)](https://crates.io/crates/apache-avro)
[![Rust Continuous Integration](https://github.com/apache/avro/actions/workflows/test-lang-rust-ci.yml/badge.svg)](https://github.com/apache/avro/actions/workflows/test-lang-rust-ci.yml)
-[![Latest Documentation](https://docs.rs/avro-rs/badge.svg)](https://docs.rs/avro-rs)
+[![Latest Documentation](https://docs.rs/apache-avro/badge.svg)](https://docs.rs/apache-avro)
[![Apache License 2.0](https://img.shields.io/badge/license-Apache%202-blue.svg](https://github.com/apache/avro/blob/master/LICENSE.txt)
A library for working with [Apache Avro](https://avro.apache.org/) in Rust.
-Please check our [documentation](https://docs.rs/avro-rs) for examples, tutorials and API reference.
+Please check our [documentation](https://docs.rs/apache-avro) for examples, tutorials and API reference.
**[Apache Avro](https://avro.apache.org/)** is a data serialization system which provides rich
data structures and a compact, fast, binary data format.
@@ -50,7 +50,7 @@ There are basically two ways of handling Avro data in Rust:
* **as generic Rust serde-compatible types** implementing/deriving `Serialize` and
`Deserialize`;
-**avro-rs** provides a way to read and write both these data representations easily and
+**apache-avro** provides a way to read and write both these data representations easily and
efficiently.
## Installing the library
@@ -60,13 +60,13 @@ Add to your `Cargo.toml`:
```toml
[dependencies]
-avro-rs = "x.y"
+apache-avro = "x.y"
```
Or in case you want to leverage the **Snappy** codec:
```toml
-[dependencies.avro-rs]
+[dependencies.apache-avro]
version = "x.y"
features = ["snappy"]
```
@@ -74,7 +74,7 @@ features = ["snappy"]
Or in case you want to leverage the **Zstandard** codec:
```toml
-[dependencies.avro-rs]
+[dependencies.apache-avro]
version = "x.y"
features = ["zstandard"]
```
@@ -82,7 +82,7 @@ features = ["zstandard"]
Or in case you want to leverage the **Bzip2** codec:
```toml
-[dependencies.avro-rs]
+[dependencies.apache-avro]
version = "x.y"
features = ["bzip"]
```
@@ -90,7 +90,7 @@ features = ["bzip"]
Or in case you want to leverage the **Xz** codec:
```toml
-[dependencies.avro-rs]
+[dependencies.apache-avro]
version = "x.y"
features = ["xz"]
```
@@ -110,7 +110,7 @@ handling. Avro schemas are used for both schema validation and resolution of Avr
Avro schemas are defined in **JSON** format and can just be parsed out of a raw string:
```rust
-use avro_rs::Schema;
+use apache_avro::Schema;
let raw_schema = r#"
{
@@ -134,7 +134,7 @@ Additionally, a list of of definitions (which may depend on each other) can be g
them will be parsed into the corresponding schemas.
```rust
-use avro_rs::Schema;
+use apache_avro::Schema;
let raw_schema_1 = r#"{
"name": "A",
@@ -187,8 +187,8 @@ Given that the schema we defined above is that of an Avro *Record*, we are going
associated type provided by the library to specify the data we want to serialize:
```rust
-use avro_rs::types::Record;
-use avro_rs::Writer;
+use apache_avro::types::Record;
+use apache_avro::Writer;
#
// a writer needs a schema and something to write to
let mut writer = Writer::new(&schema, Vec::new());
@@ -213,7 +213,7 @@ case we want to directly define an Avro value, the library offers that capabilit
`Value` interface.
```rust
-use avro_rs::types::Value;
+use apache_avro::types::Value;
let mut value = Value::String("foo".to_string());
```
@@ -224,7 +224,7 @@ Given that the schema we defined above is an Avro *Record*, we can directly use
deriving `Serialize` to model our data:
```rust
-use avro_rs::Writer;
+use apache_avro::Writer;
#[derive(Debug, Serialize)]
struct Test {
@@ -279,8 +279,8 @@ You must enable the `bzip` feature to use this codec.
To specify a codec to use to compress data, just specify it while creating a `Writer`:
```rust
-use avro_rs::Writer;
-use avro_rs::Codec;
+use apache_avro::Writer;
+use apache_avro::Codec;
#
let mut writer = Writer::with_codec(&schema, Vec::new(), Codec::Deflate);
```
@@ -292,7 +292,7 @@ read them. The library will do it automatically for us, as it already does for t
codec:
```rust
-use avro_rs::Reader;
+use apache_avro::Reader;
#
// reader creation can fail in case the input to read from is not Avro-compatible or malformed
let reader = Reader::new(&input[..]).unwrap();
@@ -301,8 +301,8 @@ let reader = Reader::new(&input[..]).unwrap();
In case, instead, we want to specify a different (but compatible) reader schema from the schema
the data has been written with, we can just do as the following:
```rust
-use avro_rs::Schema;
-use avro_rs::Reader;
+use apache_avro::Schema;
+use apache_avro::Reader;
#
let reader_raw_schema = r#"
@@ -341,7 +341,7 @@ interested.
We can just read directly instances of `Value` out of the `Reader` iterator:
```rust
-use avro_rs::Reader;
+use apache_avro::Reader;
#
let reader = Reader::new(&input[..]).unwrap();
@@ -358,8 +358,8 @@ Alternatively, we can use a Rust type implementing `Deserialize` and representin
read the data into:
```rust
-use avro_rs::Reader;
-use avro_rs::from_value;
+use apache_avro::Reader;
+use apache_avro::from_value;
#[derive(Debug, Deserialize)]
struct Test {
@@ -381,7 +381,7 @@ The following is an example of how to combine everything showed so far and it is
quick reference of the library interface:
```rust
-use avro_rs::{Codec, Reader, Schema, Writer, from_value, types::Record, Error};
+use apache_avro::{Codec, Reader, Schema, Writer, from_value, types::Record, Error};
use serde::{Deserialize, Serialize};
#[derive(Debug, Deserialize, Serialize)]
@@ -431,7 +431,7 @@ fn main() -> Result<(), Error> {
}
```
-`avro-rs` also supports the logical types listed in the [Avro specification](https://avro.apache.org/docs/current/spec.html#Logical+Types):
+`apache-avro` also supports the logical types listed in the [Avro specification](https://avro.apache.org/docs/current/spec.html#Logical+Types):
1. `Decimal` using the [`num_bigint`](https://docs.rs/num-bigint/0.2.6/num_bigint) crate
1. UUID using the [`uuid`](https://docs.rs/uuid/0.8.1/uuid) crate
@@ -444,7 +444,7 @@ Note that the on-disk representation is identical to the underlying primitive/co
#### Read and write logical types
```rust
-use avro_rs::{
+use apache_avro::{
types::Record, types::Value, Codec, Days, Decimal, Duration, Millis, Months, Reader, Schema,
Writer, Error,
};
@@ -557,8 +557,8 @@ This library supports calculating the following fingerprints:
An example of fingerprinting for the supported fingerprints:
```rust
-use avro_rs::rabin::Rabin;
-use avro_rs::{Schema, Error};
+use apache_avro::rabin::Rabin;
+use apache_avro::{Schema, Error};
use md5::Md5;
use sha2::Sha256;
@@ -590,7 +590,7 @@ If encoded data passed to a `Reader` has been ill-formed, it can happen that
the bytes meant to contain the length of data are bogus and could result
in extravagant memory allocation.
-To shield users from ill-formed data, `avro-rs` sets a limit (default: 512MB)
+To shield users from ill-formed data, `apache-avro` sets a limit (default: 512MB)
to any allocation it will perform when decoding data.
If you expect some of your data fields to be larger than this limit, be sure
@@ -602,7 +602,7 @@ will be 512MB throughout the lifetime of the program).
```rust
-use avro_rs::max_allocation_bytes;
+use apache_avro::max_allocation_bytes;
max_allocation_bytes(2 * 1024 * 1024 * 1024); // 2GB
@@ -615,7 +615,7 @@ max_allocation_bytes(2 * 1024 * 1024 * 1024); // 2GB
This library supports checking for schemas compatibility.
Note: It does not yet support named schemas (more on
-https://github.com/flavray/avro-rs/pull/76).
+https://github.com/flavray/apache-avro/pull/76).
Examples of checking for compatibility:
@@ -625,7 +625,7 @@ Explanation: an int array schema can be read by a long array schema- an int
(32bit signed integer) fits into a long (64bit signed integer)
```rust
-use avro_rs::{Schema, schema_compatibility::SchemaCompatibility};
+use apache_avro::{Schema, schema_compatibility::SchemaCompatibility};
let writers_schema = Schema::parse_str(r#"{"type": "array", "items":"int"}"#).unwrap();
let readers_schema = Schema::parse_str(r#"{"type": "array", "items":"long"}"#).unwrap();
@@ -638,7 +638,7 @@ Explanation: a long array schema cannot be read by an int array schema- a
long (64bit signed integer) does not fit into an int (32bit signed integer)
```rust
-use avro_rs::{Schema, schema_compatibility::SchemaCompatibility};
+use apache_avro::{Schema, schema_compatibility::SchemaCompatibility};
let writers_schema = Schema::parse_str(r#"{"type": "array", "items":"long"}"#).unwrap();
let readers_schema = Schema::parse_str(r#"{"type": "array", "items":"int"}"#).unwrap();
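The README's `max_allocation_bytes` guard above is, conceptually, a process-wide limit that decoding consults before allocating. A minimal stdlib-only sketch of that pattern (an illustration of the idea, not the crate's actual implementation; `safe_len` is a hypothetical helper name):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Process-wide decoding allocation limit, defaulting to 512 MB as in the README.
static MAX_ALLOCATION_BYTES: AtomicUsize = AtomicUsize::new(512 * 1024 * 1024);

/// Change the limit, returning the previous value.
fn max_allocation_bytes(num_bytes: usize) -> usize {
    MAX_ALLOCATION_BYTES.swap(num_bytes, Ordering::SeqCst)
}

/// Reject a decoded length before allocating a buffer for it.
fn safe_len(len: usize) -> Result<usize, String> {
    let max = MAX_ALLOCATION_BYTES.load(Ordering::SeqCst);
    if len <= max {
        Ok(len)
    } else {
        Err(format!("{} bytes exceeds the allocation limit of {}", len, max))
    }
}

fn main() {
    let _ = max_allocation_bytes(2 * 1024 * 1024 * 1024); // 2GB, as in the README example
    assert!(safe_len(1024).is_ok());
    assert!(safe_len(3 * 1024 * 1024 * 1024).is_err());
    println!("allocation guard ok");
}
```

The atomic makes the limit safe to set from one thread and read from decoders on others, which is why a plain `static mut` would not do here.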
diff --git a/lang/rust/README.tpl b/lang/rust/README.tpl
index 88830d1..a1184ac 100644
--- a/lang/rust/README.tpl
+++ b/lang/rust/README.tpl
@@ -1,8 +1,8 @@
# {{crate}}
-[![Latest Version](https://img.shields.io/crates/v/avro-rs.svg)](https://crates.io/crates/avro-rs)
+[![Latest Version](https://img.shields.io/crates/v/apache-avro.svg)](https://crates.io/crates/apache-avro)
[![Rust Continuous Integration](https://github.com/apache/avro/actions/workflows/test-lang-rust-ci.yml/badge.svg)](https://github.com/apache/avro/actions/workflows/test-lang-rust-ci.yml)
-[![Latest Documentation](https://docs.rs/avro-rs/badge.svg)](https://docs.rs/avro-rs)
+[![Latest Documentation](https://docs.rs/apache-avro/badge.svg)](https://docs.rs/apache-avro)
[![Apache License 2.0](https://img.shields.io/badge/license-Apache%202-blue.svg](https://github.com/apache/avro/blob/master/LICENSE.txt)
{{readme}}
diff --git a/lang/rust/benches/serde.rs b/lang/rust/benches/serde.rs
index 2d74dea..e48fffb 100644
--- a/lang/rust/benches/serde.rs
+++ b/lang/rust/benches/serde.rs
@@ -15,7 +15,7 @@
// specific language governing permissions and limitations
// under the License.
-use avro_rs::{
+use apache_avro::{
schema::Schema,
types::{Record, Value},
Reader, Writer,
diff --git a/lang/rust/benches/single.rs b/lang/rust/benches/single.rs
index 314f20b..3556adb 100644
--- a/lang/rust/benches/single.rs
+++ b/lang/rust/benches/single.rs
@@ -15,7 +15,7 @@
// specific language governing permissions and limitations
// under the License.
-use avro_rs::{
+use apache_avro::{
schema::Schema,
to_avro_datum,
types::{Record, Value},
diff --git a/lang/rust/build.sh b/lang/rust/build.sh
index 2f0a824..7b78acd 100755
--- a/lang/rust/build.sh
+++ b/lang/rust/build.sh
@@ -53,11 +53,11 @@ do
cargo build --release --lib --all-features
cargo package
mkdir -p ../../dist/rust
- cp target/package/avro-rs-*.crate $dist_dir
+ cp target/package/apache-avro-*.crate $dist_dir
;;
interop-data-generate)
prepare_build
- export RUST_LOG=avro_rs=debug
+ export RUST_LOG=apache_avro=debug
export RUST_BACKTRACE=1
cargo run --all-features --example generate_interop_data
;;
diff --git a/lang/rust/examples/benchmark.rs b/lang/rust/examples/benchmark.rs
index 9ec23da..9728ead 100644
--- a/lang/rust/examples/benchmark.rs
+++ b/lang/rust/examples/benchmark.rs
@@ -15,7 +15,7 @@
// specific language governing permissions and limitations
// under the License.
-use avro_rs::{
+use apache_avro::{
schema::Schema,
types::{Record, Value},
Reader, Writer,
diff --git a/lang/rust/examples/generate_interop_data.rs b/lang/rust/examples/generate_interop_data.rs
index 211c9cb..514ee77 100644
--- a/lang/rust/examples/generate_interop_data.rs
+++ b/lang/rust/examples/generate_interop_data.rs
@@ -15,7 +15,7 @@
// specific language governing permissions and limitations
// under the License.
-use avro_rs::{
+use apache_avro::{
schema::Schema,
types::{Record, Value},
Codec, Writer,
diff --git a/lang/rust/examples/test_interop_data.rs b/lang/rust/examples/test_interop_data.rs
index f86c6c4..e04020e 100644
--- a/lang/rust/examples/test_interop_data.rs
+++ b/lang/rust/examples/test_interop_data.rs
@@ -15,7 +15,7 @@
// specific language governing permissions and limitations
// under the License.
-use avro_rs::Reader;
+use apache_avro::Reader;
use std::ffi::OsStr;
fn main() -> anyhow::Result<()> {
diff --git a/lang/rust/examples/to_value.rs b/lang/rust/examples/to_value.rs
index 622554b..69cbe38 100644
--- a/lang/rust/examples/to_value.rs
+++ b/lang/rust/examples/to_value.rs
@@ -23,7 +23,7 @@ struct Test {
fn main() -> anyhow::Result<()> {
let test = Test { a: 27, b: "foo" };
- let value = avro_rs::to_value(test)?;
+ let value = apache_avro::to_value(test)?;
println!("{:?}", value);
Ok(())
}
diff --git a/lang/rust/src/lib.rs b/lang/rust/src/lib.rs
index af6e3cf..3b140ce 100644
--- a/lang/rust/src/lib.rs
+++ b/lang/rust/src/lib.rs
@@ -17,7 +17,7 @@
//! A library for working with [Apache Avro](https://avro.apache.org/) in Rust.
//!
-//! Please check our [documentation](https://docs.rs/avro-rs) for examples, tutorials and API reference.
+//! Please check our [documentation](https://docs.rs/apache-avro) for examples, tutorials and API reference.
//!
//! **[Apache Avro](https://avro.apache.org/)** is a data serialization system which provides rich
//! data structures and a compact, fast, binary data format.
@@ -41,7 +41,7 @@
//! * **as generic Rust serde-compatible types** implementing/deriving `Serialize` and
//! `Deserialize`;
//!
-//! **avro-rs** provides a way to read and write both these data representations easily and
+//! **apache-avro** provides a way to read and write both these data representations easily and
//! efficiently.
//!
//! # Installing the library
@@ -51,13 +51,13 @@
//!
//! ```toml
//! [dependencies]
-//! avro-rs = "x.y"
+//! apache-avro = "x.y"
//! ```
//!
//! Or in case you want to leverage the **Snappy** codec:
//!
//! ```toml
-//! [dependencies.avro-rs]
+//! [dependencies.apache-avro]
//! version = "x.y"
//! features = ["snappy"]
//! ```
@@ -76,7 +76,7 @@
//! Avro schemas are defined in **JSON** format and can just be parsed out of a raw string:
//!
//! ```
-//! use avro_rs::Schema;
+//! use apache_avro::Schema;
//!
//! let raw_schema = r#"
//! {
@@ -100,7 +100,7 @@
//! them will be parsed into the corresponding schemas.
//!
//! ```
-//! use avro_rs::Schema;
+//! use apache_avro::Schema;
//!
//! let raw_schema_1 = r#"{
//! "name": "A",
@@ -153,9 +153,9 @@
//! associated type provided by the library to specify the data we want to serialize:
//!
//! ```
-//! # use avro_rs::Schema;
-//! use avro_rs::types::Record;
-//! use avro_rs::Writer;
+//! # use apache_avro::Schema;
+//! use apache_avro::types::Record;
+//! use apache_avro::Writer;
//! #
//! # let raw_schema = r#"
//! # {
@@ -191,7 +191,7 @@
//! `Value` interface.
//!
//! ```
-//! use avro_rs::types::Value;
+//! use apache_avro::types::Value;
//!
//! let mut value = Value::String("foo".to_string());
//! ```
@@ -202,9 +202,9 @@
//! deriving `Serialize` to model our data:
//!
//! ```
-//! # use avro_rs::Schema;
+//! # use apache_avro::Schema;
//! # use serde::Serialize;
-//! use avro_rs::Writer;
+//! use apache_avro::Writer;
//!
//! #[derive(Debug, Serialize)]
//! struct Test {
@@ -263,9 +263,9 @@
//!
//! To specify a codec to use to compress data, just specify it while creating a `Writer`:
//! ```
-//! # use avro_rs::Schema;
-//! use avro_rs::Writer;
-//! use avro_rs::Codec;
+//! # use apache_avro::Schema;
+//! use apache_avro::Writer;
+//! use apache_avro::Codec;
//! #
//! # let raw_schema = r#"
//! # {
@@ -288,10 +288,10 @@
//! codec:
//!
//! ```
-//! use avro_rs::Reader;
-//! # use avro_rs::Schema;
-//! # use avro_rs::types::Record;
-//! # use avro_rs::Writer;
+//! use apache_avro::Reader;
+//! # use apache_avro::Schema;
+//! # use apache_avro::types::Record;
+//! # use apache_avro::Writer;
//! #
//! # let raw_schema = r#"
//! # {
@@ -317,10 +317,10 @@
//! In case, instead, we want to specify a different (but compatible) reader schema from the schema
//! the data has been written with, we can just do as the following:
//! ```
-//! use avro_rs::Schema;
-//! use avro_rs::Reader;
-//! # use avro_rs::types::Record;
-//! # use avro_rs::Writer;
+//! use apache_avro::Schema;
+//! use apache_avro::Reader;
+//! # use apache_avro::types::Record;
+//! # use apache_avro::Writer;
//! #
//! # let writer_raw_schema = r#"
//! # {
@@ -376,10 +376,10 @@
//! We can just read directly instances of `Value` out of the `Reader` iterator:
//!
//! ```
-//! # use avro_rs::Schema;
-//! # use avro_rs::types::Record;
-//! # use avro_rs::Writer;
-//! use avro_rs::Reader;
+//! # use apache_avro::Schema;
+//! # use apache_avro::types::Record;
+//! # use apache_avro::Writer;
+//! use apache_avro::Reader;
//! #
//! # let raw_schema = r#"
//! # {
@@ -414,11 +414,11 @@
//! read the data into:
//!
//! ```
-//! # use avro_rs::Schema;
-//! # use avro_rs::Writer;
+//! # use apache_avro::Schema;
+//! # use apache_avro::Writer;
//! # use serde::{Deserialize, Serialize};
-//! use avro_rs::Reader;
-//! use avro_rs::from_value;
+//! use apache_avro::Reader;
+//! use apache_avro::from_value;
//!
//! # #[derive(Serialize)]
//! #[derive(Debug, Deserialize)]
@@ -459,7 +459,7 @@
//! quick reference of the library interface:
//!
//! ```
-//! use avro_rs::{Codec, Reader, Schema, Writer, from_value, types::Record, Error};
+//! use apache_avro::{Codec, Reader, Schema, Writer, from_value, types::Record, Error};
//! use serde::{Deserialize, Serialize};
//!
//! #[derive(Debug, Deserialize, Serialize)]
@@ -509,7 +509,7 @@
//! }
//! ```
//!
-//! `avro-rs` also supports the logical types listed in the [Avro specification](https://avro.apache.org/docs/current/spec.html#Logical+Types):
+//! `apache-avro` also supports the logical types listed in the [Avro specification](https://avro.apache.org/docs/current/spec.html#Logical+Types):
//!
//! 1. `Decimal` using the [`num_bigint`](https://docs.rs/num-bigint/0.2.6/num_bigint) crate
//! 1. UUID using the [`uuid`](https://docs.rs/uuid/0.8.1/uuid) crate
@@ -522,7 +522,7 @@
//! ### Read and write logical types
//!
//! ```rust
-//! use avro_rs::{
+//! use apache_avro::{
//! types::Record, types::Value, Codec, Days, Decimal, Duration, Millis, Months, Reader, Schema,
//! Writer, Error,
//! };
@@ -635,8 +635,8 @@
//! An example of fingerprinting for the supported fingerprints:
//!
//! ```rust
-//! use avro_rs::rabin::Rabin;
-//! use avro_rs::{Schema, Error};
+//! use apache_avro::rabin::Rabin;
+//! use apache_avro::{Schema, Error};
//! use md5::Md5;
//! use sha2::Sha256;
//!
@@ -668,7 +668,7 @@
//! the bytes meant to contain the length of data are bogus and could result
//! in extravagant memory allocation.
//!
-//! To shield users from ill-formed data, `avro-rs` sets a limit (default: 512MB)
+//! To shield users from ill-formed data, `apache-avro` sets a limit (default: 512MB)
//! to any allocation it will perform when decoding data.
//!
//! If you expect some of your data fields to be larger than this limit, be sure
@@ -680,7 +680,7 @@
//!
//!
//! ```rust
-//! use avro_rs::max_allocation_bytes;
+//! use apache_avro::max_allocation_bytes;
//!
//! max_allocation_bytes(2 * 1024 * 1024 * 1024); // 2GB
//!
@@ -703,7 +703,7 @@
//! (32bit signed integer) fits into a long (64bit signed integer)
//!
//! ```rust
-//! use avro_rs::{Schema, schema_compatibility::SchemaCompatibility};
+//! use apache_avro::{Schema, schema_compatibility::SchemaCompatibility};
//!
//! let writers_schema = Schema::parse_str(r#"{"type": "array", "items":"int"}"#).unwrap();
//! let readers_schema = Schema::parse_str(r#"{"type": "array", "items":"long"}"#).unwrap();
@@ -716,7 +716,7 @@
//! long (64bit signed integer) does not fit into an int (32bit signed integer)
//!
//! ```rust
-//! use avro_rs::{Schema, schema_compatibility::SchemaCompatibility};
+//! use apache_avro::{Schema, schema_compatibility::SchemaCompatibility};
//!
//! let writers_schema = Schema::parse_str(r#"{"type": "array", "items":"long"}"#).unwrap();
//! let readers_schema = Schema::parse_str(r#"{"type": "array", "items":"int"}"#).unwrap();
diff --git a/lang/rust/src/rabin.rs b/lang/rust/src/rabin.rs
index d7cb773..e1ede43 100644
--- a/lang/rust/src/rabin.rs
+++ b/lang/rust/src/rabin.rs
@@ -45,7 +45,7 @@ lazy_static! {
/// This is what is used for avro [single object encoding](https://avro.apache.org/docs/current/spec.html#single_object_encoding)
///
/// ```rust
-/// use avro_rs::rabin::Rabin;
+/// use apache_avro::rabin::Rabin;
/// use digest::Digest;
/// use hex_literal::hex;
///
@@ -64,7 +64,7 @@ lazy_static! {
/// To convert the digest to the commonly used 64-bit integer value, you can use the byteorder crate:
///
/// ```rust
-/// # use avro_rs::rabin::Rabin;
+/// # use apache_avro::rabin::Rabin;
/// # use digest::Digest;
/// # use hex_literal::hex;
///
diff --git a/lang/rust/src/reader.rs b/lang/rust/src/reader.rs
index 9634a27..d46b3bd 100644
--- a/lang/rust/src/reader.rs
+++ b/lang/rust/src/reader.rs
@@ -191,7 +191,7 @@ impl<R: Read> Block<R> {
/// To be used as an iterator:
///
/// ```no_run
-/// # use avro_rs::Reader;
+/// # use apache_avro::Reader;
/// # use std::io::Cursor;
/// # let input = Cursor::new(Vec::<u8>::new());
/// for value in Reader::new(input).unwrap() {
diff --git a/lang/rust/tests/io.rs b/lang/rust/tests/io.rs
index 4714493..18b3c70 100644
--- a/lang/rust/tests/io.rs
+++ b/lang/rust/tests/io.rs
@@ -16,7 +16,7 @@
// under the License.
//! Port of https://github.com/apache/avro/blob/release-1.9.1/lang/py/test/test_io.py
-use avro_rs::{from_avro_datum, to_avro_datum, types::Value, Error, Schema};
+use apache_avro::{from_avro_datum, to_avro_datum, types::Value, Error, Schema};
use lazy_static::lazy_static;
use std::io::Cursor;
diff --git a/lang/rust/tests/schema.rs b/lang/rust/tests/schema.rs
index 77b6569..d7ff3e4 100644
--- a/lang/rust/tests/schema.rs
+++ b/lang/rust/tests/schema.rs
@@ -15,7 +15,7 @@
// specific language governing permissions and limitations
// under the License.
-use avro_rs::{
+use apache_avro::{
schema::{Name, RecordField},
types::{Record, Value},
Codec, Error, Reader, Schema, Writer,