Posted to commits@arrow.apache.org by ko...@apache.org on 2022/11/15 08:50:56 UTC

[arrow] branch maint-10.0.x created (now 8f47806ce5)

This is an automated email from the ASF dual-hosted git repository.

kou pushed a change to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git


      at 8f47806ce5 ARROW-18315: [CI][deb][RPM] Pin createrepo_c on Travis CI arm64-graviton (#14625)

This branch includes the following new commits:

     new 22ad639d31 ARROW-18078: [Docs][R] Fix broken link in R documentation (#14437)
     new 5f66d0e484 ARROW-18092: [CI][Conan] Update versions of gRPC related dependencies (#14453)
     new f1343f1b78 ARROW-18093: [CI][Conda][Windows] Disable ORC (#14454)
     new 8d76062d45 MINOR: [R] [Docs] Fix link in ?cast documentation (#14455)
     new d05dd5482d MINOR: [Release] Fix loop in universal2 wheel verification (#14479)
     new 0f4ba5360d ARROW-18132: [R] Add deprecation cycle for pull() change (#14475)
     new 9608485fa9 ARROW-18131: [R] Correctly handle .data pronoun in group_by() (#14484)
     new 59809769f5 ARROW-18054: [Python][CI] Enable Cython tests on windows wheels (#13552)
     new 9d66933ecb ARROW-18190: [CI][Packaging] Fix macOS mojave wheels test step to actually test wheels (#14540)
     new c67738f555 ARROW-18186: [C++][MinGW] Make buildable with clang (#14536)
     new 60bbf14285 ARROW-17487: [Python][Packaging][CI] Add support for Python 3.11 (#14499)
     new c3353fad50 ARROW-18274: [Go] StructBuilder premature release fields (#14604)
     new 89f1b91a86 ARROW-18294: [Java] Fix Flight SQL JDBC PreparedStatement#executeUpdate (#14616)
     new 8e8f6bf5ad ARROW-18260: [C++][CMake] Add support for x64 for CMAKE_SYSTEM_PROCESSOR (#14598)
     new b7b92fa88e ARROW-18080: [C++] Remove gcc <= 4.9 workarounds (#14441)
     new 82cf616704 ARROW-17635: [Python][CI] Sync conda recipe with the arrow-cpp feedstock (#14102)
     new f63804f976 ARROW-18162: [C++] Add Arm SVE compiler options (#14515)
     new 73da8d3b73 ARROW-18255: [C++] Don't fail cmake check for armv6 (#14611)
     new 26b0ec4db3 ARROW-18296: [Java] Honor Driver#connect API contract in JDBC driver (#14617)
     new da638b2b05 ARROW-18285: [R] Fix for failing test after lubridate 1.9 release (#14615)
     new 09467cfe1b ARROW-18299: [CI][GLib][macOS] Fix dependency install failures (#14618)
     new e79870d468 ARROW-18310: [C++] Use atomic backpressure counter (#14622)
     new 7e505a90dc ARROW-18251: [CI][Python] Fix AMD64 macOS 12 Python 3 job (#14614)
     new fb95aee543 ARROW-18302: [Python][Packaging] Update minimum vcpkg required version (#14634)
     new 74c6a3589b ARROW-18325: [Docs][Python] Add Python 3.11 to python/install.rst (#14630)
     new f15f771e27 ARROW-18327: [CI][Release] verify-rc-source-*-macos-amd64 are failed (#14640)
     new 8f47806ce5 ARROW-18315: [CI][deb][RPM] Pin createrepo_c on Travis CI arm64-graviton (#14625)

The 27 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[arrow] 09/27: ARROW-18190: [CI][Packaging] Fix macOS mojave wheels test step to actually test wheels (#14540)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 9d66933ecbd4e7758635e1ad1c47e3366c5afd69
Author: Raúl Cumplido <ra...@gmail.com>
AuthorDate: Mon Oct 31 11:12:34 2022 +0100

    ARROW-18190: [CI][Packaging] Fix macOS mojave wheels test step to actually test wheels (#14540)
    
    Authored-by: Raúl Cumplido <ra...@gmail.com>
    Signed-off-by: Antoine Pitrou <an...@python.org>
---
 dev/tasks/python-wheels/github.osx.amd64.yml | 22 ++++++++--------------
 1 file changed, 8 insertions(+), 14 deletions(-)

diff --git a/dev/tasks/python-wheels/github.osx.amd64.yml b/dev/tasks/python-wheels/github.osx.amd64.yml
index 8044fc39fb..526412f842 100644
--- a/dev/tasks/python-wheels/github.osx.amd64.yml
+++ b/dev/tasks/python-wheels/github.osx.amd64.yml
@@ -108,23 +108,17 @@ jobs:
           name: wheel
           path: arrow/python/repaired_wheels/*.whl
 
-      - name: Test Wheel
+      - name: Test Wheel on AMD64
         shell: bash
         env:
-          TEST_DEFAULT: "0"
-          TEST_PYARROW_VERSION: "{{ arrow.no_rc_version }}"
-          TEST_WHEEL: "1"
+          PYTEST_ADDOPTS: "-k 'not test_cancellation'"
         run: |
-          case "${PYTHON_VERSION}" in
-            3.7)
-              python_version=${PYTHON_VERSION}m
-              ;;
-            *)
-              python_version=${PYTHON_VERSION}
-              ;;
-          esac
-          export TEST_PYTHON_VERSIONS=${python_version}
-          arrow/dev/release/verify-release-candidate.sh
+          $PYTHON -m venv test-amd64-env
+          source test-amd64-env/bin/activate
+          pip install --upgrade pip wheel
+          arch -x86_64 pip install -r arrow/python/requirements-wheel-test.txt
+          PYTHON=python arch -x86_64 arrow/ci/scripts/install_gcs_testbench.sh default
+          arch -x86_64 arrow/ci/scripts/python_wheel_unix_test.sh $(pwd)/arrow
 
       {{ macros.github_upload_releases("arrow/python/repaired_wheels/*.whl")|indent }}
       {{ macros.github_upload_gemfury("arrow/python/repaired_wheels/*.whl")|indent }}


[arrow] 12/27: ARROW-18274: [Go] StructBuilder premature release fields (#14604)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit c3353fad507595e4eb94a7b9f1dc6ed44cbc217c
Author: Matt Topol <zo...@gmail.com>
AuthorDate: Tue Nov 8 09:40:11 2022 -0500

    ARROW-18274: [Go] StructBuilder premature release fields (#14604)
    
    Authored-by: Matt Topol <zo...@gmail.com>
    Signed-off-by: Matt Topol <zo...@gmail.com>
---
 go/arrow/array/struct.go     | 11 ++++---
 go/arrow/array/union_test.go | 75 ++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 82 insertions(+), 4 deletions(-)

diff --git a/go/arrow/array/struct.go b/go/arrow/array/struct.go
index 213febfa41..4901d26d1d 100644
--- a/go/arrow/array/struct.go
+++ b/go/arrow/array/struct.go
@@ -90,7 +90,10 @@ func (a *Struct) String() string {
 		if i > 0 {
 			o.WriteString(" ")
 		}
-		if !bytes.Equal(structBitmap, v.NullBitmapBytes()) {
+		if arrow.IsUnion(v.DataType().ID()) {
+			fmt.Fprintf(o, "%v", v)
+			continue
+		} else if !bytes.Equal(structBitmap, v.NullBitmapBytes()) {
 			masked := a.newStructFieldWithParentValidityMask(i)
 			fmt.Fprintf(o, "%v", masked)
 			masked.Release()
@@ -234,10 +237,10 @@ func (b *StructBuilder) Release() {
 			b.nullBitmap.Release()
 			b.nullBitmap = nil
 		}
-	}
 
-	for _, f := range b.fields {
-		f.Release()
+		for _, f := range b.fields {
+			f.Release()
+		}
 	}
 }
 
diff --git a/go/arrow/array/union_test.go b/go/arrow/array/union_test.go
index ca6122c0ae..036031892d 100644
--- a/go/arrow/array/union_test.go
+++ b/go/arrow/array/union_test.go
@@ -17,6 +17,7 @@
 package array_test
 
 import (
+	"fmt"
 	"strings"
 	"testing"
 
@@ -946,7 +947,81 @@ func (s *UnionBuilderSuite) TestSparseUnionStructWithUnion() {
 	s.Truef(arrow.TypeEqual(expectedType, bldr.Type()), "expected: %s, got: %s", expectedType, bldr.Type())
 }
 
+func ExampleSparseUnionBuilder() {
+	dt1 := arrow.SparseUnionOf([]arrow.Field{
+		{Name: "c", Type: &arrow.DictionaryType{IndexType: arrow.PrimitiveTypes.Uint16, ValueType: arrow.BinaryTypes.String}},
+	}, []arrow.UnionTypeCode{0})
+	dt2 := arrow.StructOf(arrow.Field{Name: "a", Type: dt1})
+
+	pool := memory.DefaultAllocator
+	bldr := array.NewStructBuilder(pool, dt2)
+	defer bldr.Release()
+
+	bldrDt1 := bldr.FieldBuilder(0).(*array.SparseUnionBuilder)
+	binDictBldr := bldrDt1.Child(0).(*array.BinaryDictionaryBuilder)
+
+	bldr.Append(true)
+	bldrDt1.Append(0)
+	binDictBldr.AppendString("foo")
+
+	bldr.Append(true)
+	bldrDt1.Append(0)
+	binDictBldr.AppendString("bar")
+
+	out := bldr.NewArray().(*array.Struct)
+	defer out.Release()
+
+	fmt.Println(out)
+
+	// Output:
+	// {[{c=foo} {c=bar}]}
+}
+
 func TestUnions(t *testing.T) {
 	suite.Run(t, new(UnionFactorySuite))
 	suite.Run(t, new(UnionBuilderSuite))
 }
+
+func TestNestedUnionStructDict(t *testing.T) {
+	// ARROW-18274
+	dt1 := arrow.SparseUnionOf([]arrow.Field{
+		{Name: "c", Type: &arrow.DictionaryType{
+			IndexType: arrow.PrimitiveTypes.Uint16,
+			ValueType: arrow.BinaryTypes.String,
+			Ordered:   false,
+		}},
+	}, []arrow.UnionTypeCode{0})
+	dt2 := arrow.StructOf(
+		arrow.Field{Name: "b", Type: dt1},
+	)
+	dt3 := arrow.SparseUnionOf([]arrow.Field{
+		{Name: "a", Type: dt2},
+	}, []arrow.UnionTypeCode{0})
+	pool := memory.NewGoAllocator()
+
+	builder := array.NewSparseUnionBuilder(pool, dt3)
+	defer builder.Release()
+	arr := builder.NewArray()
+	defer arr.Release()
+	assert.Equal(t, 0, arr.Len())
+}
+
+func TestNestedUnionDictUnion(t *testing.T) {
+	dt1 := arrow.SparseUnionOf([]arrow.Field{
+		{Name: "c", Type: &arrow.DictionaryType{
+			IndexType: arrow.PrimitiveTypes.Uint16,
+			ValueType: arrow.BinaryTypes.String,
+			Ordered:   false,
+		}},
+	}, []arrow.UnionTypeCode{0})
+	dt2 := arrow.SparseUnionOf([]arrow.Field{
+		{Name: "a", Type: dt1},
+	}, []arrow.UnionTypeCode{0})
+	pool := memory.NewGoAllocator()
+
+	builder := array.NewSparseUnionBuilder(pool, dt2)
+	defer builder.Release()
+	arr := builder.NewArray()
+	defer arr.Release()
+	assert.Equal(t, 0, arr.Len())
+}


[arrow] 10/27: ARROW-18186: [C++][MinGW] Make buildable with clang (#14536)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit c67738f555f5db6435016a7901ad2c566c46c7d2
Author: Sutou Kouhei <ko...@clear-code.com>
AuthorDate: Wed Nov 2 05:33:10 2022 +0900

    ARROW-18186: [C++][MinGW] Make buildable with clang (#14536)
    
    Error1 (can't use `[[gnu::dllexport]]` with template):
    
        cpp/src/arrow/util/int_util.cc:463:1: error: an attribute list cannot appear here
        INSTANTIATE_ALL()
        ^~~~~~~~~~~~~~~~~
        cpp/src/arrow/util/int_util.cc:454:3: note: expanded from macro 'INSTANTIATE_ALL'
          INSTANTIATE_ALL_DEST(uint8_t)  \
          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        cpp/src/arrow/util/int_util.cc:444:3: note: expanded from macro 'INSTANTIATE_ALL_DEST'
          INSTANTIATE(uint8_t, DEST)       \
          ^~~~~~~~~~~~~~~~~~~~~~~~~~
        cpp/src/arrow/util/int_util.cc:440:12: note: expanded from macro 'INSTANTIATE'
          template ARROW_TEMPLATE_EXPORT void TransposeInts( \
                   ^~~~~~~~~~~~~~~~~~~~~
        cpp/src/arrow/util/visibility.h:47:31: note: expanded from macro 'ARROW_TEMPLATE_EXPORT'
        #define ARROW_TEMPLATE_EXPORT ARROW_DLLEXPORT
                                      ^~~~~~~~~~~~~~~
        cpp/src/arrow/util/visibility.h:32:25: note: expanded from macro 'ARROW_DLLEXPORT'
        #define ARROW_DLLEXPORT [[gnu::dllexport]]
                                ^~~~~~~~~~~~~~~~~~
    
    Error2 (unused variable):
    
        cpp/src/arrow/util/io_util.cc:1079:7: warning: variable 'oflag' set but not used [-Wunused-but-set-variable]
          int oflag = _O_CREAT | _O_BINARY | _O_NOINHERIT;
              ^
    
    Error3 (missing field initializers):
    
        cpp/src/arrow/util/io_util.cc:1545:29: warning: missing field 'InternalHigh' initializer [-Wmissing-field-initializers]
          OVERLAPPED overlapped = {0};
                                    ^
    
    Authored-by: Sutou Kouhei <ko...@clear-code.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
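
All three build failures come down to clang being stricter than GCC about
attribute placement and aggregate initializers. The sketch below is a
minimal, self-contained illustration of the first and third problem, not
the actual arrow/util/visibility.h: the names MY_TEMPLATE_EXPORT, Increment
and Overlappedish are made up, and the __declspec(dllexport) fallback is
only an assumption about what the non-GCC branch of that header does.

    // Pick an export-attribute spelling per compiler. GCC needs the C++
    // attribute form to work around a parser bug with combined attributes,
    // but clang rejects that form on an explicit template instantiation
    // ("an attribute list cannot appear here").
    #if defined(_WIN32) && defined(__clang__)
    #  define MY_TEMPLATE_EXPORT __declspec(dllexport)
    #elif defined(_WIN32) && defined(__GNUC__)
    #  define MY_TEMPLATE_EXPORT [[gnu::dllexport]]
    #else
    #  define MY_TEMPLATE_EXPORT  // nothing to export off Windows
    #endif

    template <typename T>
    T Increment(T value) {
      return value + 1;
    }

    // The explicit instantiation clang chokes on when the macro expands to
    // the [[gnu::dllexport]] spelling.
    template MY_TEMPLATE_EXPORT int Increment<int>(int);

    // Error3 from the log: with -Wextra, clang warns on "= {0}" because
    // only the first field gets an explicit initializer; empty braces
    // value-initialize every field and keep -Wmissing-field-initializers
    // quiet.
    struct Overlappedish {
      unsigned long internal;
      unsigned long offset;
      unsigned long offset_high;
    };

    int main() {
      Overlappedish o = {};  // instead of "= {0}"
      o.offset = 42;
      return Increment(static_cast<int>(o.offset)) == 43 ? 0 : 1;
    }

The change to visibility.h in the diff below follows the same idea: it
simply stops defining ARROW_DLLEXPORT as [[gnu::dllexport]] when __clang__
is defined.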
---
 .github/workflows/cpp.yml                   | 27 +++++++++++++++------------
 cpp/cmake_modules/ThirdpartyToolchain.cmake |  6 +++---
 cpp/src/arrow/util/io_util.cc               | 13 ++-----------
 cpp/src/arrow/util/visibility.h             |  2 +-
 4 files changed, 21 insertions(+), 27 deletions(-)

diff --git a/.github/workflows/cpp.yml b/.github/workflows/cpp.yml
index 3d6b89f2f5..2278f075a8 100644
--- a/.github/workflows/cpp.yml
+++ b/.github/workflows/cpp.yml
@@ -293,18 +293,21 @@ jobs:
           ci/scripts/cpp_test.sh $(pwd) $(pwd)/build
 
   windows-mingw:
-    name: AMD64 Windows MinGW ${{ matrix.mingw-n-bits }} C++
+    name: AMD64 Windows MinGW ${{ matrix.msystem_upper }} C++
     runs-on: windows-2019
     if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
-    # Build may take 1h+ without cache and installing Google Cloud
-    # Storage Testbench may take 20m+ without cache.
+    # Build may take 1h+ without cache.
     timeout-minutes: 120
     strategy:
       fail-fast: false
       matrix:
-        mingw-n-bits:
-          - 32
-          - 64
+        include:
+          - msystem_lower: mingw32
+            msystem_upper: MINGW32
+          - msystem_lower: mingw64
+            msystem_upper: MINGW64
+          - msystem_lower: clang64
+            msystem_upper: CLANG64
     env:
       ARROW_BUILD_SHARED: ON
       ARROW_BUILD_STATIC: OFF
@@ -316,10 +319,9 @@ jobs:
       ARROW_GANDIVA: ON
       ARROW_GCS: ON
       ARROW_HDFS: OFF
-      ARROW_HOME: /mingw${{ matrix.mingw-n-bits }}
+      ARROW_HOME: /${{ matrix.msystem_lower}}
       ARROW_JEMALLOC: OFF
       ARROW_PARQUET: ON
-      ARROW_PYTHON: ON
       ARROW_S3: ON
       ARROW_USE_GLOG: OFF
       ARROW_VERBOSE_THIRDPARTY_BUILD: OFF
@@ -334,11 +336,12 @@ jobs:
       # -DBoost_NO_BOOST_CMAKE=ON
       BOOST_ROOT: ""
       CMAKE_ARGS: >-
-        -DARROW_PACKAGE_PREFIX=/mingw${{ matrix.mingw-n-bits }}
+        -DARROW_PACKAGE_PREFIX=/${{ matrix.msystem_lower}}
         -DBoost_NO_BOOST_CMAKE=ON
       # We can't use unity build because we don't have enough memory on
       # GitHub Actions.
       # CMAKE_UNITY_BUILD: ON
+      GTest_SOURCE: BUNDLED
     steps:
       - name: Disable Crash Dialogs
         run: |
@@ -355,7 +358,7 @@ jobs:
           submodules: recursive
       - uses: msys2/setup-msys2@v2
         with:
-          msystem: MINGW${{ matrix.mingw-n-bits }}
+          msystem: ${{ matrix.msystem_upper }}
           update: true
       - name: Setup MSYS2
         shell: msys2 {0}
@@ -364,8 +367,8 @@ jobs:
         uses: actions/cache@v3
         with:
           path: ccache
-          key: cpp-ccache-mingw${{ matrix.mingw-n-bits }}-${{ hashFiles('cpp/**') }}
-          restore-keys: cpp-ccache-mingw${{ matrix.mingw-n-bits }}-
+          key: cpp-ccache-${{ matrix.msystem_lower}}-${{ hashFiles('cpp/**') }}
+          restore-keys: cpp-ccache-${{ matrix.msystem_lower}}-
       - name: Build
         shell: msys2 {0}
         run: |
diff --git a/cpp/cmake_modules/ThirdpartyToolchain.cmake b/cpp/cmake_modules/ThirdpartyToolchain.cmake
index b7cd31f3d7..b1c3201894 100644
--- a/cpp/cmake_modules/ThirdpartyToolchain.cmake
+++ b/cpp/cmake_modules/ThirdpartyToolchain.cmake
@@ -1947,7 +1947,7 @@ macro(build_gtest)
                               -Wno-unused-value -Wno-ignored-attributes)
   endif()
 
-  if(MSVC)
+  if(WIN32)
     set(GTEST_CMAKE_CXX_FLAGS "${GTEST_CMAKE_CXX_FLAGS} -DGTEST_CREATE_SHARED_LIBRARY=1")
   endif()
 
@@ -1956,7 +1956,7 @@ macro(build_gtest)
 
   set(_GTEST_LIBRARY_DIR "${GTEST_PREFIX}/lib")
 
-  if(MSVC)
+  if(WIN32)
     set(_GTEST_IMPORTED_TYPE IMPORTED_IMPLIB)
     set(_GTEST_LIBRARY_SUFFIX
         "${CMAKE_GTEST_DEBUG_EXTENSION}${CMAKE_IMPORT_LIBRARY_SUFFIX}")
@@ -1989,7 +1989,7 @@ macro(build_gtest)
       -DCMAKE_MACOSX_RPATH=OFF)
   set(GMOCK_INCLUDE_DIR "${GTEST_PREFIX}/include")
 
-  if(MSVC AND NOT ARROW_USE_STATIC_CRT)
+  if(WIN32 AND NOT ARROW_USE_STATIC_CRT)
     set(GTEST_CMAKE_ARGS ${GTEST_CMAKE_ARGS} -Dgtest_force_shared_crt=ON)
   endif()
 
diff --git a/cpp/src/arrow/util/io_util.cc b/cpp/src/arrow/util/io_util.cc
index a62040f3a7..571b49c1d7 100644
--- a/cpp/src/arrow/util/io_util.cc
+++ b/cpp/src/arrow/util/io_util.cc
@@ -1076,24 +1076,15 @@ Result<FileDescriptor> FileOpenWritable(const PlatformFilename& file_name,
   FileDescriptor fd;
 
 #if defined(_WIN32)
-  int oflag = _O_CREAT | _O_BINARY | _O_NOINHERIT;
   DWORD desired_access = GENERIC_WRITE;
   DWORD share_mode = FILE_SHARE_READ | FILE_SHARE_WRITE;
   DWORD creation_disposition = OPEN_ALWAYS;
 
-  if (append) {
-    oflag |= _O_APPEND;
-  }
-
   if (truncate) {
-    oflag |= _O_TRUNC;
     creation_disposition = CREATE_ALWAYS;
   }
 
-  if (write_only) {
-    oflag |= _O_WRONLY;
-  } else {
-    oflag |= _O_RDWR;
+  if (!write_only) {
     desired_access |= GENERIC_READ;
   }
 
@@ -1542,7 +1533,7 @@ static inline int64_t pread_compat(int fd, void* buf, int64_t nbytes, int64_t po
 #if defined(_WIN32)
   HANDLE handle = reinterpret_cast<HANDLE>(_get_osfhandle(fd));
   DWORD dwBytesRead = 0;
-  OVERLAPPED overlapped = {0};
+  OVERLAPPED overlapped = {};
   overlapped.Offset = static_cast<uint32_t>(pos);
   overlapped.OffsetHigh = static_cast<uint32_t>(pos >> 32);
 
diff --git a/cpp/src/arrow/util/visibility.h b/cpp/src/arrow/util/visibility.h
index 6ab5290ef3..b0fd790295 100644
--- a/cpp/src/arrow/util/visibility.h
+++ b/cpp/src/arrow/util/visibility.h
@@ -26,7 +26,7 @@
 #pragma GCC diagnostic ignored "-Wattributes"
 #endif
 
-#if defined(__cplusplus) && (defined(__GNUC__) || defined(__clang__))
+#if defined(__cplusplus) && defined(__GNUC__) && !defined(__clang__)
 // Use C++ attribute syntax where possible to avoid GCC parser bug
 // (https://stackoverflow.com/questions/57993818/gcc-how-to-combine-attribute-dllexport-and-nodiscard-in-a-struct-de)
 #define ARROW_DLLEXPORT [[gnu::dllexport]]


[arrow] 20/27: ARROW-18285: [R] Fix for failing test after lubridate 1.9 release (#14615)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit da638b2b0554b8bbc248e2291b729c6c75cd7f24
Author: Dewey Dunnington <de...@fishandwhistle.net>
AuthorDate: Thu Nov 10 13:26:29 2022 -0400

    ARROW-18285: [R] Fix for failing test after lubridate 1.9 release (#14615)
    
    This fixes test failures caused by the lubridate 1.9 update, which removed some functionality that we support (apparently `lubridate::yq(2021.1)` used to work). It still works in Arrow because we cast to `string()` before doing any further processing.
    
    Authored-by: Dewey Dunnington <de...@fishandwhistle.net>
    Signed-off-by: Dewey Dunnington <de...@fishandwhistle.net>
---
 r/tests/testthat/test-dplyr-funcs-datetime.R | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/r/tests/testthat/test-dplyr-funcs-datetime.R b/r/tests/testthat/test-dplyr-funcs-datetime.R
index 3ddc9ec3be..005922dee4 100644
--- a/r/tests/testthat/test-dplyr-funcs-datetime.R
+++ b/r/tests/testthat/test-dplyr-funcs-datetime.R
@@ -2163,12 +2163,12 @@ test_that("ym, my & yq parsers", {
     my_string = c("05-2022", "02/2022", "03.22", "12//1979", "09.88", NA),
     Ym_string = c("2022-05", "2022/02", "2022.03", "1979//12", "1988.09", NA),
     mY_string = c("05-2022", "02/2022", "03.2022", "12//1979", "09.1988", NA),
-    yq_string = c("2007.3", "1970.2", "2020.1", "2009.4", "1975.1", NA),
-    yq_numeric = c(2007.3, 1970.2, 2020.1, 2009.4, 1975.1, NA),
+    yq_string = c("2007.3", "1971.2", "2021.1", "2009.4", "1975.1", NA),
+    yq_numeric = c(2007.3, 1971.2, 2021.1, 2009.4, 1975.1, NA),
     yq_space = c("2007 3", "1970 2", "2020 1", "2009 4", "1975 1", NA),
-    qy_string = c("3.2007", "2.1970", "1.2020", "4.2009", "1.1975", NA),
-    qy_numeric = c(3.2007, 2.1970, 1.2020, 4.2009, 1.1975, NA),
-    qy_space = c("3 2007", "2 1970", "1 2020", "4 2009", "1 1975", NA)
+    qy_string = c("3.2007", "2.1971", "1.2020", "4.2009", "1.1975", NA),
+    qy_numeric = c(3.2007, 2.1971, 1.2021, 4.2009, 1.1975, NA),
+    qy_space = c("3 2007", "2 1971", "1 2021", "4 2009", "1 1975", NA)
   )
 
   # these functions' internals use some string processing which requires the


[arrow] 13/27: ARROW-18294: [Java] Fix Flight SQL JDBC PreparedStatement#executeUpdate (#14616)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 89f1b91a86583c3bcd7122a058fcb81815157599
Author: David Li <li...@gmail.com>
AuthorDate: Wed Nov 9 16:07:19 2022 -0500

    ARROW-18294: [Java] Fix Flight SQL JDBC PreparedStatement#executeUpdate (#14616)
    
    We need to implement a separate code path for executing a prepared statement that returns an update count.
    
    Authored-by: David Li <li...@gmail.com>
    Signed-off-by: David Li <li...@gmail.com>
---
 .../arrow/driver/jdbc/ArrowFlightConnection.java   |  2 +-
 .../arrow/driver/jdbc/ArrowFlightMetaImpl.java     | 78 +++++++++++++++++++---
 .../jdbc/ArrowFlightPreparedStatementTest.java     | 15 ++++-
 3 files changed, 83 insertions(+), 12 deletions(-)

diff --git a/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightConnection.java b/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightConnection.java
index d2b6e89e3f..79bc04d27f 100644
--- a/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightConnection.java
+++ b/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightConnection.java
@@ -139,7 +139,7 @@ public final class ArrowFlightConnection extends AvaticaConnection {
    *
    * @return the handler.
    */
-  ArrowFlightSqlClientHandler getClientHandler() throws SQLException {
+  ArrowFlightSqlClientHandler getClientHandler() {
     return clientHandler;
   }
 
diff --git a/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightMetaImpl.java b/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightMetaImpl.java
index cc7addc3a7..f825e7d13c 100644
--- a/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightMetaImpl.java
+++ b/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightMetaImpl.java
@@ -42,7 +42,7 @@ import org.apache.calcite.avatica.remote.TypedValue;
  * Metadata handler for Arrow Flight.
  */
 public class ArrowFlightMetaImpl extends MetaImpl {
-  private final Map<StatementHandle, PreparedStatement> statementHandlePreparedStatementMap;
+  private final Map<StatementHandleKey, PreparedStatement> statementHandlePreparedStatementMap;
 
   /**
    * Constructs a {@link MetaImpl} object specific for Arrow Flight.
@@ -67,7 +67,8 @@ public class ArrowFlightMetaImpl extends MetaImpl {
 
   @Override
   public void closeStatement(final StatementHandle statementHandle) {
-    PreparedStatement preparedStatement = statementHandlePreparedStatementMap.remove(statementHandle);
+    PreparedStatement preparedStatement =
+        statementHandlePreparedStatementMap.remove(new StatementHandleKey(statementHandle));
     // Testing if the prepared statement was created because the statement can be not created until this moment
     if (preparedStatement != null) {
       preparedStatement.close();
@@ -82,12 +83,25 @@ public class ArrowFlightMetaImpl extends MetaImpl {
   @Override
   public ExecuteResult execute(final StatementHandle statementHandle,
                                final List<TypedValue> typedValues, final long maxRowCount) {
-    // TODO Why is maxRowCount ignored?
-    Preconditions.checkNotNull(statementHandle.signature, "Signature not found.");
-    return new ExecuteResult(
-        Collections.singletonList(MetaResultSet.create(
-            statementHandle.connectionId, statementHandle.id,
-            true, statementHandle.signature, null)));
+    Preconditions.checkArgument(connection.id.equals(statementHandle.connectionId),
+        "Connection IDs are not consistent");
+    if (statementHandle.signature == null) {
+      // Update query
+      final StatementHandleKey key = new StatementHandleKey(statementHandle);
+      PreparedStatement preparedStatement = statementHandlePreparedStatementMap.get(key);
+      if (preparedStatement == null) {
+        throw new IllegalStateException("Prepared statement not found: " + statementHandle);
+      }
+      long updatedCount = preparedStatement.executeUpdate();
+      return new ExecuteResult(Collections.singletonList(MetaResultSet.count(statementHandle.connectionId,
+          statementHandle.id, updatedCount)));
+    } else {
+      // TODO Why is maxRowCount ignored?
+      return new ExecuteResult(
+          Collections.singletonList(MetaResultSet.create(
+              statementHandle.connectionId, statementHandle.id,
+              true, statementHandle.signature, null)));
+    }
   }
 
   @Override
@@ -121,6 +135,9 @@ public class ArrowFlightMetaImpl extends MetaImpl {
                                  final String query, final long maxRowCount) {
     final StatementHandle handle = super.createStatement(connectionHandle);
     handle.signature = newSignature(query);
+    final PreparedStatement preparedStatement =
+        ((ArrowFlightConnection) connection).getClientHandler().prepare(query);
+    statementHandlePreparedStatementMap.put(new StatementHandleKey(handle), preparedStatement);
     return handle;
   }
 
@@ -143,7 +160,7 @@ public class ArrowFlightMetaImpl extends MetaImpl {
       final PreparedStatement preparedStatement =
           ((ArrowFlightConnection) connection).getClientHandler().prepare(query);
       final StatementType statementType = preparedStatement.getType();
-      statementHandlePreparedStatementMap.put(handle, preparedStatement);
+      statementHandlePreparedStatementMap.put(new StatementHandleKey(handle), preparedStatement);
       final Signature signature = newSignature(query);
       final long updateCount =
           statementType.equals(StatementType.UPDATE) ? preparedStatement.executeUpdate() : -1;
@@ -195,6 +212,47 @@ public class ArrowFlightMetaImpl extends MetaImpl {
   }
 
   PreparedStatement getPreparedStatement(StatementHandle statementHandle) {
-    return statementHandlePreparedStatementMap.get(statementHandle);
+    return statementHandlePreparedStatementMap.get(new StatementHandleKey(statementHandle));
+  }
+
+  // Helper used to look up prepared statement instances later. Avatica doesn't give us the signature in
+  // an UPDATE code path so we can't directly use StatementHandle as a map key.
+  private static final class StatementHandleKey {
+    public final String connectionId;
+    public final int id;
+
+    StatementHandleKey(String connectionId, int id) {
+      this.connectionId = connectionId;
+      this.id = id;
+    }
+
+    StatementHandleKey(StatementHandle statementHandle) {
+      this.connectionId = statementHandle.connectionId;
+      this.id = statementHandle.id;
+    }
+
+    @Override
+    public boolean equals(Object o) {
+      if (this == o) {
+        return true;
+      }
+      if (o == null || getClass() != o.getClass()) {
+        return false;
+      }
+
+      StatementHandleKey that = (StatementHandleKey) o;
+
+      if (id != that.id) {
+        return false;
+      }
+      return connectionId.equals(that.connectionId);
+    }
+
+    @Override
+    public int hashCode() {
+      int result = connectionId.hashCode();
+      result = 31 * result + id;
+      return result;
+    }
   }
 }
diff --git a/java/flight/flight-sql-jdbc-driver/src/test/java/org/apache/arrow/driver/jdbc/ArrowFlightPreparedStatementTest.java b/java/flight/flight-sql-jdbc-driver/src/test/java/org/apache/arrow/driver/jdbc/ArrowFlightPreparedStatementTest.java
index 51c491be28..8af529296f 100644
--- a/java/flight/flight-sql-jdbc-driver/src/test/java/org/apache/arrow/driver/jdbc/ArrowFlightPreparedStatementTest.java
+++ b/java/flight/flight-sql-jdbc-driver/src/test/java/org/apache/arrow/driver/jdbc/ArrowFlightPreparedStatementTest.java
@@ -18,6 +18,7 @@
 package org.apache.arrow.driver.jdbc;
 
 import static org.hamcrest.CoreMatchers.equalTo;
+import static org.junit.jupiter.api.Assertions.assertEquals;
 
 import java.sql.Connection;
 import java.sql.PreparedStatement;
@@ -25,6 +26,7 @@ import java.sql.ResultSet;
 import java.sql.SQLException;
 
 import org.apache.arrow.driver.jdbc.utils.CoreMockedSqlProducers;
+import org.apache.arrow.driver.jdbc.utils.MockFlightSqlProducer;
 import org.junit.AfterClass;
 import org.junit.BeforeClass;
 import org.junit.ClassRule;
@@ -34,9 +36,10 @@ import org.junit.rules.ErrorCollector;
 
 public class ArrowFlightPreparedStatementTest {
 
+  public static final MockFlightSqlProducer PRODUCER = CoreMockedSqlProducers.getLegacyProducer();
   @ClassRule
   public static final FlightServerTestRule FLIGHT_SERVER_TEST_RULE = FlightServerTestRule
-      .createStandardTestRule(CoreMockedSqlProducers.getLegacyProducer());
+      .createStandardTestRule(PRODUCER);
 
   private static Connection connection;
 
@@ -75,4 +78,14 @@ public class ArrowFlightPreparedStatementTest {
       collector.checkThat(6, equalTo(psmt.getMetaData().getColumnCount()));
     }
   }
+
+  @Test
+  public void testUpdateQuery() throws SQLException {
+    String query = "Fake update";
+    PRODUCER.addUpdateQuery(query, /*updatedRows*/42);
+    try (final PreparedStatement stmt = connection.prepareStatement(query)) {
+      int updated = stmt.executeUpdate();
+      assertEquals(42, updated);
+    }
+  }
 }


[arrow] 06/27: ARROW-18132: [R] Add deprecation cycle for pull() change (#14475)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 0f4ba5360da4eca99830e2b11c2ee9c549039b52
Author: Neal Richardson <ne...@gmail.com>
AuthorDate: Sat Oct 22 08:49:18 2022 -0400

    ARROW-18132: [R] Add deprecation cycle for pull() change (#14475)
    
    Authored-by: Neal Richardson <ne...@gmail.com>
    Signed-off-by: Neal Richardson <ne...@gmail.com>
---
 r/NEWS.md                           | 13 ++++++-----
 r/R/arrow-package.R                 |  8 ++++++-
 r/R/dplyr-collect.R                 | 45 ++++++++++++++++++++++++++++++++-----
 r/R/dplyr-funcs-doc.R               |  2 +-
 r/R/dplyr-group-by.R                |  5 ++++-
 r/man/acero.Rd                      |  2 +-
 r/man/cast.Rd                       |  2 +-
 r/tests/testthat/helper-arrow.R     |  4 ++++
 r/tests/testthat/test-dplyr-query.R | 17 +++++++++++---
 9 files changed, 80 insertions(+), 18 deletions(-)

diff --git a/r/NEWS.md b/r/NEWS.md
index 11d3df1e88..01ce44ce1f 100644
--- a/r/NEWS.md
+++ b/r/NEWS.md
@@ -45,11 +45,14 @@ A few new features and bugfixes were implemented for joins:
   join keys (when `keep = FALSE`), avoiding the issue where the join keys would
   be all `NA` for rows in the right hand side without any matches on the left.
 
-A few breaking changes that improve the consistency of the API:
-
-* Calling `dplyr::pull()` will return a `?ChunkedArray` instead of an R vector.
-* Calling `dplyr::compute()` on a query that is grouped
-  returns a `?Table`, instead of a query object.
+Some changes to improve the consistency of the API:
+
+* In a future release, calling `dplyr::pull()` will return a `?ChunkedArray`
+  instead of an R vector by default. The current default behavior is deprecated.
+  To update to the new behavior now, specify `pull(as_vector = FALSE)` or set
+  `options(arrow.pull_as_vector = FALSE)` globally.
+* Calling `dplyr::compute()` on a query that is grouped returns a `?Table`
+  instead of a query object.
 
 Finally, long-running queries can now be cancelled and will abort their
 computation immediately.
diff --git a/r/R/arrow-package.R b/r/R/arrow-package.R
index 1ab4e41a7a..aca593551f 100644
--- a/r/R/arrow-package.R
+++ b/r/R/arrow-package.R
@@ -54,7 +54,13 @@ supported_dplyr_methods <- list(
   transmute = NULL,
   arrange = NULL,
   rename = NULL,
-  pull = "returns an Arrow [ChunkedArray], not an R vector",
+  pull = c(
+    "the `name` argument is not supported;",
+    "returns an R vector by default but this behavior is deprecated and will",
+    "return an Arrow [ChunkedArray] in a future release. Provide",
+    "`as_vector = TRUE/FALSE` to control this behavior, or set",
+    "`options(arrow.pull_as_vector)` globally."
+  ),
   relocate = NULL,
   compute = NULL,
   collapse = NULL,
diff --git a/r/R/dplyr-collect.R b/r/R/dplyr-collect.R
index 4f8ffc7c1a..8bf22728d6 100644
--- a/r/R/dplyr-collect.R
+++ b/r/R/dplyr-collect.R
@@ -46,16 +46,51 @@ compute.arrow_dplyr_query <- function(x, ...) dplyr::collect(x, as_data_frame =
 compute.ArrowTabular <- function(x, ...) x
 compute.Dataset <- compute.RecordBatchReader <- compute.arrow_dplyr_query
 
-pull.arrow_dplyr_query <- function(.data, var = -1) {
+pull.Dataset <- function(.data,
+                         var = -1,
+                         ...,
+                         as_vector = getOption("arrow.pull_as_vector")) {
   .data <- as_adq(.data)
   var <- vars_pull(names(.data), !!enquo(var))
   .data$selected_columns <- set_names(.data$selected_columns[var], var)
-  dplyr::compute(.data)[[1]]
+  out <- dplyr::compute(.data)[[1]]
+  handle_pull_as_vector(out, as_vector)
+}
+pull.RecordBatchReader <- pull.arrow_dplyr_query <- pull.Dataset
+
+pull.ArrowTabular <- function(x,
+                              var = -1,
+                              ...,
+                              as_vector = getOption("arrow.pull_as_vector")) {
+  out <- x[[vars_pull(names(x), !!enquo(var))]]
+  handle_pull_as_vector(out, as_vector)
 }
-pull.Dataset <- pull.RecordBatchReader <- pull.arrow_dplyr_query
 
-pull.ArrowTabular <- function(x, var = -1) {
-  x[[vars_pull(names(x), !!enquo(var))]]
+handle_pull_as_vector <- function(out, as_vector) {
+  if (is.null(as_vector)) {
+    warn(
+      c(
+        paste(
+          "Default behavior of `pull()` on Arrow data is changing. Current",
+          "behavior of returning an R vector is deprecated, and in a future",
+          "release, it will return an Arrow `ChunkedArray`. To control this:"
+        ),
+        i = paste(
+          "Specify `as_vector = TRUE` (the current default) or",
+          "`FALSE` (what it will change to) in `pull()`"
+        ),
+        i = "Or, set `options(arrow.pull_as_vector)` globally"
+      ),
+      .frequency = "regularly",
+      .frequency_id = "arrow.pull_as_vector",
+      class = "lifecycle_warning_deprecated"
+    )
+    as_vector <- TRUE
+  }
+  if (as_vector) {
+    out <- as.vector(out)
+  }
+  out
 }
 
 restore_dplyr_features <- function(df, query) {
diff --git a/r/R/dplyr-funcs-doc.R b/r/R/dplyr-funcs-doc.R
index eb0f582201..b8337e3069 100644
--- a/r/R/dplyr-funcs-doc.R
+++ b/r/R/dplyr-funcs-doc.R
@@ -54,7 +54,7 @@
 #' * [`inner_join()`][dplyr::inner_join()]: the `copy` and `na_matches` arguments are ignored
 #' * [`left_join()`][dplyr::left_join()]: the `copy` and `na_matches` arguments are ignored
 #' * [`mutate()`][dplyr::mutate()]: window functions (e.g. things that require aggregation within groups) not currently supported
-#' * [`pull()`][dplyr::pull()]: returns an Arrow [ChunkedArray], not an R vector
+#' * [`pull()`][dplyr::pull()]: the `name` argument is not supported; returns an R vector by default but this behavior is deprecated and will return an Arrow [ChunkedArray] in a future release. Provide `as_vector = TRUE/FALSE` to control this behavior, or set `options(arrow.pull_as_vector)` globally.
 #' * [`relocate()`][dplyr::relocate()]
 #' * [`rename()`][dplyr::rename()]
 #' * [`rename_with()`][dplyr::rename_with()]
diff --git a/r/R/dplyr-group-by.R b/r/R/dplyr-group-by.R
index 57cf417c9a..85825b9bf2 100644
--- a/r/R/dplyr-group-by.R
+++ b/r/R/dplyr-group-by.R
@@ -25,7 +25,10 @@ group_by.arrow_dplyr_query <- function(.data,
                                        .drop = dplyr::group_by_drop_default(.data)) {
   if (!missing(add)) {
     .Deprecated(
-      msg = paste("The `add` argument of `group_by()` is deprecated. Please use the `.add` argument instead.")
+      msg = paste(
+        "The `add` argument of `group_by()` is deprecated.",
+        "Please use the `.add` argument instead."
+      )
     )
     .add <- add
   }
diff --git a/r/man/acero.Rd b/r/man/acero.Rd
index d340c2cbd8..84adf081de 100644
--- a/r/man/acero.Rd
+++ b/r/man/acero.Rd
@@ -38,7 +38,7 @@ Table into an R \code{data.frame}.
 \item \code{\link[dplyr:mutate-joins]{inner_join()}}: the \code{copy} and \code{na_matches} arguments are ignored
 \item \code{\link[dplyr:mutate-joins]{left_join()}}: the \code{copy} and \code{na_matches} arguments are ignored
 \item \code{\link[dplyr:mutate]{mutate()}}: window functions (e.g. things that require aggregation within groups) not currently supported
-\item \code{\link[dplyr:pull]{pull()}}: returns an Arrow \link{ChunkedArray}, not an R vector
+\item \code{\link[dplyr:pull]{pull()}}: the \code{name} argument is not supported; returns an R vector by default but this behavior is deprecated and will return an Arrow \link{ChunkedArray} in a future release. Provide \code{as_vector = TRUE/FALSE} to control this behavior, or set \code{options(arrow.pull_as_vector)} globally.
 \item \code{\link[dplyr:relocate]{relocate()}}
 \item \code{\link[dplyr:rename]{rename()}}
 \item \code{\link[dplyr:rename]{rename_with()}}
diff --git a/r/man/cast.Rd b/r/man/cast.Rd
index 6d87958376..81e729c704 100644
--- a/r/man/cast.Rd
+++ b/r/man/cast.Rd
@@ -34,7 +34,7 @@ mtcars \%>\%
 \seealso{
 \code{\link{data-type}} for a list of \link{DataType} to be used with \code{to}.
 
-\href{https://arrow.apache.org/docs/cpp/api/compute.html?highlight=castoptions#arrow\%3A\%3Acompute\%3A\%3ACastOptions}{Arrow C++ CastOptions documentation}
+\href{https://arrow.apache.org/docs/cpp/api/compute.html?highlight=castoptions#arrow\%3A\%3Acompute\%3A\%3ACastOptions}{Arrow C++ CastOptions documentation} # nolint
 for the list of supported CastOptions.
 }
 \keyword{internal}
diff --git a/r/tests/testthat/helper-arrow.R b/r/tests/testthat/helper-arrow.R
index d705a8029c..6812a3eec0 100644
--- a/r/tests/testthat/helper-arrow.R
+++ b/r/tests/testthat/helper-arrow.R
@@ -29,6 +29,10 @@ Sys.setlocale("LC_COLLATE", "C")
 # (R CMD check does this, but in case you're running outside of check)
 Sys.setenv(LANGUAGE = "en")
 
+# Set this option so that the deprecation warning isn't shown
+# (except when we test for it)
+options(arrow.pull_as_vector = FALSE)
+
 with_language <- function(lang, expr) {
   old <- Sys.getenv("LANGUAGE")
   # Check what this message is before changing languages; this will
diff --git a/r/tests/testthat/test-dplyr-query.R b/r/tests/testthat/test-dplyr-query.R
index db9a3bb30d..ef9a9bcdc1 100644
--- a/r/tests/testthat/test-dplyr-query.R
+++ b/r/tests/testthat/test-dplyr-query.R
@@ -91,6 +91,17 @@ test_that("pull", {
   )
 })
 
+test_that("pull() shows a deprecation warning if the option isn't set", {
+  expect_warning(
+    vec <- tbl %>%
+      arrow_table() %>%
+      pull(as_vector = NULL),
+    "Current behavior of returning an R vector is deprecated"
+  )
+  # And the default is the old behavior, an R vector
+  expect_identical(vec, pull(tbl))
+})
+
 test_that("collect(as_data_frame=FALSE)", {
   batch <- record_batch(tbl)
 
@@ -583,9 +594,9 @@ test_that("needs_projection unit tests", {
 
 test_that("compute() on a grouped query returns a Table with groups in metadata", {
   tab1 <- tbl %>%
-      arrow_table() %>%
-      group_by(int) %>%
-      compute()
+    arrow_table() %>%
+    group_by(int) %>%
+    compute()
   expect_r6_class(tab1, "Table")
   expect_equal(
     as.data.frame(tab1),


[arrow] 26/27: ARROW-18327: [CI][Release] verify-rc-source-*-macos-amd64 are failed (#14640)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit f15f771e277dc168fa9230aa662d1625463e12b8
Author: Sutou Kouhei <ko...@clear-code.com>
AuthorDate: Tue Nov 15 12:08:07 2022 +0900

    ARROW-18327: [CI][Release] verify-rc-source-*-macos-amd64 are failed (#14640)
    
    https://github.com/ursacomputing/crossbow/actions/runs/3461688619/jobs/5779609972#step:3:127
    
        ==> Pouring python@3.11--3.11.0.big_sur.bottle.tar.gz
        Error: The `brew link` step did not complete successfully
        The formula built, but is not symlinked into /usr/local
        Could not symlink bin/2to3-3.11
        Target /usr/local/bin/2to3-3.11
        already exists. You may want to remove it:
          rm '/usr/local/bin/2to3-3.11'
    
        To force the link and overwrite all conflicting files:
          brew link --overwrite python@3.11
    
        To list all files that would be deleted:
          brew link --overwrite --dry-run python@3.11
    
        Possible conflicting files are:
        /usr/local/bin/2to3-3.11 -> /Library/Frameworks/Python.framework/Versions/3.11/bin/2to3-3.11
        /usr/local/bin/idle3.11 -> /Library/Frameworks/Python.framework/Versions/3.11/bin/idle3.11
        /usr/local/bin/pydoc3.11 -> /Library/Frameworks/Python.framework/Versions/3.11/bin/pydoc3.11
        /usr/local/bin/python3.11 -> /Library/Frameworks/Python.framework/Versions/3.11/bin/python3.11
        /usr/local/bin/python3.11-config -> /Library/Frameworks/Python.framework/Versions/3.11/bin/python3.11-config
    
    Authored-by: Sutou Kouhei <ko...@clear-code.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 dev/tasks/verify-rc/github.macos.amd64.yml | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/dev/tasks/verify-rc/github.macos.amd64.yml b/dev/tasks/verify-rc/github.macos.amd64.yml
index cdd6170289..7205b33371 100644
--- a/dev/tasks/verify-rc/github.macos.amd64.yml
+++ b/dev/tasks/verify-rc/github.macos.amd64.yml
@@ -43,6 +43,10 @@ jobs:
       - name: Install System Dependencies
         shell: bash
         run: |
+          rm -f /usr/local/bin/2to3*
+          rm -f /usr/local/bin/idle*
+          rm -f /usr/local/bin/pydoc3*
+          rm -f /usr/local/bin/python3*
           brew update
           brew install --overwrite git
           brew bundle --file=arrow/cpp/Brewfile


[arrow] 18/27: ARROW-18255: [C++] Don't fail cmake check for armv6 (#14611)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 73da8d3b73a0a8caf6640ae360bc2e72bd0286a0
Author: Yibo Cai <yi...@arm.com>
AuthorDate: Thu Nov 10 08:06:56 2022 +0800

    ARROW-18255: [C++] Don't fail cmake check for armv6 (#14611)
    
    Authored-by: Yibo Cai <yi...@arm.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 cpp/cmake_modules/SetupCxxFlags.cmake | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/cpp/cmake_modules/SetupCxxFlags.cmake b/cpp/cmake_modules/SetupCxxFlags.cmake
index 88d2f0fdae..4fbbdfcdbe 100644
--- a/cpp/cmake_modules/SetupCxxFlags.cmake
+++ b/cpp/cmake_modules/SetupCxxFlags.cmake
@@ -27,9 +27,9 @@ if(NOT DEFINED ARROW_CPU_FLAG)
   if(CMAKE_SYSTEM_PROCESSOR MATCHES "AMD64|X86|x86|i[3456]86|x64")
     set(ARROW_CPU_FLAG "x86")
   elseif(CMAKE_SYSTEM_PROCESSOR MATCHES "aarch64|ARM64|arm64")
-    set(ARROW_CPU_FLAG "armv8")
-  elseif(CMAKE_SYSTEM_PROCESSOR MATCHES "armv7")
-    set(ARROW_CPU_FLAG "armv7")
+    set(ARROW_CPU_FLAG "aarch64")
+  elseif(CMAKE_SYSTEM_PROCESSOR MATCHES "^arm$|armv[4-7]")
+    set(ARROW_CPU_FLAG "aarch32")
   elseif(CMAKE_SYSTEM_PROCESSOR MATCHES "powerpc|ppc")
     set(ARROW_CPU_FLAG "ppc")
   elseif(CMAKE_SYSTEM_PROCESSOR MATCHES "s390x")
@@ -108,7 +108,7 @@ elseif(ARROW_CPU_FLAG STREQUAL "ppc")
   if(ARROW_SIMD_LEVEL STREQUAL "DEFAULT")
     set(ARROW_SIMD_LEVEL "NONE")
   endif()
-elseif(ARROW_CPU_FLAG STREQUAL "armv8")
+elseif(ARROW_CPU_FLAG STREQUAL "aarch64")
   # Arm64 compiler flags, gcc/clang only
   set(ARROW_ARMV8_MARCH "armv8-a")
   check_cxx_compiler_flag("-march=${ARROW_ARMV8_MARCH}+sve" CXX_SUPPORTS_SVE)
@@ -484,7 +484,7 @@ if(ARROW_CPU_FLAG STREQUAL "ppc")
   endif()
 endif()
 
-if(ARROW_CPU_FLAG STREQUAL "armv8")
+if(ARROW_CPU_FLAG STREQUAL "aarch64")
   if(ARROW_SIMD_LEVEL MATCHES "NEON|SVE[0-9]*")
     set(ARROW_HAVE_NEON ON)
     add_definitions(-DARROW_HAVE_NEON)


[arrow] 11/27: ARROW-17487: [Python][Packaging][CI] Add support for Python 3.11 (#14499)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 60bbf14285a45b38f81ecd79769dbc6cf64d453e
Author: Raúl Cumplido <ra...@gmail.com>
AuthorDate: Fri Nov 4 21:32:09 2022 +0100

    ARROW-17487: [Python][Packaging][CI] Add support for Python 3.11 (#14499)
    
    This PR adds jobs to build pyarrow wheels for Python 3.11.
    
    Authored-by: Raúl Cumplido <ra...@gmail.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 .../python-wheel-windows-test-vs2017.dockerfile    |  5 +-
 ci/docker/python-wheel-windows-vs2017.dockerfile   |  5 +-
 ci/scripts/install_gcs_testbench.sh                | 11 +----
 ci/scripts/install_python.sh                       |  7 +--
 ci/scripts/python_wheel_macos_build.sh             |  4 +-
 dev/release/verify-release-candidate.sh            | 26 ++---------
 dev/tasks/python-wheels/github.osx.arm64.yml       | 54 ----------------------
 dev/tasks/tasks.yml                                | 17 ++-----
 docker-compose.yml                                 |  2 +-
 python/pyarrow/_fs.pyx                             | 11 ++++-
 python/requirements-wheel-test.txt                 | 15 ++++--
 python/setup.py                                    |  1 +
 12 files changed, 40 insertions(+), 118 deletions(-)

diff --git a/ci/docker/python-wheel-windows-test-vs2017.dockerfile b/ci/docker/python-wheel-windows-test-vs2017.dockerfile
index 6013efcd46..a4c836ef4f 100644
--- a/ci/docker/python-wheel-windows-test-vs2017.dockerfile
+++ b/ci/docker/python-wheel-windows-test-vs2017.dockerfile
@@ -37,7 +37,8 @@ RUN wmic product where "name like 'python%%'" call uninstall /nointeractive && \
 ARG python=3.8
 RUN (if "%python%"=="3.7" setx PYTHON_VERSION "3.7.9" && setx PATH "%PATH%;C:\Python37;C:\Python37\Scripts") & \
     (if "%python%"=="3.8" setx PYTHON_VERSION "3.8.10" && setx PATH "%PATH%;C:\Python38;C:\Python38\Scripts") & \
-    (if "%python%"=="3.9" setx PYTHON_VERSION "3.9.7" && setx PATH "%PATH%;C:\Python39;C:\Python39\Scripts") & \
-    (if "%python%"=="3.10" setx PYTHON_VERSION "3.10.2" && setx PATH "%PATH%;C:\Python310;C:\Python310\Scripts")
+    (if "%python%"=="3.9" setx PYTHON_VERSION "3.9.13" && setx PATH "%PATH%;C:\Python39;C:\Python39\Scripts") & \
+    (if "%python%"=="3.10" setx PYTHON_VERSION "3.10.8" && setx PATH "%PATH%;C:\Python310;C:\Python310\Scripts") & \
+    (if "%python%"=="3.11" setx PYTHON_VERSION "3.11.0" && setx PATH "%PATH%;C:\Python311;C:\Python311\Scripts")
 RUN choco install -r -y --no-progress python --version=%PYTHON_VERSION%
 RUN python -m pip install -U pip setuptools
diff --git a/ci/docker/python-wheel-windows-vs2017.dockerfile b/ci/docker/python-wheel-windows-vs2017.dockerfile
index 247f13a15c..f82a47a057 100644
--- a/ci/docker/python-wheel-windows-vs2017.dockerfile
+++ b/ci/docker/python-wheel-windows-vs2017.dockerfile
@@ -80,8 +80,9 @@ RUN wmic product where "name like 'python%%'" call uninstall /nointeractive && \
 ARG python=3.8
 RUN (if "%python%"=="3.7" setx PYTHON_VERSION "3.7.9" && setx PATH "%PATH%;C:\Python37;C:\Python37\Scripts") & \
     (if "%python%"=="3.8" setx PYTHON_VERSION "3.8.10" && setx PATH "%PATH%;C:\Python38;C:\Python38\Scripts") & \
-    (if "%python%"=="3.9" setx PYTHON_VERSION "3.9.7" && setx PATH "%PATH%;C:\Python39;C:\Python39\Scripts") & \
-    (if "%python%"=="3.10" setx PYTHON_VERSION "3.10.2" && setx PATH "%PATH%;C:\Python310;C:\Python310\Scripts")
+    (if "%python%"=="3.9" setx PYTHON_VERSION "3.9.13" && setx PATH "%PATH%;C:\Python39;C:\Python39\Scripts") & \
+    (if "%python%"=="3.10" setx PYTHON_VERSION "3.10.8" && setx PATH "%PATH%;C:\Python310;C:\Python310\Scripts") & \
+    (if "%python%"=="3.11" setx PYTHON_VERSION "3.11.0" && setx PATH "%PATH%;C:\Python311;C:\Python311\Scripts")
 RUN choco install -r -y --no-progress python --version=%PYTHON_VERSION%
 RUN python -m pip install -U pip setuptools
 
diff --git a/ci/scripts/install_gcs_testbench.sh b/ci/scripts/install_gcs_testbench.sh
index 9a788fdfd4..0109ea607f 100755
--- a/ci/scripts/install_gcs_testbench.sh
+++ b/ci/scripts/install_gcs_testbench.sh
@@ -34,18 +34,9 @@ case "$(uname -m)" in
     ;;
 esac
 
-case "$(uname -s)-$(uname -m)" in
-  Darwin-arm64)
-    # Workaround for https://github.com/grpc/grpc/issues/28387 .
-    # Build grpcio instead of using wheel.
-    # storage-testbench 0.27.0 pins grpcio to 1.46.1.
-    ${PYTHON:-python3} -m pip install --no-binary :all: "grpcio==1.46.1"
-    ;;
-esac
-
 version=$1
 if [[ "${version}" -eq "default" ]]; then
-  version="v0.27.0"
+  version="v0.32.0"
 fi
 
 ${PYTHON:-python3} -m pip install \
diff --git a/ci/scripts/install_python.sh b/ci/scripts/install_python.sh
index 7a18cd8324..d64318751c 100755
--- a/ci/scripts/install_python.sh
+++ b/ci/scripts/install_python.sh
@@ -27,8 +27,9 @@ platforms=([windows]=Windows
 declare -A versions
 versions=([3.7]=3.7.9
           [3.8]=3.8.10
-          [3.9]=3.9.9
-          [3.10]=3.10.1)
+          [3.9]=3.9.13
+          [3.10]=3.10.8
+          [3.11]=3.11.0)
 
 if [ "$#" -ne 2 ]; then
   echo "Usage: $0 <platform> <version>"
@@ -45,7 +46,7 @@ full_version=${versions[$2]}
 if [ $platform = "MacOSX" ]; then
     echo "Downloading Python installer..."
 
-    if [ "$(uname -m)" = "arm64" ] || [ "$version" = "3.10" ]; then
+    if [ "$(uname -m)" = "arm64" ] || [ "$version" = "3.10" ] || [ "$version" = "3.11" ]; then
         fname="python-${full_version}-macos11.pkg"
     else
         fname="python-${full_version}-macosx10.9.pkg"
diff --git a/ci/scripts/python_wheel_macos_build.sh b/ci/scripts/python_wheel_macos_build.sh
index fd24c58d63..7c7ef7745c 100755
--- a/ci/scripts/python_wheel_macos_build.sh
+++ b/ci/scripts/python_wheel_macos_build.sh
@@ -41,8 +41,6 @@ if [ $arch = "arm64" ]; then
   export CMAKE_OSX_ARCHITECTURES="arm64"
 elif [ $arch = "x86_64" ]; then
   export CMAKE_OSX_ARCHITECTURES="x86_64"
-elif [ $arch = "universal2" ]; then
-  export CMAKE_OSX_ARCHITECTURES="x86_64;arm64"
 else
   echo "Unexpected architecture: $arch"
   exit 1
@@ -58,7 +56,7 @@ pip install \
   --target $PIP_SITE_PACKAGES \
   --platform $PIP_TARGET_PLATFORM \
   -r ${source_dir}/python/requirements-wheel-build.txt
-pip install "delocate>=0.9"
+pip install "delocate>=0.10.3"
 
 echo "=== (${PYTHON_VERSION}) Building Arrow C++ libraries ==="
 : ${ARROW_DATASET:=ON}
diff --git a/dev/release/verify-release-candidate.sh b/dev/release/verify-release-candidate.sh
index f874500c8f..bcbde89893 100755
--- a/dev/release/verify-release-candidate.sh
+++ b/dev/release/verify-release-candidate.sh
@@ -1014,7 +1014,7 @@ test_linux_wheels() {
     local arch="x86_64"
   fi
 
-  local python_versions="${TEST_PYTHON_VERSIONS:-3.7m 3.8 3.9 3.10}"
+  local python_versions="${TEST_PYTHON_VERSIONS:-3.7m 3.8 3.9 3.10 3.11}"
   local platform_tags="manylinux_2_17_${arch}.manylinux2014_${arch}"
 
   for python in ${python_versions}; do
@@ -1036,11 +1036,11 @@ test_macos_wheels() {
 
   # apple silicon processor
   if [ "$(uname -m)" = "arm64" ]; then
-    local python_versions="3.8 3.9 3.10"
+    local python_versions="3.8 3.9 3.10 3.11"
     local platform_tags="macosx_11_0_arm64"
     local check_flight=OFF
   else
-    local python_versions="3.7m 3.8 3.9 3.10"
+    local python_versions="3.7m 3.8 3.9 3.10 3.11"
     local platform_tags="macosx_10_14_x86_64"
   fi
 
@@ -1062,26 +1062,6 @@ test_macos_wheels() {
         ${ARROW_DIR}/ci/scripts/python_wheel_unix_test.sh ${ARROW_SOURCE_DIR}
     done
   done
-
-  # verify arm64 and universal2 wheels using an universal2 python binary
-  # the interpreter should be installed from python.org:
-  #   https://www.python.org/ftp/python/3.9.6/python-3.9.6-macosx10.9.pkg
-  if [ "$(uname -m)" = "arm64" ]; then
-    for pyver in 3.9 3.10; do
-      local python="/Library/Frameworks/Python.framework/Versions/${pyver}/bin/python${pyver}"
-
-      # create and activate a virtualenv for testing as arm64
-      for arch in "arm64" "x86_64"; do
-        show_header "Testing Python ${pyver} universal2 wheel on ${arch}"
-        VENV_ENV=wheel-${pyver}-universal2-${arch} PYTHON=${python} maybe_setup_virtualenv || continue
-        # install pyarrow's universal2 wheel
-        pip install pyarrow-${VERSION}-cp${pyver/.}-cp${pyver/.}-macosx_11_0_universal2.whl
-        # check the imports and execute the unittests
-        INSTALL_PYARROW=OFF ARROW_FLIGHT=${check_flight} \
-          arch -${arch} ${ARROW_DIR}/ci/scripts/python_wheel_unix_test.sh ${ARROW_SOURCE_DIR}
-      done
-    done
-  fi
 }
 
 test_wheels() {
diff --git a/dev/tasks/python-wheels/github.osx.arm64.yml b/dev/tasks/python-wheels/github.osx.arm64.yml
index a555b28713..2c796d162d 100644
--- a/dev/tasks/python-wheels/github.osx.arm64.yml
+++ b/dev/tasks/python-wheels/github.osx.arm64.yml
@@ -87,46 +87,6 @@ jobs:
           pip install --upgrade pip wheel
           arrow/ci/scripts/python_wheel_macos_build.sh arm64 $(pwd)/arrow $(pwd)/build
 
-      {% if arch == "universal2" %}
-      - name: Install AMD64 Packages
-        env:
-          VCPKG_DEFAULT_TRIPLET: amd64-osx-static-release
-        run: |
-          vcpkg install \
-            --clean-after-build \
-            --x-install-root=${VCPKG_ROOT}/installed \
-            --x-manifest-root=arrow/ci/vcpkg \
-            --x-feature=flight \
-            --x-feature=gcs \
-            --x-feature=json \
-            --x-feature=parquet \
-            --x-feature=s3
-
-      - name: Build AMD64 Wheel
-        env:
-          ARROW_SIMD_LEVEL: "NONE"
-          VCPKG_DEFAULT_TRIPLET: amd64-osx-static-release
-        run: |
-          $PYTHON -m venv build-amd64-env
-          source build-amd64-env/bin/activate
-          pip install --upgrade pip wheel
-          arch -x86_64 arrow/ci/scripts/python_wheel_macos_build.sh x86_64 $(pwd)/arrow $(pwd)/build
-
-      - name: Fuse AMD64 and ARM64 wheels
-        run: |
-          source build-amd64-env/bin/activate
-          pip install delocate
-
-          amd64_wheel=$(ls arrow/python/repaired_wheels/pyarrow*x86_64.whl)
-          arm64_wheel=$(ls arrow/python/repaired_wheels/pyarrow*arm64.whl)
-          echo "Fusing ${amd64_wheel} and ${arm64_wheel} into an universal2 wheel..."
-          delocate-fuse $amd64_wheel $arm64_wheel -w .
-
-          fused_wheel=$(ls *x86_64.whl)
-          rm arrow/python/repaired_wheels/*.whl
-          mv $fused_wheel arrow/python/repaired_wheels/${fused_wheel/x86_64/universal2}
-      {% endif %}
-
       - uses: actions/upload-artifact@v3
         with:
           name: wheel
@@ -148,20 +108,6 @@ jobs:
           PYTHON=python  arrow/ci/scripts/install_gcs_testbench.sh default
            arrow/ci/scripts/python_wheel_unix_test.sh $(pwd)/arrow
 
-      {% if arch == "universal2" %}
-      - name: Test Wheel on AMD64
-        shell: bash
-        env:
-          PYTEST_ADDOPTS: "-k 'not test_cancellation'"
-        run: |
-          $PYTHON -m venv test-amd64-env
-          source test-amd64-env/bin/activate
-          pip install --upgrade pip wheel
-          arch -x86_64 pip install -r arrow/python/requirements-wheel-test.txt
-          PYTHON=python arch -x86_64 arrow/ci/scripts/install_gcs_testbench.sh default
-          arch -x86_64 arrow/ci/scripts/python_wheel_unix_test.sh $(pwd)/arrow
-      {% endif %}
-
       - name: Upload artifacts
         shell: bash
         run: |
diff --git a/dev/tasks/tasks.yml b/dev/tasks/tasks.yml
index bdf53ff1da..57878227b2 100644
--- a/dev/tasks/tasks.yml
+++ b/dev/tasks/tasks.yml
@@ -458,7 +458,8 @@ tasks:
 {% for python_version, python_tag, abi_tag in [("3.7", "cp37", "cp37m"),
                                                ("3.8", "cp38", "cp38"),
                                                ("3.9", "cp39", "cp39"),
-                                               ("3.10", "cp310", "cp310")] %}
+                                               ("3.10", "cp310", "cp310"),
+                                               ("3.11", "cp311", "cp311")] %}
 
 {############################## Wheel Linux ##################################}
 
@@ -519,7 +520,7 @@ tasks:
     artifacts:
       - pyarrow-{no_rc_version}-cp38-cp38-macosx_11_0_arm64.whl
 
-{% for python_version, python_tag in [("3.9", "cp39"), ("3.10", "cp310")] %}
+{% for python_version, python_tag in [("3.9", "cp39"), ("3.10", "cp310"), ("3.11", "cp311")] %}
   wheel-macos-big-sur-{{ python_tag }}-arm64:
     ci: github
     template: python-wheels/github.osx.arm64.yml
@@ -529,16 +530,6 @@ tasks:
       macos_deployment_target: "11.0"
     artifacts:
       - pyarrow-{no_rc_version}-{{ python_tag }}-{{ python_tag }}-macosx_11_0_arm64.whl
-
-  wheel-macos-big-sur-{{ python_tag }}-universal2:
-    ci: github
-    template: python-wheels/github.osx.arm64.yml
-    params:
-      arch: universal2
-      python_version: "{{ python_version }}"
-      macos_deployment_target: "10.14"
-    artifacts:
-      - pyarrow-{no_rc_version}-{{ python_tag }}-{{ python_tag }}-macosx_10_14_universal2.whl
 {% endfor %}
 
 {############################ Python sdist ####################################}
@@ -1224,7 +1215,7 @@ tasks:
         UBUNTU: 20.04
       image: ubuntu-cpp-thread-sanitizer
 
-{% for python_version in ["3.7", "3.8", "3.9", "3.10"] %}
+{% for python_version in ["3.7", "3.8", "3.9", "3.10", "3.11"] %}
   test-conda-python-{{ python_version }}:
     ci: github
     template: docker-tests/github.linux.yml
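The {% for %} blocks edited above are Jinja-style template loops, so adding the ("3.11", "cp311", "cp311") tuple (and "3.11" to the test-conda list) is all it takes to generate a full set of 3.11 wheel and test tasks when tasks.yml is rendered. A toy sketch of that expansion, assuming the jinja2 package and using a made-up task name purely for illustration:

from jinja2 import Template

tmpl = Template(
    '{% for python_version, python_tag, abi_tag in '
    '[("3.10", "cp310", "cp310"), ("3.11", "cp311", "cp311")] %}'
    'wheel-linux-{{ python_tag }}\n'
    '{% endfor %}'
)
# Renders one hypothetical task name per loop entry:
# wheel-linux-cp310
# wheel-linux-cp311
print(tmpl.render())
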
diff --git a/docker-compose.yml b/docker-compose.yml
index 57cd03d11d..313a1623ba 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -918,7 +918,7 @@ services:
       args:
         arch: ${ARCH}
         arch_short: ${ARCH_SHORT}
-        base: quay.io/pypa/manylinux2014_${ARCH_ALIAS}:2021-10-11-14ac00e
+        base: quay.io/pypa/manylinux2014_${ARCH_ALIAS}:2022-10-25-fbea779
         vcpkg: ${VCPKG}
         python: ${PYTHON}
         manylinux: 2014
diff --git a/python/pyarrow/_fs.pyx b/python/pyarrow/_fs.pyx
index e7b028a07d..557c081493 100644
--- a/python/pyarrow/_fs.pyx
+++ b/python/pyarrow/_fs.pyx
@@ -78,6 +78,12 @@ cdef CFileType _unwrap_file_type(FileType ty) except *:
     assert 0
 
 
+def _file_type_to_string(ty):
+    # Python 3.11 changed str(IntEnum) to return the string representation
+    # of the integer value: https://github.com/python/cpython/issues/94763
+    return f"{ty.__class__.__name__}.{ty._name_}"
+
+
 cdef class FileInfo(_Weakrefable):
     """
     FileSystem entry info.
@@ -185,9 +191,10 @@ cdef class FileInfo(_Weakrefable):
             except ValueError:
                 return ''
 
-        s = '<FileInfo for {!r}: type={}'.format(self.path, str(self.type))
+        s = (f'<FileInfo for {self.path!r}: '
+             f'type={_file_type_to_string(self.type)}')
         if self.is_file:
-            s += ', size={}'.format(self.size)
+            s += f', size={self.size}'
         s += '>'
         return s
 
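The comment in the hunk above refers to the CPython 3.11 change where str() of an IntEnum member returns the plain integer string. A minimal, self-contained sketch of the difference and of the version-independent spelling used by the helper (FileType below is a stand-in enum, not pyarrow.fs.FileType):

import enum

class FileType(enum.IntEnum):      # stand-in for pyarrow.fs.FileType
    NotFound = 0
    File = 2

def file_type_to_string(ty):
    # Same spelling on every Python version, mirroring _file_type_to_string above.
    return f"{ty.__class__.__name__}.{ty._name_}"

print(str(FileType.File))                  # "FileType.File" on <= 3.10, "2" on 3.11+
print(file_type_to_string(FileType.File))  # "FileType.File" everywhere
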
diff --git a/python/requirements-wheel-test.txt b/python/requirements-wheel-test.txt
index 1644b2f8bc..dd07f0358d 100644
--- a/python/requirements-wheel-test.txt
+++ b/python/requirements-wheel-test.txt
@@ -8,14 +8,19 @@ pytz
 tzdata; sys_platform == 'win32'
 
 numpy==1.19.5; platform_system == "Linux"   and platform_machine == "aarch64" and python_version <  "3.7"
-numpy==1.21.3; platform_system == "Linux"   and platform_machine == "aarch64" and python_version >= "3.7"
+numpy==1.21.3; platform_system == "Linux"   and platform_machine == "aarch64" and python_version >= "3.7" and python_version < "3.11"
+numpy==1.23.4; platform_system == "Linux"   and platform_machine == "aarch64" and python_version >= "3.11"
 numpy==1.19.5; platform_system == "Linux"   and platform_machine != "aarch64" and python_version <  "3.9"
-numpy==1.21.3; platform_system == "Linux"   and platform_machine != "aarch64" and python_version >= "3.9"
-numpy==1.21.3; platform_system == "Darwin"  and platform_machine == "arm64"
+numpy==1.21.3; platform_system == "Linux"   and platform_machine != "aarch64" and python_version >= "3.9" and python_version < "3.11"
+numpy==1.23.4; platform_system == "Linux"   and platform_machine != "aarch64" and python_version >= "3.11"
+numpy==1.21.3; platform_system == "Darwin"  and platform_machine == "arm64"   and python_version <  "3.11"
+numpy==1.23.4; platform_system == "Darwin"  and platform_machine == "arm64"   and python_version >= "3.11"
 numpy==1.19.5; platform_system == "Darwin"  and platform_machine != "arm64"   and python_version <  "3.9"
-numpy==1.21.3; platform_system == "Darwin"  and platform_machine != "arm64"   and python_version >= "3.9"
+numpy==1.21.3; platform_system == "Darwin"  and platform_machine != "arm64"   and python_version >= "3.9" and python_version < "3.11"
+numpy==1.23.4; platform_system == "Darwin"  and platform_machine != "arm64"   and python_version >= "3.11"
 numpy==1.19.5; platform_system == "Windows"                                   and python_version <  "3.9"
-numpy==1.21.3; platform_system == "Windows"                                   and python_version >= "3.9"
+numpy==1.21.3; platform_system == "Windows"                                   and python_version >= "3.9" and python_version < "3.11"
+numpy==1.23.4; platform_system == "Windows"                                   and python_version >= "3.11"
 
 pandas<1.1.0;  platform_system == "Linux"   and platform_machine != "aarch64" and python_version <  "3.8"
 pandas;        platform_system == "Linux"   and platform_machine != "aarch64" and python_version >= "3.8"
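Each line above is an ordinary requirement plus a PEP 508 environment marker; the installer keeps whichever pin matches the running interpreter. A short sketch, using the packaging library (which pip vendors for marker evaluation), of how the Windows pins above would resolve for a hypothetical CPython 3.11 environment (the environment values are illustrative):

from packaging.markers import Marker

pins = {
    "numpy==1.21.3": 'platform_system == "Windows" and python_version >= "3.9" and python_version < "3.11"',
    "numpy==1.23.4": 'platform_system == "Windows" and python_version >= "3.11"',
}
env = {"platform_system": "Windows", "python_version": "3.11"}

for requirement, marker in pins.items():
    if Marker(marker).evaluate(env):
        print(requirement)   # -> numpy==1.23.4
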
diff --git a/python/setup.py b/python/setup.py
index 73bf19225a..6ca87c7940 100755
--- a/python/setup.py
+++ b/python/setup.py
@@ -768,6 +768,7 @@ setup(
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
+        'Programming Language :: Python :: 3.11',
     ],
     license='Apache License, Version 2.0',
     maintainer='Apache Arrow Developers',


[arrow] 19/27: ARROW-18296: [Java] Honor Driver#connect API contract in JDBC driver (#14617)


commit 26b0ec4db3c6f6b94e76ed4515e407e8955f1889
Author: David Li <li...@gmail.com>
AuthorDate: Wed Nov 9 19:59:17 2022 -0500

    ARROW-18296: [Java] Honor Driver#connect API contract in JDBC driver (#14617)
    
    Authored-by: David Li <li...@gmail.com>
    Signed-off-by: David Li <li...@gmail.com>
---
 .../arrow/driver/jdbc/ArrowFlightJdbcDriver.java   | 16 +++++----
 .../driver/jdbc/ArrowFlightJdbcDriverTest.java     | 41 +++++++++++-----------
 2 files changed, 30 insertions(+), 27 deletions(-)

diff --git a/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightJdbcDriver.java b/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightJdbcDriver.java
index a72fbd3a4d..aa1b460fc1 100644
--- a/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightJdbcDriver.java
+++ b/java/flight/flight-sql-jdbc-driver/src/main/java/org/apache/arrow/driver/jdbc/ArrowFlightJdbcDriver.java
@@ -29,6 +29,7 @@ import java.nio.charset.StandardCharsets;
 import java.sql.SQLException;
 import java.util.Map;
 import java.util.Objects;
+import java.util.Optional;
 import java.util.Properties;
 
 import org.apache.arrow.driver.jdbc.utils.ArrowFlightConnectionConfigImpl.ArrowFlightConnectionProperty;
@@ -72,7 +73,11 @@ public class ArrowFlightJdbcDriver extends UnregisteredDriver {
     properties.putAll(info);
 
     if (url != null) {
-      final Map<Object, Object> propertiesFromUrl = getUrlsArgs(url);
+      final Optional<Map<Object, Object>> maybeProperties = getUrlsArgs(url);
+      if (!maybeProperties.isPresent()) {
+        return null;
+      }
+      final Map<Object, Object> propertiesFromUrl = maybeProperties.get();
       properties.putAll(propertiesFromUrl);
     }
 
@@ -199,11 +204,11 @@ public class ArrowFlightJdbcDriver extends UnregisteredDriver {
    * </table>
    *
    * @param url The url to parse.
-   * @return the parsed arguments.
+   * @return the parsed arguments, or an empty optional if the driver does not handle this URL.
    * @throws SQLException If an error occurs while trying to parse the URL.
    */
   @VisibleForTesting // ArrowFlightJdbcDriverTest
-  Map<Object, Object> getUrlsArgs(String url)
+  Optional<Map<Object, Object>> getUrlsArgs(String url)
       throws SQLException {
 
     /*
@@ -240,8 +245,7 @@ public class ArrowFlightJdbcDriver extends UnregisteredDriver {
 
     if (!Objects.equals(uri.getScheme(), "arrow-flight") &&
         !Objects.equals(uri.getScheme(), "arrow-flight-sql")) {
-      throw new SQLException("URL Scheme must be 'arrow-flight'. Expected format: " +
-          CONNECTION_STRING_EXPECTED);
+      return Optional.empty();
     }
 
     if (uri.getHost() == null) {
@@ -258,7 +262,7 @@ public class ArrowFlightJdbcDriver extends UnregisteredDriver {
       resultMap.putAll(keyValuePairs);
     }
 
-    return resultMap;
+    return Optional.of(resultMap);
   }
 
   static Properties lowerCasePropertyKeys(final Properties properties) {
diff --git a/java/flight/flight-sql-jdbc-driver/src/test/java/org/apache/arrow/driver/jdbc/ArrowFlightJdbcDriverTest.java b/java/flight/flight-sql-jdbc-driver/src/test/java/org/apache/arrow/driver/jdbc/ArrowFlightJdbcDriverTest.java
index f1958b4fc8..9b8fa96d23 100644
--- a/java/flight/flight-sql-jdbc-driver/src/test/java/org/apache/arrow/driver/jdbc/ArrowFlightJdbcDriverTest.java
+++ b/java/flight/flight-sql-jdbc-driver/src/test/java/org/apache/arrow/driver/jdbc/ArrowFlightJdbcDriverTest.java
@@ -17,9 +17,11 @@
 
 package org.apache.arrow.driver.jdbc;
 
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertTrue;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertFalse;
+import static org.junit.jupiter.api.Assertions.assertNull;
 import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.junit.jupiter.api.Assertions.assertTrue;
 
 import java.sql.Connection;
 import java.sql.Driver;
@@ -91,17 +93,15 @@ public class ArrowFlightJdbcDriverTest {
   }
 
   /**
-   * Tests whether the {@link ArrowFlightJdbcDriver} fails when provided with an
+   * Tests whether the {@link ArrowFlightJdbcDriver} returns null when provided with an
    * unsupported URL prefix.
-   *
-   * @throws SQLException If the test passes.
    */
-  @Test(expected = SQLException.class)
+  @Test
   public void testShouldDeclineUrlWithUnsupportedPrefix() throws Exception {
     final Driver driver = new ArrowFlightJdbcDriver();
 
-    driver.connect("jdbc:mysql://localhost:32010", dataSource.getProperties("flight", "flight123"))
-        .close();
+    assertNull(driver.connect("jdbc:mysql://localhost:32010",
+        dataSource.getProperties("flight", "flight123")));
   }
 
   /**
@@ -263,7 +263,8 @@ public class ArrowFlightJdbcDriverTest {
     final ArrowFlightJdbcDriver driver = new ArrowFlightJdbcDriver();
 
     final Map<Object, Object> parsedArgs = driver.getUrlsArgs(
-        "jdbc:arrow-flight-sql://localhost:2222/?key1=value1&key2=value2&a=b");
+        "jdbc:arrow-flight-sql://localhost:2222/?key1=value1&key2=value2&a=b")
+        .orElseThrow(() -> new RuntimeException("URL was rejected"));
 
     // Check size == the amount of args provided (scheme not included)
     assertEquals(5, parsedArgs.size());
@@ -284,7 +285,8 @@ public class ArrowFlightJdbcDriverTest {
   public void testDriverUrlParsingMechanismShouldReturnTheDesiredArgsFromUrlWithSemicolon() throws Exception {
     final ArrowFlightJdbcDriver driver = new ArrowFlightJdbcDriver();
     final Map<Object, Object> parsedArgs = driver.getUrlsArgs(
-        "jdbc:arrow-flight-sql://localhost:2222/;key1=value1;key2=value2;a=b");
+        "jdbc:arrow-flight-sql://localhost:2222/;key1=value1;key2=value2;a=b")
+        .orElseThrow(() -> new RuntimeException("URL was rejected"));
 
     // Check size == the amount of args provided (scheme not included)
     assertEquals(5, parsedArgs.size());
@@ -305,7 +307,8 @@ public class ArrowFlightJdbcDriverTest {
   public void testDriverUrlParsingMechanismShouldReturnTheDesiredArgsFromUrlWithOneSemicolon() throws Exception {
     final ArrowFlightJdbcDriver driver = new ArrowFlightJdbcDriver();
     final Map<Object, Object> parsedArgs = driver.getUrlsArgs(
-        "jdbc:arrow-flight-sql://localhost:2222/;key1=value1");
+        "jdbc:arrow-flight-sql://localhost:2222/;key1=value1")
+        .orElseThrow(() -> new RuntimeException("URL was rejected"));
 
     // Check size == the amount of args provided (scheme not included)
     assertEquals(3, parsedArgs.size());
@@ -320,16 +323,10 @@ public class ArrowFlightJdbcDriverTest {
     assertEquals(parsedArgs.get("key1"), "value1");
   }
 
-  /**
-   * Tests whether an exception is thrown upon attempting to connect to a
-   * malformed URI.
-   *
-   */
   @Test
-  public void testDriverUrlParsingMechanismShouldThrowExceptionUponProvidedWithMalformedUrl() {
+  public void testDriverUrlParsingMechanismShouldReturnEmptyOptionalForUnknownScheme() throws SQLException {
     final ArrowFlightJdbcDriver driver = new ArrowFlightJdbcDriver();
-    assertThrows(SQLException.class, () -> driver.getUrlsArgs(
-        "jdbc:malformed-url-flight://localhost:2222"));
+    assertFalse(driver.getUrlsArgs("jdbc:malformed-url-flight://localhost:2222").isPresent());
   }
 
   /**
@@ -341,7 +338,8 @@ public class ArrowFlightJdbcDriverTest {
   @Test
   public void testDriverUrlParsingMechanismShouldWorkWithIPAddress() throws Exception {
     final ArrowFlightJdbcDriver driver = new ArrowFlightJdbcDriver();
-    final Map<Object, Object> parsedArgs = driver.getUrlsArgs("jdbc:arrow-flight-sql://0.0.0.0:2222");
+    final Map<Object, Object> parsedArgs = driver.getUrlsArgs("jdbc:arrow-flight-sql://0.0.0.0:2222")
+        .orElseThrow(() -> new RuntimeException("URL was rejected"));
 
     // Check size == the amount of args provided (scheme not included)
     assertEquals(2, parsedArgs.size());
@@ -364,7 +362,8 @@ public class ArrowFlightJdbcDriverTest {
       throws Exception {
     final ArrowFlightJdbcDriver driver = new ArrowFlightJdbcDriver();
     final Map<Object, Object> parsedArgs = driver.getUrlsArgs(
-        "jdbc:arrow-flight-sql://0.0.0.0:2222?test1=test1value&test2%26continue=test2value&test3=test3value");
+        "jdbc:arrow-flight-sql://0.0.0.0:2222?test1=test1value&test2%26continue=test2value&test3=test3value")
+        .orElseThrow(() -> new RuntimeException("URL was rejected"));
 
     // Check size == the amount of args provided (scheme not included)
     assertEquals(5, parsedArgs.size());


[arrow] 25/27: ARROW-18325: [Docs][Python] Add Python 3.11 to python/install.rst (#14630)


commit 74c6a3589bfea36856bcfdf44c89d07f514f66d8
Author: Alenka Frim <Al...@users.noreply.github.com>
AuthorDate: Tue Nov 15 01:02:05 2022 +0100

    ARROW-18325: [Docs][Python] Add Python 3.11 to python/install.rst (#14630)
    
    Add Python 3.11 to the Python Compatibility section in the PyArrow install documentation.
    
    Authored-by: Alenka Frim <fr...@gmail.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 docs/source/python/install.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/python/install.rst b/docs/source/python/install.rst
index f884a9cc94..ec71388152 100644
--- a/docs/source/python/install.rst
+++ b/docs/source/python/install.rst
@@ -28,7 +28,7 @@ using a 64-bit system.
 Python Compatibility
 --------------------
 
-PyArrow is currently compatible with Python 3.7, 3.8, 3.9 and 3.10.
+PyArrow is currently compatible with Python 3.7, 3.8, 3.9, 3.10 and 3.11.
 
 Using Conda
 -----------


[arrow] 16/27: ARROW-17635: [Python][CI] Sync conda recipe with the arrow-cpp feedstock (#14102)


commit 82cf616704093376e030314a4374fecdb2252ee2
Author: h-vetinari <h....@gmx.com>
AuthorDate: Mon Oct 31 20:05:08 2022 +1100

    ARROW-17635: [Python][CI] Sync conda recipe with the arrow-cpp feedstock (#14102)
    
    Corresponds to status of feedstock as of https://github.com/conda-forge/arrow-cpp-feedstock/pull/848, minus obvious & intentional divergences in the setup here (with the exception of unpinning xsimd, which was [pinned](https://github.com/apache/arrow/blob/apache-arrow-9.0.0/cpp/cmake_modules/ThirdpartyToolchain.cmake#L2245) as of 9.0.0, but isn't anymore).
    
    Lead-authored-by: H. Vetinari <h....@gmx.com>
    Co-authored-by: Sutou Kouhei <ko...@clear-code.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 ...r_version10numpy1.20python3.7.____cpython.yaml} |  26 ++---
 ...r_version10numpy1.20python3.8.____cpython.yaml} |  26 ++---
 ...r_version10numpy1.20python3.9.____cpython.yaml} |  26 ++---
 ..._version10numpy1.21python3.10.____cpython.yaml} |  24 +++--
 ...er_version7numpy1.20python3.7.____cpython.yaml} |  22 ++--
 ...er_version7numpy1.20python3.8.____cpython.yaml} |  22 ++--
 ...er_version7numpy1.20python3.9.____cpython.yaml} |  22 ++--
 ...er_version7numpy1.21python3.10.____cpython.yaml |  20 ++--
 ...versionNonenumpy1.20python3.7.____cpython.yaml} |  28 ++---
 ...versionNonenumpy1.20python3.8.____cpython.yaml} |  28 ++---
 ...versionNonenumpy1.20python3.9.____cpython.yaml} |  28 ++---
 ...ersionNonenumpy1.21python3.10.____cpython.yaml} |  26 +++--
 ...ux_ppc64le_numpy1.20python3.7.____cpython.yaml} |  26 ++---
 ...ux_ppc64le_numpy1.20python3.8.____cpython.yaml} |  26 ++---
 ...ux_ppc64le_numpy1.20python3.9.____cpython.yaml} |  26 ++---
 ...ux_ppc64le_numpy1.21python3.10.____cpython.yaml |  24 +++--
 ... => osx_64_numpy1.20python3.7.____cpython.yaml} |  26 ++---
 ... => osx_64_numpy1.20python3.8.____cpython.yaml} |  26 ++---
 ... => osx_64_numpy1.20python3.9.____cpython.yaml} |  26 ++---
 .../osx_64_numpy1.21python3.10.____cpython.yaml    |  24 +++--
 ... osx_arm64_numpy1.20python3.8.____cpython.yaml} |  26 ++---
 ... osx_arm64_numpy1.20python3.9.____cpython.yaml} |  26 ++---
 .../osx_arm64_numpy1.21python3.10.____cpython.yaml |  24 +++--
 ...version10.2numpy1.20python3.7.____cpython.yaml} |  20 ++--
 ...version10.2numpy1.20python3.8.____cpython.yaml} |  20 ++--
 ...version10.2numpy1.20python3.9.____cpython.yaml} |  20 ++--
 ...version10.2numpy1.21python3.10.____cpython.yaml |  18 ++--
 ...versionNonenumpy1.20python3.7.____cpython.yaml} |  20 ++--
 ...versionNonenumpy1.20python3.8.____cpython.yaml} |  20 ++--
 ...versionNonenumpy1.20python3.9.____cpython.yaml} |  20 ++--
 ...versionNonenumpy1.21python3.10.____cpython.yaml |  18 ++--
 dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat    |   8 +-
 dev/tasks/conda-recipes/arrow-cpp/bld-pyarrow.bat  |   6 +-
 dev/tasks/conda-recipes/arrow-cpp/build-arrow.sh   |  56 +++++-----
 dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh |  24 ++---
 dev/tasks/conda-recipes/arrow-cpp/meta.yaml        | 105 ++++++++++---------
 .../conda-recipes/arrow-cpp/test_read_parquet.py   |   5 +
 dev/tasks/tasks.yml                                | 113 ++++++++-------------
 38 files changed, 551 insertions(+), 500 deletions(-)

diff --git a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.7.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.7.____cpython.yaml
similarity index 85%
rename from dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.7.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.7.____cpython.yaml
index 3f6c8209e8..103484667f 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.7.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.7.____cpython.yaml
@@ -1,9 +1,11 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '9'
+- '10'
 cdt_name:
 - cos6
 channel_sources:
@@ -17,32 +19,32 @@ cuda_compiler_version:
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '9'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -51,13 +53,13 @@ pin_run_as_build:
 python:
 - 3.7.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.8.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.8.____cpython.yaml
similarity index 85%
rename from dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.8.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.8.____cpython.yaml
index 85326a4b78..892afa7ea6 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.8.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.8.____cpython.yaml
@@ -1,9 +1,11 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '9'
+- '10'
 cdt_name:
 - cos6
 channel_sources:
@@ -17,32 +19,32 @@ cuda_compiler_version:
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '9'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -51,13 +53,13 @@ pin_run_as_build:
 python:
 - 3.8.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.19python3.9.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.9.____cpython.yaml
similarity index 85%
rename from dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.19python3.9.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.9.____cpython.yaml
index 20034ba4e8..3652c6ed97 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.19python3.9.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.9.____cpython.yaml
@@ -1,9 +1,11 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '9'
+- '10'
 cdt_name:
 - cos6
 channel_sources:
@@ -17,32 +19,32 @@ cuda_compiler_version:
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '9'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -51,13 +53,13 @@ pin_run_as_build:
 python:
 - 3.9.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.21python3.10.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.21python3.10.____cpython.yaml
similarity index 86%
rename from dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.21python3.10.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.21python3.10.____cpython.yaml
index d441464402..ea90f70614 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.21python3.10.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.21python3.10.____cpython.yaml
@@ -1,9 +1,11 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '9'
+- '10'
 cdt_name:
 - cos6
 channel_sources:
@@ -17,19 +19,21 @@ cuda_compiler_version:
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '9'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
@@ -37,12 +41,10 @@ numpy:
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -51,13 +53,13 @@ pin_run_as_build:
 python:
 - 3.10.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.18python3.7.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.7.____cpython.yaml
similarity index 87%
rename from dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.18python3.7.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.7.____cpython.yaml
index 9c6fe5c377..6a3f127048 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.18python3.7.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.7.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
@@ -23,26 +25,26 @@ docker_image:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -51,13 +53,13 @@ pin_run_as_build:
 python:
 - 3.7.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.18python3.8.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.8.____cpython.yaml
similarity index 87%
rename from dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.18python3.8.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.8.____cpython.yaml
index 4a1f196024..6ee37c66d1 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.18python3.8.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.8.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
@@ -23,26 +25,26 @@ docker_image:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -51,13 +53,13 @@ pin_run_as_build:
 python:
 - 3.8.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.19python3.9.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.9.____cpython.yaml
similarity index 87%
rename from dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.19python3.9.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.9.____cpython.yaml
index cd9760a903..53307c2603 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.19python3.9.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.20python3.9.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
@@ -23,26 +25,26 @@ docker_image:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -51,13 +53,13 @@ pin_run_as_build:
 python:
 - 3.9.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.21python3.10.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.21python3.10.____cpython.yaml
index 581e907ed5..6e1f0354f1 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.21python3.10.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_64_c_compiler_version7cuda_compiler_version10.2cxx_compiler_version7numpy1.21python3.10.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
@@ -23,13 +25,15 @@ docker_image:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
@@ -37,12 +41,10 @@ numpy:
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -51,13 +53,13 @@ pin_run_as_build:
 python:
 - 3.10.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.18python3.7.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.7.____cpython.yaml
similarity index 84%
rename from dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.18python3.7.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.7.____cpython.yaml
index 9bf341a78c..fcd7d6c04d 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.18python3.7.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.7.____cpython.yaml
@@ -2,10 +2,12 @@ BUILD:
 - aarch64-conda_cos7-linux-gnu
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '9'
+- '10'
 cdt_arch:
 - aarch64
 cdt_name:
@@ -14,37 +16,39 @@ channel_sources:
 - conda-forge
 channel_targets:
 - conda-forge main
+cuda_compiler:
+- nvcc
 cuda_compiler_version:
 - None
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '9'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -53,13 +57,13 @@ pin_run_as_build:
 python:
 - 3.7.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-aarch64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.18python3.8.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.8.____cpython.yaml
similarity index 84%
rename from dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.18python3.8.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.8.____cpython.yaml
index 802123a913..32d570e52a 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.18python3.8.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.8.____cpython.yaml
@@ -2,10 +2,12 @@ BUILD:
 - aarch64-conda_cos7-linux-gnu
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '9'
+- '10'
 cdt_arch:
 - aarch64
 cdt_name:
@@ -14,37 +16,39 @@ channel_sources:
 - conda-forge
 channel_targets:
 - conda-forge main
+cuda_compiler:
+- nvcc
 cuda_compiler_version:
 - None
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '9'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -53,13 +57,13 @@ pin_run_as_build:
 python:
 - 3.8.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-aarch64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.19python3.9.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.9.____cpython.yaml
similarity index 84%
rename from dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.19python3.9.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.9.____cpython.yaml
index ab619cc87e..2532c0fd7c 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.19python3.9.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.20python3.9.____cpython.yaml
@@ -2,10 +2,12 @@ BUILD:
 - aarch64-conda_cos7-linux-gnu
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '9'
+- '10'
 cdt_arch:
 - aarch64
 cdt_name:
@@ -14,37 +16,39 @@ channel_sources:
 - conda-forge
 channel_targets:
 - conda-forge main
+cuda_compiler:
+- nvcc
 cuda_compiler_version:
 - None
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '9'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -53,13 +57,13 @@ pin_run_as_build:
 python:
 - 3.9.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-aarch64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.21python3.10.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.21python3.10.____cpython.yaml
similarity index 85%
rename from dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.21python3.10.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.21python3.10.____cpython.yaml
index 19c5246d08..a2b090f0bc 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_aarch64_numpy1.21python3.10.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_aarch64_cuda_compiler_versionNonenumpy1.21python3.10.____cpython.yaml
@@ -2,10 +2,12 @@ BUILD:
 - aarch64-conda_cos7-linux-gnu
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '9'
+- '10'
 cdt_arch:
 - aarch64
 cdt_name:
@@ -14,24 +16,28 @@ channel_sources:
 - conda-forge
 channel_targets:
 - conda-forge main
+cuda_compiler:
+- nvcc
 cuda_compiler_version:
 - None
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '9'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
@@ -39,12 +45,10 @@ numpy:
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -53,13 +57,13 @@ pin_run_as_build:
 python:
 - 3.10.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-aarch64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.18python3.7.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.7.____cpython.yaml
similarity index 85%
rename from dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.18python3.7.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.7.____cpython.yaml
index 2bce24c681..9ece142ae8 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.18python3.7.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.7.____cpython.yaml
@@ -1,9 +1,11 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '7'
+- '10'
 cdt_name:
 - cos7
 channel_sources:
@@ -15,32 +17,32 @@ cuda_compiler_version:
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '7'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.7.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-ppc64le
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.18python3.8.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.8.____cpython.yaml
similarity index 85%
rename from dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.18python3.8.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.8.____cpython.yaml
index 6942d589cd..df80774cd8 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.18python3.8.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.8.____cpython.yaml
@@ -1,9 +1,11 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '7'
+- '10'
 cdt_name:
 - cos7
 channel_sources:
@@ -15,32 +17,32 @@ cuda_compiler_version:
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '7'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.8.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-ppc64le
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.19python3.9.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.9.____cpython.yaml
similarity index 85%
rename from dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.19python3.9.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.9.____cpython.yaml
index dd69227c26..f697be7cb3 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.19python3.9.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.20python3.9.____cpython.yaml
@@ -1,9 +1,11 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '7'
+- '10'
 cdt_name:
 - cos7
 channel_sources:
@@ -15,32 +17,32 @@ cuda_compiler_version:
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '7'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.9.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-ppc64le
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.21python3.10.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.21python3.10.____cpython.yaml
index 7c76681489..a1fd913c03 100644
--- a/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.21python3.10.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/linux_ppc64le_numpy1.21python3.10.____cpython.yaml
@@ -1,9 +1,11 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - gcc
 c_compiler_version:
-- '7'
+- '10'
 cdt_name:
 - cos7
 channel_sources:
@@ -15,19 +17,21 @@ cuda_compiler_version:
 cxx_compiler:
 - gxx
 cxx_compiler_version:
-- '7'
+- '10'
 docker_image:
 - quay.io/condaforge/linux-anvil-cos7-x86_64
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
@@ -35,12 +39,10 @@ numpy:
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.10.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - linux-ppc64le
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.18python3.7.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.7.____cpython.yaml
similarity index 84%
rename from dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.18python3.7.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.7.____cpython.yaml
index 952ef45a8e..f9aa6765bb 100644
--- a/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.18python3.7.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.7.____cpython.yaml
@@ -2,10 +2,12 @@ MACOSX_DEPLOYMENT_TARGET:
 - '10.9'
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - clang
 c_compiler_version:
-- '11'
+- '14'
 channel_sources:
 - conda-forge
 channel_targets:
@@ -15,32 +17,32 @@ cuda_compiler_version:
 cxx_compiler:
 - clangxx
 cxx_compiler_version:
-- '11'
+- '14'
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 macos_machine:
 - x86_64-apple-darwin13.4.0
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.7.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - osx-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.18python3.8.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.8.____cpython.yaml
similarity index 84%
rename from dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.18python3.8.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.8.____cpython.yaml
index 22e92ad5d7..b2461b7e6c 100644
--- a/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.18python3.8.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.8.____cpython.yaml
@@ -2,10 +2,12 @@ MACOSX_DEPLOYMENT_TARGET:
 - '10.9'
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - clang
 c_compiler_version:
-- '11'
+- '14'
 channel_sources:
 - conda-forge
 channel_targets:
@@ -15,32 +17,32 @@ cuda_compiler_version:
 cxx_compiler:
 - clangxx
 cxx_compiler_version:
-- '11'
+- '14'
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 macos_machine:
 - x86_64-apple-darwin13.4.0
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.8.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - osx-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.19python3.9.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.9.____cpython.yaml
similarity index 84%
rename from dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.19python3.9.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.9.____cpython.yaml
index 6d01a39891..317fba95fd 100644
--- a/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.19python3.9.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.20python3.9.____cpython.yaml
@@ -2,10 +2,12 @@ MACOSX_DEPLOYMENT_TARGET:
 - '10.9'
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - clang
 c_compiler_version:
-- '11'
+- '14'
 channel_sources:
 - conda-forge
 channel_targets:
@@ -15,32 +17,32 @@ cuda_compiler_version:
 cxx_compiler:
 - clangxx
 cxx_compiler_version:
-- '11'
+- '14'
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 macos_machine:
 - x86_64-apple-darwin13.4.0
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.9.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - osx-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.21python3.10.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.21python3.10.____cpython.yaml
index bbe311e4cd..3d6052d0f9 100644
--- a/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.21python3.10.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/osx_64_numpy1.21python3.10.____cpython.yaml
@@ -2,10 +2,12 @@ MACOSX_DEPLOYMENT_TARGET:
 - '10.9'
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - clang
 c_compiler_version:
-- '11'
+- '14'
 channel_sources:
 - conda-forge
 channel_targets:
@@ -15,17 +17,19 @@ cuda_compiler_version:
 cxx_compiler:
 - clangxx
 cxx_compiler_version:
-- '11'
+- '14'
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 macos_machine:
@@ -35,12 +39,10 @@ numpy:
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.10.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - osx-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.19python3.8.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.20python3.8.____cpython.yaml
similarity index 84%
rename from dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.19python3.8.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.20python3.8.____cpython.yaml
index c951785200..7dd2d05dbd 100644
--- a/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.19python3.8.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.20python3.8.____cpython.yaml
@@ -2,10 +2,12 @@ MACOSX_DEPLOYMENT_TARGET:
 - '11.0'
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - clang
 c_compiler_version:
-- '11'
+- '14'
 channel_sources:
 - conda-forge
 channel_targets:
@@ -15,32 +17,32 @@ cuda_compiler_version:
 cxx_compiler:
 - clangxx
 cxx_compiler_version:
-- '11'
+- '14'
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 macos_machine:
 - arm64-apple-darwin20.0.0
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.8.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - osx-arm64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.19python3.9.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.20python3.9.____cpython.yaml
similarity index 84%
rename from dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.19python3.9.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.20python3.9.____cpython.yaml
index 143947084d..4adda2707a 100644
--- a/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.19python3.9.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.20python3.9.____cpython.yaml
@@ -2,10 +2,12 @@ MACOSX_DEPLOYMENT_TARGET:
 - '11.0'
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - clang
 c_compiler_version:
-- '11'
+- '14'
 channel_sources:
 - conda-forge
 channel_targets:
@@ -15,32 +17,32 @@ cuda_compiler_version:
 cxx_compiler:
 - clangxx
 cxx_compiler_version:
-- '11'
+- '14'
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 macos_machine:
 - arm64-apple-darwin20.0.0
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.9.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - osx-arm64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.21python3.10.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.21python3.10.____cpython.yaml
index 807e840090..0e4ea0d725 100644
--- a/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.21python3.10.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/osx_arm64_numpy1.21python3.10.____cpython.yaml
@@ -2,10 +2,12 @@ MACOSX_DEPLOYMENT_TARGET:
 - '11.0'
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - clang
 c_compiler_version:
-- '11'
+- '14'
 channel_sources:
 - conda-forge
 channel_targets:
@@ -15,17 +17,19 @@ cuda_compiler_version:
 cxx_compiler:
 - clangxx
 cxx_compiler_version:
-- '11'
+- '14'
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 macos_machine:
@@ -35,12 +39,10 @@ numpy:
 openssl:
 - 1.1.1
 orc:
-- 1.7.3
+- 1.7.6
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -49,13 +51,13 @@ pin_run_as_build:
 python:
 - 3.10.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - osx-arm64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - c_compiler_version
   - cxx_compiler_version
diff --git a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.18python3.7.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.7.____cpython.yaml
similarity index 83%
rename from dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.18python3.7.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.7.____cpython.yaml
index 2592e75f0c..b3a8a8c278 100644
--- a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.18python3.7.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.7.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - vs2019
 channel_sources:
@@ -15,24 +17,24 @@ cxx_compiler:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -41,13 +43,13 @@ pin_run_as_build:
 python:
 - 3.7.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - win-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - python
   - numpy
diff --git a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.18python3.8.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.8.____cpython.yaml
similarity index 83%
rename from dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.18python3.8.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.8.____cpython.yaml
index 3aa9cca60a..d0ad4c12d2 100644
--- a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.18python3.8.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.8.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - vs2019
 channel_sources:
@@ -15,24 +17,24 @@ cxx_compiler:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -41,13 +43,13 @@ pin_run_as_build:
 python:
 - 3.8.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - win-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - python
   - numpy
diff --git a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.19python3.9.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.9.____cpython.yaml
similarity index 83%
rename from dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.19python3.9.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.9.____cpython.yaml
index b01a67017f..ad93f73671 100644
--- a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.19python3.9.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.20python3.9.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - vs2019
 channel_sources:
@@ -15,24 +17,24 @@ cxx_compiler:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -41,13 +43,13 @@ pin_run_as_build:
 python:
 - 3.9.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - win-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - python
   - numpy
diff --git a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.21python3.10.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.21python3.10.____cpython.yaml
index fb868730af..f8b50812c5 100644
--- a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.21python3.10.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_version10.2numpy1.21python3.10.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - vs2019
 channel_sources:
@@ -15,13 +17,15 @@ cxx_compiler:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
@@ -31,8 +35,6 @@ openssl:
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -41,13 +43,13 @@ pin_run_as_build:
 python:
 - 3.10.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - win-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - python
   - numpy
diff --git a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.18python3.7.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.7.____cpython.yaml
similarity index 83%
rename from dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.18python3.7.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.7.____cpython.yaml
index b9fb499fe5..4fcc0f797f 100644
--- a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.18python3.7.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.7.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - vs2019
 channel_sources:
@@ -15,24 +17,24 @@ cxx_compiler:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -41,13 +43,13 @@ pin_run_as_build:
 python:
 - 3.7.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - win-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - python
   - numpy
diff --git a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.18python3.8.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.8.____cpython.yaml
similarity index 83%
rename from dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.18python3.8.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.8.____cpython.yaml
index 806d2363de..97e3a9332e 100644
--- a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.18python3.8.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.8.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - vs2019
 channel_sources:
@@ -15,24 +17,24 @@ cxx_compiler:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.18'
+- '1.20'
 openssl:
 - 1.1.1
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -41,13 +43,13 @@ pin_run_as_build:
 python:
 - 3.8.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - win-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - python
   - numpy
diff --git a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.19python3.9.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.9.____cpython.yaml
similarity index 83%
rename from dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.19python3.9.____cpython.yaml
rename to dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.9.____cpython.yaml
index 990696de9f..9d771ea6d1 100644
--- a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.19python3.9.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.20python3.9.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - vs2019
 channel_sources:
@@ -15,24 +17,24 @@ cxx_compiler:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
-- '1.19'
+- '1.20'
 openssl:
 - 1.1.1
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -41,13 +43,13 @@ pin_run_as_build:
 python:
 - 3.9.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - win-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - python
   - numpy
diff --git a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.21python3.10.____cpython.yaml b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.21python3.10.____cpython.yaml
index cc9069c0be..a5de05e824 100644
--- a/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.21python3.10.____cpython.yaml
+++ b/dev/tasks/conda-recipes/.ci_support/win_64_cuda_compiler_versionNonenumpy1.21python3.10.____cpython.yaml
@@ -1,5 +1,7 @@
 bzip2:
 - '1'
+c_ares:
+- '1'
 c_compiler:
 - vs2019
 channel_sources:
@@ -15,13 +17,15 @@ cxx_compiler:
 gflags:
 - '2.2'
 glog:
-- '0.5'
+- '0.6'
 google_cloud_cpp:
-- '1.35'
+- '2.2'
 grpc_cpp:
-- '1.43'
+- '1.47'
+libabseil:
+- '20220623.0'
 libprotobuf:
-- '3.19'
+- '3.21'
 lz4_c:
 - 1.9.3
 numpy:
@@ -31,8 +35,6 @@ openssl:
 pin_run_as_build:
   bzip2:
     max_pin: x
-  lz4-c:
-    max_pin: x.x.x
   python:
     min_pin: x.x
     max_pin: x.x
@@ -41,13 +43,13 @@ pin_run_as_build:
 python:
 - 3.10.* *_cpython
 re2:
-- 2022.02.01
+- 2022.06.01
 snappy:
 - '1'
 target_platform:
 - win-64
 thrift_cpp:
-- 0.15.0
+- 0.16.0
 zip_keys:
 - - python
   - numpy
diff --git a/dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat b/dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat
index 21e2ae714e..c8eacfb434 100644
--- a/dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat
+++ b/dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat
@@ -25,9 +25,9 @@ cmake -G "Ninja" ^
       -DARROW_DEPENDENCY_SOURCE=SYSTEM ^
       -DARROW_FILESYSTEM:BOOL=ON ^
       -DARROW_FLIGHT:BOOL=ON ^
-      -DARROW_FLIGHT_REQUIRE_TLSCREDENTIALSOPTIONS:BOOL=OFF ^
+      -DARROW_FLIGHT_REQUIRE_TLSCREDENTIALSOPTIONS:BOOL=ON ^
       -DARROW_GANDIVA:BOOL=ON ^
-      -DARROW_GCS:BOOL=OFF ^
+      -DARROW_GCS:BOOL=ON ^
       -DARROW_HDFS:BOOL=ON ^
       -DARROW_JSON:BOOL=ON ^
       -DARROW_MIMALLOC:BOOL=ON ^
@@ -35,7 +35,8 @@ cmake -G "Ninja" ^
       -DARROW_PACKAGE_PREFIX="%LIBRARY_PREFIX%" ^
       -DARROW_PARQUET:BOOL=ON ^
       -DARROW_S3:BOOL=ON ^
-      -DARROW_SIMD_LEVEL=NONE ^
+      -DARROW_SIMD_LEVEL:STRING=NONE ^
+      -DARROW_SUBSTRAIT:BOOL=ON ^
       -DARROW_WITH_BROTLI:BOOL=ON ^
       -DARROW_WITH_BZ2:BOOL=ON ^
       -DARROW_WITH_LZ4:BOOL=ON ^
@@ -45,6 +46,7 @@ cmake -G "Ninja" ^
       -DBUILD_SHARED_LIBS=ON ^
       -DBoost_NO_BOOST_CMAKE=ON ^
       -DCMAKE_BUILD_TYPE=release ^
+      -DCMAKE_CXX_STANDARD=17 ^
       -DCMAKE_INSTALL_PREFIX="%LIBRARY_PREFIX%" ^
       -DCMAKE_UNITY_BUILD=ON ^
       -DLLVM_TOOLS_BINARY_DIR="%LIBRARY_BIN%" ^
diff --git a/dev/tasks/conda-recipes/arrow-cpp/bld-pyarrow.bat b/dev/tasks/conda-recipes/arrow-cpp/bld-pyarrow.bat
index 8897c4b7d4..bd20c79efe 100644
--- a/dev/tasks/conda-recipes/arrow-cpp/bld-pyarrow.bat
+++ b/dev/tasks/conda-recipes/arrow-cpp/bld-pyarrow.bat
@@ -17,14 +17,14 @@ pushd "%SRC_DIR%"\python
 SET ARROW_HOME=%LIBRARY_PREFIX%
 SET SETUPTOOLS_SCM_PRETEND_VERSION=%PKG_VERSION%
 SET PYARROW_BUILD_TYPE=release
-SET PYARROW_WITH_GCS=0
-SET PYARROW_WITH_S3=1
-SET PYARROW_WITH_HDFS=1
 SET PYARROW_WITH_DATASET=1
 SET PYARROW_WITH_FLIGHT=1
 SET PYARROW_WITH_GANDIVA=1
+SET PYARROW_WITH_GCS=1
+SET PYARROW_WITH_HDFS=1
 SET PYARROW_WITH_PARQUET=1
 SET PYARROW_WITH_PARQUET_ENCRYPTION=1
+SET PYARROW_WITH_S3=1
 SET PYARROW_CMAKE_GENERATOR=Ninja
 
 :: Enable CUDA support
diff --git a/dev/tasks/conda-recipes/arrow-cpp/build-arrow.sh b/dev/tasks/conda-recipes/arrow-cpp/build-arrow.sh
index 9a45b39161..3a26bd2ede 100755
--- a/dev/tasks/conda-recipes/arrow-cpp/build-arrow.sh
+++ b/dev/tasks/conda-recipes/arrow-cpp/build-arrow.sh
@@ -1,4 +1,4 @@
-#!/usr/bin/env bash
+#!/bin/bash
 
 set -e
 set -x
@@ -7,14 +7,15 @@ mkdir cpp/build
 pushd cpp/build
 
 EXTRA_CMAKE_ARGS=""
-ARROW_GCS="OFF"
 
+# Include g++'s system headers
 if [ "$(uname)" == "Linux" ]; then
-  # Include g++'s system headers
   SYSTEM_INCLUDES=$(echo | ${CXX} -E -Wp,-v -xc++ - 2>&1 | grep '^ ' | awk '{print "-isystem;" substr($1, 1)}' | tr '\n' ';')
-  EXTRA_CMAKE_ARGS=" -DARROW_GANDIVA_PC_CXX_FLAGS=${SYSTEM_INCLUDES}"
-  # GCS doesn't produce any abseil-induced linker error on Linux, enable it
-  ARROW_GCS="ON"
+  ARROW_GANDIVA_PC_CXX_FLAGS="${SYSTEM_INCLUDES}"
+else
+  # See https://conda-forge.org/docs/maintainer/knowledge_base.html#newer-c-features-with-old-sdk
+  CXXFLAGS="${CXXFLAGS} -D_LIBCPP_DISABLE_AVAILABILITY"
+  ARROW_GANDIVA_PC_CXX_FLAGS="-D_LIBCPP_DISABLE_AVAILABILITY"
 fi
 
 # Enable CUDA support
@@ -38,23 +39,23 @@ else
 fi
 
 if [[ "${target_platform}" == "osx-arm64" ]]; then
-    # We need llvm 11+ support in Arrow for this
-    # Tell jemalloc to support 16K page size on apple arm64 silicon
-    EXTRA_CMAKE_ARGS=" ${EXTRA_CMAKE_ARGS} -DARROW_GANDIVA=OFF -DARROW_JEMALLOC_LG_PAGE=14"
+    EXTRA_CMAKE_ARGS="${EXTRA_CMAKE_ARGS} -DCLANG_EXECUTABLE=${BUILD_PREFIX}/bin/clang -DLLVM_LINK_EXECUTABLE=${BUILD_PREFIX}/bin/llvm-link"
     sed -ie "s;protoc-gen-grpc.*$;protoc-gen-grpc=${BUILD_PREFIX}/bin/grpc_cpp_plugin\";g" ../src/arrow/flight/CMakeLists.txt
-elif [[ "${target_platform}" == "linux-aarch64" ]]; then
-    # Tell jemalloc to support both 4k and 64k page arm64 systems
-    # See https://github.com/apache/arrow/pull/10940
-    EXTRA_CMAKE_ARGS=" ${EXTRA_CMAKE_ARGS} -DARROW_GANDIVA=ON -DARROW_JEMALLOC_LG_PAGE=16"
-else
-    EXTRA_CMAKE_ARGS=" ${EXTRA_CMAKE_ARGS} -DARROW_GANDIVA=ON"
+    sed -ie 's;"--with-jemalloc-prefix\=je_arrow_";"--with-jemalloc-prefix\=je_arrow_" "--with-lg-page\=14";g' ../cmake_modules/ThirdpartyToolchain.cmake
+fi
+
+# disable -fno-plt, which causes problems with GCC on PPC
+if [[ "$target_platform" == "linux-ppc64le" ]]; then
+  CFLAGS="$(echo $CFLAGS | sed 's/-fno-plt //g')"
+  CXXFLAGS="$(echo $CXXFLAGS | sed 's/-fno-plt //g')"
 fi
 
-# See https://conda-forge.org/docs/maintainer/knowledge_base.html#newer-c-features-with-old-sdk
-CXXFLAGS="${CXXFLAGS} -D_LIBCPP_DISABLE_AVAILABILITY"
-ARROW_GANDIVA_PC_CXX_FLAGS="-D_LIBCPP_DISABLE_AVAILABILITY"
+# Limit number of threads used to avoid hardware oversubscription
+if [[ "${target_platform}" == "linux-aarch64" ]] || [[ "${target_platform}" == "linux-ppc64le" ]]; then
+     export CMAKE_BUILD_PARALLEL_LEVEL=3
+fi
 
-cmake \
+cmake -GNinja \
     -DARROW_BOOST_USE_SHARED=ON \
     -DARROW_BUILD_BENCHMARKS=OFF \
     -DARROW_BUILD_STATIC=OFF \
@@ -68,8 +69,10 @@ cmake \
     -DARROW_FILESYSTEM=ON \
     -DARROW_FLIGHT=ON \
     -DARROW_FLIGHT_REQUIRE_TLSCREDENTIALSOPTIONS=ON \
+    -DARROW_FLIGHT_SQL=ON \
+    -DARROW_GANDIVA=ON \
     -DARROW_GANDIVA_PC_CXX_FLAGS="${ARROW_GANDIVA_PC_CXX_FLAGS}" \
-    -DARROW_GCS=${ARROW_GCS} \
+    -DARROW_GCS=ON \
     -DARROW_HDFS=ON \
     -DARROW_JEMALLOC=ON \
     -DARROW_JSON=ON \
@@ -80,6 +83,7 @@ cmake \
     -DARROW_PLASMA=ON \
     -DARROW_S3=ON \
     -DARROW_SIMD_LEVEL=NONE \
+    -DARROW_SUBSTRAIT=ON \
     -DARROW_USE_LD_GOLD=ON \
     -DARROW_WITH_BROTLI=ON \
     -DARROW_WITH_BZ2=ON \
@@ -96,19 +100,9 @@ cmake \
     -DPARQUET_REQUIRE_ENCRYPTION=ON \
     -DProtobuf_PROTOC_EXECUTABLE=$BUILD_PREFIX/bin/protoc \
     -DPython3_EXECUTABLE=${PYTHON} \
-    -GNinja \
     ${EXTRA_CMAKE_ARGS} \
     ..
 
-# Commented out until jemalloc and mimalloc are fixed upstream
-if [[ "${target_platform}" == "osx-arm64" ]]; then
-     ninja jemalloc_ep-prefix/src/jemalloc_ep-stamp/jemalloc_ep-patch mimalloc_ep-prefix/src/mimalloc_ep-stamp/mimalloc_ep-patch
-     cp $BUILD_PREFIX/share/gnuconfig/config.* jemalloc_ep-prefix/src/jemalloc_ep/build-aux/
-     sed -ie 's/list(APPEND mi_cflags -march=native)//g' mimalloc_ep-prefix/src/mimalloc_ep/CMakeLists.txt
-     # Use the correct register for thread-local storage
-     sed -ie 's/tpidr_el0/tpidrro_el0/g' mimalloc_ep-prefix/src/mimalloc_ep/include/mimalloc-internal.h
-fi
-
-ninja install
+cmake --build . --target install --config Release
 
 popd
diff --git a/dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh b/dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh
index fc28bb0623..60144d06b9 100644
--- a/dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh
+++ b/dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh
@@ -11,13 +11,8 @@ export PYARROW_BUILD_TYPE=release
 export PYARROW_BUNDLE_ARROW_CPP_HEADERS=0
 export PYARROW_WITH_DATASET=1
 export PYARROW_WITH_FLIGHT=1
-if [[ "${target_platform}" == "osx-arm64" ]]; then
-    # We need llvm 11+ support in Arrow for this
-    export PYARROW_WITH_GANDIVA=0
-else
-    export PYARROW_WITH_GANDIVA=1
-fi
-export PYARROW_WITH_GCS=0
+export PYARROW_WITH_GANDIVA=1
+export PYARROW_WITH_GCS=1
 export PYARROW_WITH_HDFS=1
 export PYARROW_WITH_ORC=1
 export PYARROW_WITH_PARQUET=1
@@ -28,10 +23,6 @@ export PYARROW_CMAKE_GENERATOR=Ninja
 export PYARROW_CMAKE_OPTIONS="-DARROW_SIMD_LEVEL=NONE"
 BUILD_EXT_FLAGS=""
 
-if [ "$(uname)" == "Linux" ]; then
-    export PYARROW_WITH_GCS=1
-fi
-
 # Enable CUDA support
 if [[ ! -z "${cuda_compiler_version+x}" && "${cuda_compiler_version}" != "None" ]]; then
     export PYARROW_WITH_CUDA=1
@@ -44,8 +35,15 @@ if [[ "${target_platform}" == "linux-aarch64" ]]; then
     export PYARROW_CMAKE_OPTIONS="-DARROW_ARMV8_ARCH=armv8-a ${PYARROW_CMAKE_OPTIONS}"
 fi
 
-# See https://conda-forge.org/docs/maintainer/knowledge_base.html#newer-c-features-with-old-sdk
-export CXXFLAGS="${CXXFLAGS} -D_LIBCPP_DISABLE_AVAILABILITY"
+if [[ "${target_platform}" == osx-* ]]; then
+    # See https://conda-forge.org/docs/maintainer/knowledge_base.html#newer-c-features-with-old-sdk
+    CXXFLAGS="${CXXFLAGS} -D_LIBCPP_DISABLE_AVAILABILITY"
+fi
+
+# Limit number of threads used to avoid hardware oversubscription
+if [[ "${target_platform}" == "linux-aarch64" ]] || [[ "${target_platform}" == "linux-ppc64le" ]]; then
+     export CMAKE_BUILD_PARALLEL_LEVEL=4
+fi
 
 cd python
 
diff --git a/dev/tasks/conda-recipes/arrow-cpp/meta.yaml b/dev/tasks/conda-recipes/arrow-cpp/meta.yaml
index e9fc89380b..9663e32cd2 100644
--- a/dev/tasks/conda-recipes/arrow-cpp/meta.yaml
+++ b/dev/tasks/conda-recipes/arrow-cpp/meta.yaml
@@ -4,6 +4,7 @@
 {% set build_ext_version = ARROW_VERSION %}
 {% set build_ext = "cuda" if cuda_enabled else "cpu" %}
 {% set proc_build_number = "0" %}
+{% set llvm_version = "14" %}
 
 package:
   name: arrow-cpp-ext
@@ -14,10 +15,15 @@ source:
 
 build:
   number: 0
-  # for cuda on win/linux, building with 9.2 is enough to be compatible with all later versions,
-  # since arrow is only using libcuda, and not libcudart.
-  skip: true  # [(win or linux) and cuda_compiler_version not in ("None", "10.2")]
+  # for cuda support, building with one version is enough to be compatible with
+  # all later versions, since arrow is only using libcuda, and not libcudart.
+  skip: true  # [(win or linux64) and cuda_compiler_version not in ("None", "10.2")]
+  skip: true  # [linux and (aarch64 or ppc64le) and cuda_compiler_version not in ("None", "11.2")]
   skip: true  # [osx and cuda_compiler_version != "None"]
+  # CUDA builds on ppc64le currently run out of time on Travis CI.
+  # It may be possible to move these to cross-compilation, but this will take additional work.
+  # Hence this is skipped for now until this can be addressed.
+  skip: true  # [linux and ppc64le and cuda_compiler_version != "None"]
   run_exports:
     - {{ pin_subpackage("arrow-cpp", max_pin="x.x.x") }}
 
@@ -26,7 +32,7 @@ outputs:
     version: {{ build_ext_version }}
     build:
       number: {{ proc_build_number }}
-      string: "{{ build_ext }}"
+      string: {{ build_ext }}
     test:
       commands:
         - exit 0
@@ -35,7 +41,7 @@ outputs:
       license: Apache-2.0
       license_file:
         - LICENSE.txt
-      summary: 'A meta-package to select Arrow build variant'
+      summary: A meta-package to select Arrow build variant
 
   - name: arrow-cpp
     script: build-arrow.sh  # [not win]
@@ -47,18 +53,18 @@ outputs:
         - {{ pin_subpackage("arrow-cpp", max_pin="x.x.x") }}
       ignore_run_exports:
         - cudatoolkit
-      track_features:
-        {{ "- arrow-cuda" if cuda_enabled else "" }}
+      track_features: {{ "[arrow-cuda]" if cuda_enabled else "" }}
     requirements:
       build:
         - python                                 # [build_platform != target_platform]
         - cross-python_{{ target_platform }}     # [build_platform != target_platform]
-        - cython                                 # [build_platform != target_platform]
         - numpy                                  # [build_platform != target_platform]
+        - clangdev {{ llvm_version }}            # [osx and arm64]
+        - llvmdev {{ llvm_version }}             # [osx and arm64]
         - gnuconfig                              # [osx and arm64]
         - libprotobuf
         - grpc-cpp
-        # aws-sdk-cpp 1.8.* doesn't work with newer CMake
+        # aws-sdk-cpp 1.8.* doesn't work with a newer CMake
         - cmake <3.22
         - autoconf  # [unix]
         - ninja
@@ -69,25 +75,26 @@ outputs:
       host:
         # https://issues.apache.org/jira/browse/ARROW-15141
         - aws-sdk-cpp 1.8.186
+        # abseil is only here to help conda pick the right constraints for pyarrow, see
+        # https://github.com/conda-forge/arrow-cpp-feedstock/pull/815#issuecomment-1216713245
+        - libabseil
         - boost-cpp >=1.70
         - brotli
         - bzip2
         - c-ares
         - gflags
         - glog
-        # On macOS and Windows, GCS support fails linking due to ABI
-        # issues with abseil-cpp
-        # (see https://github.com/conda-forge/abseil-cpp-feedstock/issues/45)
-        - google-cloud-cpp  # [linux]
+        - google-cloud-cpp
         - grpc-cpp
         - libprotobuf
-        - clangdev >=11 # [not (osx and arm64)]
-        - llvmdev >=11  # [not (osx and arm64)]
+        - clangdev {{ llvm_version }}
+        - llvmdev {{ llvm_version }}
         - libutf8proc
         - lz4-c
         - numpy
-        - orc  # [unix]
+        # gandiva depends on openssl
         - openssl
+        - orc          # [unix]
         - python
         - rapidjson
         - re2
@@ -115,27 +122,29 @@ outputs:
         # headers
         - test -f $PREFIX/include/arrow/api.h              # [unix]
         - test -f $PREFIX/include/arrow/flight/types.h     # [unix]
+        - test -f $PREFIX/include/arrow/flight/sql/api.h   # [unix]
         - test -f $PREFIX/include/plasma/client.h          # [unix]
-        - test -f $PREFIX/include/gandiva/engine.h         # [unix and not (osx and arm64)]
+        - test -f $PREFIX/include/gandiva/engine.h         # [unix]
         - test -f $PREFIX/include/parquet/api/reader.h     # [unix]
         - if not exist %LIBRARY_INC%\\arrow\\api.h exit 1            # [win]
         - if not exist %LIBRARY_INC%\\gandiva\\engine.h exit 1       # [win]
         - if not exist %LIBRARY_INC%\\parquet\\api\\reader.h exit 1  # [win]
 
         # shared
-        - test -f $PREFIX/lib/libarrow.so            # [linux]
-        - test -f $PREFIX/lib/libarrow_dataset.so    # [linux]
-        - test -f $PREFIX/lib/libarrow_flight.so     # [linux]
-        - test -f $PREFIX/lib/libparquet.so          # [linux]
-        - test -f $PREFIX/lib/libgandiva.so          # [linux]
-        - test -f $PREFIX/lib/libplasma.so           # [linux]
+        - test -f $PREFIX/lib/libarrow.so             # [linux]
+        - test -f $PREFIX/lib/libarrow_dataset.so     # [linux]
+        - test -f $PREFIX/lib/libarrow_flight.so      # [linux]
+        - test -f $PREFIX/lib/libarrow_flight_sql.so  # [linux]
+        - test -f $PREFIX/lib/libparquet.so           # [linux]
+        - test -f $PREFIX/lib/libgandiva.so           # [linux]
+        - test -f $PREFIX/lib/libplasma.so            # [linux]
         - test -f $PREFIX/lib/libarrow_cuda${SHLIB_EXT}               # [(cuda_compiler_version != "None") and unix]
         - test ! -f $PREFIX/lib/libarrow_cuda${SHLIB_EXT}             # [(cuda_compiler_version == "None") and unix]
         - if not exist %PREFIX%\\Library\\bin\\arrow_cuda.dll exit 1  # [(cuda_compiler_version != "None") and win]
         - if exist %PREFIX%\\Library\\bin\\arrow_cuda.dll exit 1      # [(cuda_compiler_version == "None") and win]
         - test -f $PREFIX/lib/libarrow.dylib          # [osx]
         - test -f $PREFIX/lib/libarrow_dataset.dylib  # [osx]
-        - test -f $PREFIX/lib/libgandiva.dylib        # [osx and not arm64]
+        - test -f $PREFIX/lib/libgandiva.dylib        # [osx]
         - test -f $PREFIX/lib/libparquet.dylib        # [osx]
         - test -f $PREFIX/lib/libplasma.dylib         # [osx]
         - if not exist %PREFIX%\\Library\\bin\\arrow.dll exit 1          # [win]
@@ -145,17 +154,19 @@ outputs:
         - if not exist %PREFIX%\\Library\\bin\\gandiva.dll exit 1        # [win]
 
         # absence of static libraries
-        - test ! -f $PREFIX/lib/libarrow.a          # [unix]
-        - test ! -f $PREFIX/lib/libarrow_dataset.a  # [unix]
-        - test ! -f $PREFIX/lib/libarrow_flight.a   # [unix]
-        - test ! -f $PREFIX/lib/libplasma.a         # [unix]
-        - test ! -f $PREFIX/lib/libparquet.a        # [unix]
-        - test ! -f $PREFIX/lib/libgandiva.a        # [unix]
-        - if exist %PREFIX%\\Library\\lib\\arrow_static.lib exit 1          # [win]
-        - if exist %PREFIX%\\Library\\lib\\arrow_dataset_static.lib exit 1  # [win]
-        - if exist %PREFIX%\\Library\\lib\\arrow_flight_static.lib exit 1   # [win]
-        - if exist %PREFIX%\\Library\\lib\\parquet_static.lib exit 1        # [win]
-        - if exist %PREFIX%\\Library\\lib\\gandiva_static.lib exit 1        # [win]
+        - test ! -f $PREFIX/lib/libarrow.a             # [unix]
+        - test ! -f $PREFIX/lib/libarrow_dataset.a     # [unix]
+        - test ! -f $PREFIX/lib/libarrow_flight.a      # [unix]
+        - test ! -f $PREFIX/lib/libarrow_flight_sql.a  # [unix]
+        - test ! -f $PREFIX/lib/libplasma.a            # [unix]
+        - test ! -f $PREFIX/lib/libparquet.a           # [unix]
+        - test ! -f $PREFIX/lib/libgandiva.a           # [unix]
+        - if exist %PREFIX%\\Library\\lib\\arrow_static.lib exit 1             # [win]
+        - if exist %PREFIX%\\Library\\lib\\arrow_dataset_static.lib exit 1     # [win]
+        - if exist %PREFIX%\\Library\\lib\\arrow_flight_static.lib exit 1      # [win]
+        - if exist %PREFIX%\\Library\\lib\\arrow_flight_sql_static.lib exit 1  # [win]
+        - if exist %PREFIX%\\Library\\lib\\parquet_static.lib exit 1           # [win]
+        - if exist %PREFIX%\\Library\\lib\\gandiva_static.lib exit 1           # [win]
 
   - name: pyarrow
     script: build-pyarrow.sh  # [not win]
@@ -167,8 +178,7 @@ outputs:
         - cudatoolkit
       ignore_run_exports_from:
         - openssl
-      track_features:
-        {{ "- arrow-cuda" if cuda_enabled else "" }}
+      track_features: {{ "[arrow-cuda]" if cuda_enabled else "" }}
     requirements:
       build:
         - python                                 # [build_platform != target_platform]
@@ -184,9 +194,9 @@ outputs:
         - {{ compiler("cuda") }}  # [cuda_compiler_version != "None"]
       host:
         - {{ pin_subpackage('arrow-cpp', exact=True) }}
-        - clangdev >=11 # [not (osx and arm64)]
+        - clangdev {{ llvm_version }}
+        - llvmdev {{ llvm_version }}
         - cython
-        - llvmdev >=11  # [not (osx and arm64)]
         - numpy
         - openssl
         - python
@@ -211,16 +221,18 @@ outputs:
       summary: Python libraries for Apache Arrow
 
     test:
+      files:
+        - test_read_parquet.py
       imports:
         - pyarrow
         - pyarrow.dataset
         - pyarrow.flight
-        - pyarrow.gandiva  # [not (osx and arm64)]
+        - pyarrow.gandiva
         - pyarrow.orc      # [unix]
         - pyarrow.parquet
         - pyarrow.plasma   # [unix]
         - pyarrow.fs
-        - pyarrow._s3fs    # [linux]
+        - pyarrow._s3fs
         - pyarrow._hdfs
         # We can only test importing cuda package but cannot run when a
         # CUDA device is not available, for instance, when building from CI.
@@ -233,6 +245,7 @@ outputs:
         - if exist %SP_DIR%/pyarrow/tests/test_array.py exit 1                    # [win]
         # Need to remove dot from PY_VER; %MYVAR:x=y% replaces "x" in %MYVAR% with "y"
         - if not exist %SP_DIR%/pyarrow/_cuda.cp%PY_VER:.=%-win_amd64.pyd exit 1  # [win and cuda_compiler_version != "None"]
+        - python test_read_parquet.py
 
   - name: pyarrow-tests
     script: build-pyarrow.sh  # [not win]
@@ -244,8 +257,7 @@ outputs:
         - cudatoolkit
       ignore_run_exports_from:
         - openssl
-      track_features:
-        {{ "- arrow-cuda" if cuda_enabled else "" }}
+      track_features: {{ "[arrow-cuda]" if cuda_enabled else "" }}
     requirements:
       build:
         - python                                 # [build_platform != target_platform]
@@ -262,12 +274,12 @@ outputs:
       host:
         - {{ pin_subpackage('arrow-cpp', exact=True) }}
         - {{ pin_subpackage('pyarrow', exact=True) }}
-        - clangdev >=11 # [not (osx and arm64)]
+        - clangdev {{ llvm_version }}
+        - llvmdev {{ llvm_version }}
         - cython
-        - llvmdev >=11  # [not (osx and arm64)]
         - numpy
-        - python
         - openssl
+        - python
         - setuptools
         - setuptools_scm
         - six
@@ -313,3 +325,4 @@ extra:
     - pearu
     - nealrichardson
     - jakirkham
+    - h-vetinari
diff --git a/dev/tasks/conda-recipes/arrow-cpp/test_read_parquet.py b/dev/tasks/conda-recipes/arrow-cpp/test_read_parquet.py
new file mode 100644
index 0000000000..5f76a4e22c
--- /dev/null
+++ b/dev/tasks/conda-recipes/arrow-cpp/test_read_parquet.py
@@ -0,0 +1,5 @@
+import pyarrow as pa
+import pyarrow.parquet as pq
+
+table = pa.Table.from_pydict({"a": [1, 2]})
+pq.write_table(table, "test.parquet")
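
The committed test_read_parquet.py above only exercises the write path, while the
new meta.yaml test section earlier in this patch runs it as "python test_read_parquet.py".
A minimal round-trip sketch is shown below; the read-back assertion is an illustration
added here and is not part of the conda recipe itself.

    # Round-trip variant of the committed smoke test; the read_table/equals
    # check is an added illustration, not part of the recipe.
    import pyarrow as pa
    import pyarrow.parquet as pq

    table = pa.Table.from_pydict({"a": [1, 2]})
    pq.write_table(table, "test.parquet")
    assert pq.read_table("test.parquet").equals(table)
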
diff --git a/dev/tasks/tasks.yml b/dev/tasks/tasks.yml
index 57878227b2..94902a0be0 100644
--- a/dev/tasks/tasks.yml
+++ b/dev/tasks/tasks.yml
@@ -250,7 +250,7 @@ tasks:
     ci: azure
     template: conda-recipes/azure.linux.yml
     params:
-      config: linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.7.____cpython
+      config: linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.7.____cpython
       r_config: linux_64_r_base4.0
     artifacts:
       - arrow-cpp-{no_rc_version}-py37(h[a-z0-9]+)_0_cpu.tar.bz2
@@ -260,7 +260,7 @@ tasks:
     ci: azure
     template: conda-recipes/azure.linux.yml
     params:
-      config: linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.7.____cpython
+      config: linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.7.____cpython
       r_config: linux_64_r_base4.1
     artifacts:
       - arrow-cpp-{no_rc_version}-py37(h[a-z0-9]+)_0_cpu.tar.bz2
@@ -270,7 +270,7 @@ tasks:
     ci: azure
     template: conda-recipes/azure.linux.yml
     params:
-      config: linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.18python3.8.____cpython
+      config: linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.8.____cpython
     artifacts:
       - arrow-cpp-{no_rc_version}-py38(h[a-z0-9]+)_0_cpu.tar.bz2
       - pyarrow-{no_rc_version}-py38(h[a-z0-9]+)_0_cpu.tar.bz2
@@ -279,7 +279,7 @@ tasks:
     ci: azure
     template: conda-recipes/azure.linux.yml
     params:
-      config: linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.19python3.9.____cpython
+      config: linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.20python3.9.____cpython
     artifacts:
       - arrow-cpp-{no_rc_version}-py39(h[a-z0-9]+)_0_cpu.tar.bz2
       - pyarrow-{no_rc_version}-py39(h[a-z0-9]+)_0_cpu.tar.bz2
@@ -288,14 +288,14 @@ tasks:
     ci: azure
     template: conda-recipes/azure.linux.yml
     params:
-      config: linux_64_c_compiler_version9cuda_compiler_versionNonecxx_compiler_version9numpy1.21python3.10.____cpython
+      config: linux_64_c_compiler_version10cuda_compiler_versionNonecxx_compiler_version10numpy1.21python3.10.____cpython
     artifacts:
       - arrow-cpp-{no_rc_version}-py310(h[a-z0-9]+)_0_cpu.tar.bz2
       - pyarrow-{no_rc_version}-py310(h[a-z0-9]+)_0_cpu.tar.bz2
 
-{% for python_version, numpy_version in [("3.7", "1.18"),
-                                         ("3.8", "1.18"),
-                                         ("3.9", "1.19"),
+{% for python_version, numpy_version in [("3.7", "1.20"),
+                                         ("3.8", "1.20"),
+                                         ("3.9", "1.20"),
                                          ("3.10", "1.21")] %}
   {% set pyver = python_version | replace(".", "") %}
 
@@ -312,7 +312,7 @@ tasks:
     ci: azure
     template: conda-recipes/azure.linux.yml
     params:
-      config: linux_aarch64_numpy{{ numpy_version }}python{{ python_version }}.____cpython
+      config: linux_aarch64_cuda_compiler_versionNonenumpy{{ numpy_version }}python{{ python_version }}.____cpython
     artifacts:
       - arrow-cpp-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
       - pyarrow-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
@@ -334,7 +334,7 @@ tasks:
     ci: azure
     template: conda-recipes/azure.osx.yml
     params:
-      config:  osx_64_numpy1.18python3.7.____cpython
+      config:  osx_64_numpy1.20python3.7.____cpython
       r_config: osx_64_r_base4.0
     artifacts:
       - arrow-cpp-{no_rc_version}-py37(h[a-z0-9]+)_0_cpu.tar.bz2
@@ -344,65 +344,37 @@ tasks:
     ci: azure
     template: conda-recipes/azure.osx.yml
     params:
-      config: osx_64_numpy1.18python3.7.____cpython
+      config: osx_64_numpy1.20python3.7.____cpython
       r_config: osx_64_r_base4.1
     artifacts:
       - arrow-cpp-{no_rc_version}-py37(h[a-z0-9]+)_0_cpu.tar.bz2
       - pyarrow-{no_rc_version}-py37(h[a-z0-9]+)_0_cpu.tar.bz2
 
-  conda-osx-clang-py38:
-    ci: azure
-    template: conda-recipes/azure.osx.yml
-    params:
-      config: osx_64_numpy1.18python3.8.____cpython
-    artifacts:
-      - arrow-cpp-{no_rc_version}-py38(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py38(h[a-z0-9]+)_0_cpu.tar.bz2
-
-  conda-osx-clang-py39:
-    ci: azure
-    template: conda-recipes/azure.osx.yml
-    params:
-      config: osx_64_numpy1.19python3.9.____cpython
-    artifacts:
-      - arrow-cpp-{no_rc_version}-py39(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py39(h[a-z0-9]+)_0_cpu.tar.bz2
-
-  conda-osx-clang-py310:
-    ci: azure
-    template: conda-recipes/azure.osx.yml
-    params:
-      config: osx_64_numpy1.21python3.10.____cpython
-    artifacts:
-      - arrow-cpp-{no_rc_version}-py310(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py310(h[a-z0-9]+)_0_cpu.tar.bz2
+{% for python_version, numpy_version in [("3.7", "1.20"),
+                                         ("3.8", "1.20"),
+                                         ("3.9", "1.20"),
+                                         ("3.10", "1.21")] %}
+  {% set pyver = python_version | replace(".", "") %}
 
-  conda-osx-arm64-clang-py38:
+  conda-osx-clang-py{{ pyver }}:
     ci: azure
     template: conda-recipes/azure.osx.yml
     params:
-      config: osx_arm64_numpy1.19python3.8.____cpython
+      config: osx_64_numpy{{ numpy_version }}python{{ python_version }}.____cpython
     artifacts:
-      - arrow-cpp-{no_rc_version}-py38(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py38(h[a-z0-9]+)_0_cpu.tar.bz2
+      - arrow-cpp-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
+      - pyarrow-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
 
-  conda-osx-arm64-clang-py39:
+  conda-osx-arm64-clang-py{{ pyver }}:
     ci: azure
     template: conda-recipes/azure.osx.yml
     params:
-      config: osx_arm64_numpy1.19python3.9.____cpython
+      config: osx_arm64_numpy{{ numpy_version }}python{{ python_version }}.____cpython
     artifacts:
-      - arrow-cpp-{no_rc_version}-py39(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py39(h[a-z0-9]+)_0_cpu.tar.bz2
+      - arrow-cpp-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
+      - pyarrow-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
 
-  conda-osx-arm64-clang-py310:
-    ci: azure
-    template: conda-recipes/azure.osx.yml
-    params:
-      config: osx_arm64_numpy1.21python3.10.____cpython
-    artifacts:
-      - arrow-cpp-{no_rc_version}-py310(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py310(h[a-z0-9]+)_0_cpu.tar.bz2
+{% endfor %}
 
   ############################## Conda Windows ################################
 
@@ -410,7 +382,7 @@ tasks:
     ci: azure
     template: conda-recipes/azure.win.yml
     params:
-      config: win_64_cuda_compiler_versionNonenumpy1.18python3.7.____cpython
+      config: win_64_cuda_compiler_versionNonenumpy1.20python3.7.____cpython
       r_config: win_64_r_base4.0
     artifacts:
       - arrow-cpp-{no_rc_version}-py37(h[a-z0-9]+)_0_cpu.tar.bz2
@@ -420,40 +392,37 @@ tasks:
     ci: azure
     template: conda-recipes/azure.win.yml
     params:
-      config: win_64_cuda_compiler_versionNonenumpy1.18python3.7.____cpython
+      config: win_64_cuda_compiler_versionNonenumpy1.20python3.7.____cpython
       r_config: win_64_r_base4.0
     artifacts:
       - arrow-cpp-{no_rc_version}-py37(h[a-z0-9]+)_0_cpu.tar.bz2
       - pyarrow-{no_rc_version}-py37(h[a-z0-9]+)_0_cpu.tar.bz2
 
-  conda-win-vs2019-py38:
-    ci: azure
-    template: conda-recipes/azure.win.yml
-    params:
-      config: win_64_cuda_compiler_versionNonenumpy1.18python3.8.____cpython
-    artifacts:
-      - arrow-cpp-{no_rc_version}-py38(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py38(h[a-z0-9]+)_0_cpu.tar.bz2
+{% for python_version, numpy_version in [("3.7", "1.20"),
+                                         ("3.8", "1.20"),
+                                         ("3.9", "1.20"),
+                                         ("3.10", "1.21")] %}
+  {% set pyver = python_version | replace(".", "") %}
 
-  conda-win-vs2019-py39:
+  conda-win-vs2019-py{{ pyver }}-cpu:
     ci: azure
     template: conda-recipes/azure.win.yml
     params:
-      config: win_64_cuda_compiler_versionNonenumpy1.19python3.9.____cpython
+      config: win_64_cuda_compiler_versionNonenumpy{{ numpy_version }}python{{ python_version }}.____cpython
     artifacts:
-      - arrow-cpp-{no_rc_version}-py39(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py39(h[a-z0-9]+)_0_cpu.tar.bz2
+      - arrow-cpp-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
+      - pyarrow-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
 
-  conda-win-vs2019-py310:
+  conda-win-vs2019-py{{ pyver }}-cuda:
     ci: azure
     template: conda-recipes/azure.win.yml
     params:
-      config: win_64_cuda_compiler_versionNonenumpy1.21python3.10.____cpython
+      config: win_64_cuda_compiler_version10.2numpy{{ numpy_version }}python{{ python_version }}.____cpython
     artifacts:
-      - arrow-cpp-{no_rc_version}-py310(h[a-z0-9]+)_0_cpu.tar.bz2
-      - pyarrow-{no_rc_version}-py310(h[a-z0-9]+)_0_cpu.tar.bz2
+      - arrow-cpp-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
+      - pyarrow-{no_rc_version}-py{{ pyver }}(h[a-z0-9]+)_0_cpu.tar.bz2
 
-# TODO: Windows CUDA
+{% endfor %}
 
 {% for python_version, python_tag, abi_tag in [("3.7", "cp37", "cp37m"),
                                                ("3.8", "cp38", "cp38"),


[arrow] 21/27: ARROW-18299: [CI][GLib][macOS] Fix dependency install failures (#14618)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 09467cfe1b1c55dcbf0b7ad06c770335ffd16c51
Author: Sutou Kouhei <ko...@clear-code.com>
AuthorDate: Fri Nov 11 05:39:17 2022 +0900

    ARROW-18299: [CI][GLib][macOS] Fix dependency install failures (#14618)
    
    Authored-by: Sutou Kouhei <ko...@clear-code.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 .github/workflows/cpp.yml  | 4 +---
 .github/workflows/ruby.yml | 7 +++++--
 c_glib/Brewfile            | 1 -
 cpp/Brewfile               | 1 -
 4 files changed, 6 insertions(+), 7 deletions(-)

diff --git a/.github/workflows/cpp.yml b/.github/workflows/cpp.yml
index 2278f075a8..6d30df4091 100644
--- a/.github/workflows/cpp.yml
+++ b/.github/workflows/cpp.yml
@@ -124,7 +124,7 @@ jobs:
           docker-compose run --rm minimal
 
   macos:
-    name: AMD64 macOS 11 C++
+    name: AMD64 macOS 12 C++
     runs-on: macos-latest
     if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
     timeout-minutes: 75
@@ -158,9 +158,7 @@ jobs:
           submodules: recursive
       - name: Install Dependencies
         run: |
-          rm -f /usr/local/bin/2to3
           brew update --preinstall
-          brew install --overwrite git
           brew bundle --file=cpp/Brewfile
       - name: Install MinIO
         run: |
diff --git a/.github/workflows/ruby.yml b/.github/workflows/ruby.yml
index 2204570034..a0021cdcf9 100644
--- a/.github/workflows/ruby.yml
+++ b/.github/workflows/ruby.yml
@@ -109,7 +109,7 @@ jobs:
         run: archery docker push ubuntu-ruby
 
   macos:
-    name: AMD64 macOS 11 GLib & Ruby
+    name: AMD64 macOS 12 GLib & Ruby
     runs-on: macos-latest
     if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
     timeout-minutes: 60
@@ -142,7 +142,10 @@ jobs:
       - name: Install Homebrew Dependencies
         shell: bash
         run: |
-          rm -f /usr/local/bin/2to3
+          rm -f /usr/local/bin/2to3*
+          rm -f /usr/local/bin/idle*
+          rm -f /usr/local/bin/pydoc3*
+          rm -f /usr/local/bin/python3*
           brew update --preinstall
           brew install --overwrite git
           brew bundle --file=cpp/Brewfile
diff --git a/c_glib/Brewfile b/c_glib/Brewfile
index bfda23b374..5ab5020369 100644
--- a/c_glib/Brewfile
+++ b/c_glib/Brewfile
@@ -15,7 +15,6 @@
 # specific language governing permissions and limitations
 # under the License.
 
-brew "autoconf-archive"
 brew "gobject-introspection"
 brew "gtk-doc"
 brew "libtool"
diff --git a/cpp/Brewfile b/cpp/Brewfile
index 35941a9289..26abb2efb2 100644
--- a/cpp/Brewfile
+++ b/cpp/Brewfile
@@ -15,7 +15,6 @@
 # specific language governing permissions and limitations
 # under the License.
 
-brew "automake"
 brew "aws-sdk-cpp"
 brew "bash"
 brew "boost"


[arrow] 04/27: MINOR: [R] [Docs] Fix link in ?cast documentation (#14455)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 8d76062d451c6bc751c8a71cf6f34689ccb2240f
Author: Bryce Mecum <pe...@gmail.com>
AuthorDate: Thu Oct 20 06:02:24 2022 -0800

    MINOR: [R] [Docs] Fix link in ?cast documentation (#14455)
    
    Lead-authored-by: Bryce Mecum <pe...@gmail.com>
    Co-authored-by: Nic Crane <th...@gmail.com>
    Signed-off-by: Nic Crane <th...@gmail.com>
---
 r/R/dplyr-funcs-type.R |  9 +++++----
 r/R/type.R             |  5 ++++-
 r/man/DataType.Rd      | 10 ++++++++--
 r/man/cast.Rd          | 10 ++++++----
 4 files changed, 23 insertions(+), 11 deletions(-)

diff --git a/r/R/dplyr-funcs-type.R b/r/R/dplyr-funcs-type.R
index 296133daee..46720007dd 100644
--- a/r/R/dplyr-funcs-type.R
+++ b/r/R/dplyr-funcs-type.R
@@ -32,9 +32,9 @@ register_bindings_type <- function() {
 #' @param to [DataType] to cast to; for [Table] and [RecordBatch],
 #' it should be a [Schema].
 #' @param safe logical: only allow the type conversion if no data is lost
-#' (truncation, overflow, etc.). Default is `TRUE`
+#' (truncation, overflow, etc.). Default is `TRUE`.
 #' @param ... specific `CastOptions` to set
-#' @return an `Expression`
+#' @return An [Expression]
 #'
 #' @examples
 #' \dontrun{
@@ -43,8 +43,9 @@ register_bindings_type <- function() {
 #'   mutate(cyl = cast(cyl, string()))
 #' }
 #' @keywords internal
-#' @seealso https://arrow.apache.org/docs/cpp/api/compute.html for the list of
-#' supported CastOptions.
+#' @seealso [`data-type`] for a list of [DataType] to be used with `to`.
+#' @seealso [Arrow C++ CastOptions documentation](https://arrow.apache.org/docs/cpp/api/compute.html?highlight=castoptions#arrow%3A%3Acompute%3A%3ACastOptions) # nolint
+#' for the list of supported CastOptions.
 cast <- function(x, to, safe = TRUE, ...) {
   x$cast(to, safe = safe, ...)
 }
diff --git a/r/R/type.R b/r/R/type.R
index 5089789f6c..cda606e3fa 100644
--- a/r/R/type.R
+++ b/r/R/type.R
@@ -24,10 +24,13 @@
 #'
 #' @section Methods:
 #'
-#' TODO
+#' - `$ToString()`: String representation of the DataType
+#' - `$Equals(other)`: Is the DataType equal to `other`
+#' - `$fields()`: The children fields associated with this type
 #'
 #' @rdname DataType
 #' @name DataType
+#' @seealso [`data-type`]
 DataType <- R6Class("DataType",
   inherit = ArrowObject,
   public = list(
diff --git a/r/man/DataType.Rd b/r/man/DataType.Rd
index 8c96141bed..7c0bb4ec97 100644
--- a/r/man/DataType.Rd
+++ b/r/man/DataType.Rd
@@ -9,7 +9,13 @@ class arrow::DataType
 }
 \section{Methods}{
 
-
-TODO
+\itemize{
+\item \verb{$ToString()}: String representation of the DataType
+\item \verb{$Equals(other)}: Is the DataType equal to \code{other}
+\item \verb{$fields()}: The children fields associated with this type
+}
 }
 
+\seealso{
+\code{\link{data-type}}
+}
diff --git a/r/man/cast.Rd b/r/man/cast.Rd
index 88134f2e02..6d87958376 100644
--- a/r/man/cast.Rd
+++ b/r/man/cast.Rd
@@ -13,12 +13,12 @@ cast(x, to, safe = TRUE, ...)
 it should be a \link{Schema}.}
 
 \item{safe}{logical: only allow the type conversion if no data is lost
-(truncation, overflow, etc.). Default is \code{TRUE}}
+(truncation, overflow, etc.). Default is \code{TRUE}.}
 
 \item{...}{specific \code{CastOptions} to set}
 }
 \value{
-an \code{Expression}
+An \link{Expression}
 }
 \description{
 This is a wrapper around the \verb{$cast()} method that many Arrow objects have.
@@ -32,7 +32,9 @@ mtcars \%>\%
 }
 }
 \seealso{
-https://arrow.apache.org/docs/cpp/api/compute.html for the list of
-supported CastOptions.
+\code{\link{data-type}} for a list of \link{DataType} to be used with \code{to}.
+
+\href{https://arrow.apache.org/docs/cpp/api/compute.html?highlight=castoptions#arrow\%3A\%3Acompute\%3A\%3ACastOptions}{Arrow C++ CastOptions documentation}
+for the list of supported CastOptions.
 }
 \keyword{internal}


[arrow] 27/27: ARROW-18315: [CI][deb][RPM] Pin createrepo_c on Travis CI arm64-graviton (#14625)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 8f47806ce5054a9ba2d1da2f5e1b7def7c6bb1bb
Author: Sutou Kouhei <ko...@clear-code.com>
AuthorDate: Tue Nov 15 12:08:45 2022 +0900

    ARROW-18315: [CI][deb][RPM] Pin createrepo_c on Travis CI arm64-graviton (#14625)
    
    Because createrepo_c master can't be built on Travis CI arm64-graviton.
    
    Authored-by: Sutou Kouhei <ko...@clear-code.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 dev/tasks/linux-packages/travis.linux.arm64.yml | 2 +-
 dev/tasks/macros.jinja                          | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/dev/tasks/linux-packages/travis.linux.arm64.yml b/dev/tasks/linux-packages/travis.linux.arm64.yml
index f3ec4f1de2..d95ab48241 100644
--- a/dev/tasks/linux-packages/travis.linux.arm64.yml
+++ b/dev/tasks/linux-packages/travis.linux.arm64.yml
@@ -79,7 +79,7 @@ before_script:
 
   # Build createrepo_c from source.
   # We can remove them when we can install createrepo_c package
-  - git clone --depth 1 https://github.com/rpm-software-management/createrepo_c.git
+  - git clone --depth 1 --branch 0.20.1 https://github.com/rpm-software-management/createrepo_c.git
   - pushd createrepo_c
   - |
       /usr/bin/cmake \
diff --git a/dev/tasks/macros.jinja b/dev/tasks/macros.jinja
index bd3358e073..a880c17587 100644
--- a/dev/tasks/macros.jinja
+++ b/dev/tasks/macros.jinja
@@ -187,7 +187,7 @@ on:
 {% endmacro %}
 
 {%- macro travis_upload_releases(pattern) -%}
-  - sudo -H pip3 install pygit2==1.0 cryptography==36
+  - sudo -H pip3 install pygit2==1.8.0 cryptography==36
   - sudo -H pip3 install -e arrow/dev/archery[crossbow]
   - |
     archery crossbow \


[arrow] 02/27: ARROW-18092: [CI][Conan] Update versions of gRPC related dependencies (#14453)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 5f66d0e484511ae359f2e4dfbdaf0e007e2f52ff
Author: Sutou Kouhei <ko...@clear-code.com>
AuthorDate: Wed Oct 19 10:02:25 2022 +0900

    ARROW-18092: [CI][Conan] Update versions of gRPC related dependencies (#14453)
    
    Error message:
    
    https://github.com/ursacomputing/crossbow/actions/runs/3271941831/jobs/5382341820#step:5:566
    
        WARN: Remotes registry file missing, creating default one in /root/.conan/remotes.json
        WARN: grpc/1.48.0: requirement re2/20220601 overridden by arrow/10.0.0 to re2/20220201
        WARN: grpc/1.48.0: requirement protobuf/3.21.4 overridden by arrow/10.0.0 to protobuf/3.21.1
        WARN: googleapis/cci.20220711: requirement protobuf/3.21.4 overridden by grpc/1.48.0 to protobuf/3.21.1
        WARN: grpc-proto/cci.20220627: requirement protobuf/3.21.4 overridden by grpc/1.48.0 to protobuf/3.21.1
        ERROR: Missing binary: grpc/1.48.0:ddc600b3316e16c4e38f2c1ca1214d7241b4dd80
        grpc/1.48.0: WARN: Can't find a 'grpc/1.48.0' package for the
        specified settings, options and dependencies:
        - Settings: arch=x86_64,
                    build_type=Release,
                    compiler=gcc,
                    compiler.libcxx=libstdc++,
                    compiler.version=10,
                    os=Linux
        - Options: codegen=True,
                   cpp_plugin=True,
                   csharp_ext=False,
                   csharp_plugin=True,
                   fPIC=True,
                   node_plugin=True,
                   objective_c_plugin=True,
                   php_plugin=True,
                   python_plugin=True,
                   ruby_plugin=True,
                   secure=False,
                   shared=False,
                   abseil:fPIC=True,
                   abseil:shared=False,
                   c-ares:fPIC=True,
                   c-ares:shared=False,
                   c-ares:tools=True,
                   googleapis:fPIC=True,
                   googleapis:shared=False,
                   grpc-proto:fPIC=True,
                   grpc-proto:shared=False,
                   openssl:386=False,
                   openssl:enable_weak_ssl_ciphers=False,
                   openssl:fPIC=True,
                   openssl:no_aria=False,
                   openssl:no_asm=False,
                   openssl:no_async=False,
                   openssl:no_bf=False,
                   openssl:no_blake2=False,
                   openssl:no_camellia=False,
                   openssl:no_cast=False,
                   openssl:no_chacha=False,
                   openssl:no_cms=False,
                   openssl:no_comp=False,
                   openssl:no_ct=False,
                   openssl:no_deprecated=False,
                   openssl:no_des=False,
                   openssl:no_dgram=False,
                   openssl:no_dh=False,
                   openssl:no_dsa=False,
                   openssl:no_dso=False,
                   openssl:no_ec=False,
                   openssl:no_ecdh=False,
                   openssl:no_ecdsa=False,
                   openssl:no_engine=False,
                   openssl:no_filenames=False,
                   openssl:no_gost=False,
                   openssl:no_hmac=False,
                   openssl:no_idea=False,
                   openssl:no_md4=False,
                   openssl:no_md5=False,
                   openssl:no_mdc2=False,
                   openssl:no_ocsp=False,
                   openssl:no_pinshared=False,
                   openssl:no_rc2=False,
                   openssl:no_rfc3779=False,
                   openssl:no_rmd160=False,
                   openssl:no_rsa=False,
                   openssl:no_seed=False,
                   openssl:no_sha=False,
                   openssl:no_sm2=False,
                   openssl:no_sm3=False,
                   openssl:no_sm4=False,
                   openssl:no_sock=False,
                   openssl:no_srp=False,
                   openssl:no_srtp=False,
                   openssl:no_sse2=False,
                   openssl:no_ssl=False,
                   openssl:no_ssl3=False,
                   openssl:no_stdio=False,
                   openssl:no_tests=False,
                   openssl:no_threads=False,
                   openssl:no_tls1=False,
                   openssl:no_ts=False,
                   openssl:no_whirlpool=False,
                   openssl:openssldir=None,
                   openssl:shared=False,
                   protobuf:debug_suffix=True,
                   protobuf:fPIC=True,
                   protobuf:lite=False,
                   protobuf:shared=False,
                   protobuf:with_rtti=True,
                   protobuf:with_zlib=True,
                   re2:fPIC=True,
                   re2:shared=False,
                   zlib:fPIC=True,
                   zlib:shared=False
        - Dependencies: abseil/20220623.0,
                        c-ares/1.18.1,
                        openssl/1.1.1q,
                        re2/20220201,
                        zlib/1.2.12,
                        protobuf/3.21.1,
                        googleapis/cci.20220711,
                        grpc-proto/cci.20220627
        - Requirements: abseil/20220623.Y.Z,
                        c-ares/1.Y.Z,
                        googleapis/cci.20220711,
                        grpc-proto/cci.20220627,
                        openssl/1.Y.Z,
                        protobuf/3.21.1:37dd8aae630726607d9d4108fefd2f59c8f7e9db,
                        re2/20220201.Y.Z,
                        zlib/1.Y.Z
        - Package ID: ddc600b3316e16c4e38f2c1ca1214d7241b4dd80
    
        ERROR: Missing prebuilt package for 'grpc/1.48.0'
    
    Authored-by: Sutou Kouhei <ko...@clear-code.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 ci/conan/all/conanfile.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/ci/conan/all/conanfile.py b/ci/conan/all/conanfile.py
index 26abcc028b..726f83239b 100644
--- a/ci/conan/all/conanfile.py
+++ b/ci/conan/all/conanfile.py
@@ -302,7 +302,7 @@ class ArrowConan(ConanFile):
         if self._with_thrift():
             self.requires("thrift/0.16.0")
         if self._with_protobuf():
-            self.requires("protobuf/3.21.1")
+            self.requires("protobuf/3.21.4")
         if self._with_jemalloc():
             self.requires("jemalloc/5.2.1")
         if self._with_boost():
@@ -346,7 +346,7 @@ class ArrowConan(ConanFile):
         if self.options.with_zstd:
             self.requires("zstd/1.5.2")
         if self._with_re2():
-            self.requires("re2/20220201")
+            self.requires("re2/20220601")
         if self._with_utf8proc():
             self.requires("utf8proc/2.7.0")
         if self.options.with_backtrace:


[arrow] 07/27: ARROW-18131: [R] Correctly handle .data pronoun in group_by() (#14484)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 9608485fa912c4cd9ea0cd0b22c2eeb886fc99b0
Author: Nic Crane <th...@gmail.com>
AuthorDate: Mon Oct 24 14:03:53 2022 +0100

    ARROW-18131: [R] Correctly handle .data pronoun in group_by() (#14484)
    
    Authored-by: Nic Crane <th...@gmail.com>
    Signed-off-by: Nic Crane <th...@gmail.com>
---
 r/R/dplyr-group-by.R                   | 17 +++++++++--------
 r/tests/testthat/test-dplyr-group-by.R |  9 +++++++++
 2 files changed, 18 insertions(+), 8 deletions(-)

diff --git a/r/R/dplyr-group-by.R b/r/R/dplyr-group-by.R
index 85825b9bf2..a7b1ab9dbc 100644
--- a/r/R/dplyr-group-by.R
+++ b/r/R/dplyr-group-by.R
@@ -37,19 +37,20 @@ group_by.arrow_dplyr_query <- function(.data,
   expression_list <- expand_across(.data, quos(...))
   new_groups <- ensure_named_exprs(expression_list)
 
+  # set up group names and check which are new
+  gbp <- dplyr::group_by_prepare(.data, !!!expression_list, .add = .add)
+  existing_groups <- dplyr::group_vars(gbp$data)
+  new_group_names <- setdiff(gbp$group_names, existing_groups)
+
+  names(new_groups) <- new_group_names
+
   if (length(new_groups)) {
     # Add them to the data
     .data <- dplyr::mutate(.data, !!!new_groups)
   }
 
-  if (.add) {
-    gv <- union(dplyr::group_vars(.data), names(new_groups))
-  } else {
-    gv <- names(new_groups)
-  }
-
-  .data$group_by_vars <- gv %||% character()
-  .data$drop_empty_groups <- ifelse(length(gv), .drop, dplyr::group_by_drop_default(.data))
+  .data$group_by_vars <- gbp$group_names
+  .data$drop_empty_groups <- ifelse(length(gbp$group_names), .drop, dplyr::group_by_drop_default(.data))
   .data
 }
 group_by.Dataset <- group_by.ArrowTabular <- group_by.RecordBatchReader <- group_by.arrow_dplyr_query
diff --git a/r/tests/testthat/test-dplyr-group-by.R b/r/tests/testthat/test-dplyr-group-by.R
index e4e4d41d49..9f2869dd10 100644
--- a/r/tests/testthat/test-dplyr-group-by.R
+++ b/r/tests/testthat/test-dplyr-group-by.R
@@ -305,3 +305,12 @@ test_that("Can use across() within group_by()", {
     tbl
   )
 })
+
+test_that("ARROW-18131 - correctly handles .data pronoun in group_by()", {
+  compare_dplyr_binding(
+    .input %>%
+      group_by(.data$lgl) %>%
+      collect(),
+    tbl
+  )
+})


[arrow] 17/27: ARROW-18162: [C++] Add Arm SVE compiler options (#14515)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit f63804f976bdfc0a9e12fbe9155f71a37103fec3
Author: Yibo Cai <yi...@arm.com>
AuthorDate: Tue Nov 1 16:34:08 2022 +0800

    ARROW-18162: [C++] Add Arm SVE compiler options (#14515)
    
    As xsimd only supports fixed-size SVE, the vector size has to be specified
    explicitly on the command line, and the resulting binary can only run on
    hardware with a matching vector size; otherwise the behaviour is undefined.
    E.g., `cmake -DARROW_SIMD_LEVEL=SVE256 ..` defines the corresponding
    `ARROW_HAVE_SVE256` macro and CMake variable.
    
    We can also leverage compiler auto-vectorization to generate size-agnostic
    SVE code without specifying the vector size.
    E.g., `cmake -DARROW_SIMD_LEVEL=SVE ..`
    
    This PR also removes some unused Arm64 arch options.
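    
    As a rough illustration only (the helper below is hypothetical and not part
    of this commit; only the `ARROW_HAVE_SVE256`/`ARROW_HAVE_NEON` macro names
    come from the change), a C++ sketch of how the compile-time definitions
    added by these options can gate code:
    
        // Hypothetical helper: report which SIMD level this translation unit
        // was compiled for, based on the ARROW_HAVE_* macros described above.
        #include <cstddef>
        #include <iostream>
    
        constexpr std::size_t CompiledSimdBits() {
        #if defined(ARROW_HAVE_SVE256)
          return 256;  // e.g. built with -march=armv8-a+sve -msve-vector-bits=256
        #elif defined(ARROW_HAVE_NEON)
          return 128;  // NEON registers are fixed at 128 bits
        #else
          return 0;    // scalar fallback
        #endif
        }
    
        int main() {
          std::cout << "compiled SIMD width: " << CompiledSimdBits() << " bits\n";
          return 0;
        }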
    
    Authored-by: Yibo Cai <yi...@arm.com>
    Signed-off-by: Yibo Cai <yi...@arm.com>
---
 cpp/cmake_modules/DefineOptions.cmake              | 15 +++------
 cpp/cmake_modules/SetupCxxFlags.cmake              | 39 +++++++++++-----------
 cpp/src/arrow/io/memory_benchmark.cc               |  4 +--
 cpp/src/arrow/util/simd.h                          |  4 ---
 dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh |  5 ---
 python/CMakeLists.txt                              |  5 ---
 6 files changed, 26 insertions(+), 46 deletions(-)

diff --git a/cpp/cmake_modules/DefineOptions.cmake b/cpp/cmake_modules/DefineOptions.cmake
index 0a0f24b47f..040a6f5829 100644
--- a/cpp/cmake_modules/DefineOptions.cmake
+++ b/cpp/cmake_modules/DefineOptions.cmake
@@ -185,6 +185,10 @@ takes precedence over ccache if a storage backend is configured" ON)
                        "AVX2"
                        "AVX512"
                        "NEON"
+                       "SVE" # size agnostic SVE
+                       "SVE128" # fixed size SVE
+                       "SVE256" # "
+                       "SVE512" # "
                        "DEFAULT")
 
   define_option_string(ARROW_RUNTIME_SIMD_LEVEL
@@ -196,17 +200,6 @@ takes precedence over ccache if a storage backend is configured" ON)
                        "AVX512"
                        "MAX")
 
-  # Arm64 architectures and extensions can lead to exploding combinations.
-  # So set it directly through cmake command line.
-  #
-  # If you change this, you need to change the definition in
-  # python/CMakeLists.txt too.
-  define_option_string(ARROW_ARMV8_ARCH
-                       "Arm64 arch and extensions"
-                       "armv8-a" # Default
-                       "armv8-a"
-                       "armv8-a+crc+crypto")
-
   define_option(ARROW_ALTIVEC "Build with Altivec if compiler has support" ON)
 
   define_option(ARROW_RPATH_ORIGIN "Build Arrow libraries with RATH set to \$ORIGIN" OFF)
diff --git a/cpp/cmake_modules/SetupCxxFlags.cmake b/cpp/cmake_modules/SetupCxxFlags.cmake
index 0bd5c3090e..88d2f0fdae 100644
--- a/cpp/cmake_modules/SetupCxxFlags.cmake
+++ b/cpp/cmake_modules/SetupCxxFlags.cmake
@@ -110,8 +110,8 @@ elseif(ARROW_CPU_FLAG STREQUAL "ppc")
   endif()
 elseif(ARROW_CPU_FLAG STREQUAL "armv8")
   # Arm64 compiler flags, gcc/clang only
-  set(ARROW_ARMV8_ARCH_FLAG "-march=${ARROW_ARMV8_ARCH}")
-  check_cxx_compiler_flag(${ARROW_ARMV8_ARCH_FLAG} CXX_SUPPORTS_ARMV8_ARCH)
+  set(ARROW_ARMV8_MARCH "armv8-a")
+  check_cxx_compiler_flag("-march=${ARROW_ARMV8_MARCH}+sve" CXX_SUPPORTS_SVE)
   if(ARROW_SIMD_LEVEL STREQUAL "DEFAULT")
     set(ARROW_SIMD_LEVEL "NEON")
   endif()
@@ -485,26 +485,27 @@ if(ARROW_CPU_FLAG STREQUAL "ppc")
 endif()
 
 if(ARROW_CPU_FLAG STREQUAL "armv8")
-  if(ARROW_SIMD_LEVEL STREQUAL "NEON")
+  if(ARROW_SIMD_LEVEL MATCHES "NEON|SVE[0-9]*")
     set(ARROW_HAVE_NEON ON)
-
-    if(NOT CXX_SUPPORTS_ARMV8_ARCH)
-      message(FATAL_ERROR "Unsupported arch flag: ${ARROW_ARMV8_ARCH_FLAG}.")
-    endif()
-    if(ARROW_ARMV8_ARCH_FLAG MATCHES "native")
-      message(FATAL_ERROR "native arch not allowed, please specify arch explicitly.")
-    endif()
-    set(CXX_COMMON_FLAGS "${CXX_COMMON_FLAGS} ${ARROW_ARMV8_ARCH_FLAG}")
-
     add_definitions(-DARROW_HAVE_NEON)
-
-    if(ARROW_ARMV8_ARCH_FLAG MATCHES "\\+crypto")
-      add_definitions(-DARROW_HAVE_ARMV8_CRYPTO)
-    endif()
-    # armv8.1+ implies crc support
-    if(ARROW_ARMV8_ARCH_FLAG MATCHES "armv8\\.[1-9]|\\+crc")
-      add_definitions(-DARROW_HAVE_ARMV8_CRC)
+    if(ARROW_SIMD_LEVEL MATCHES "SVE[0-9]*")
+      if(NOT CXX_SUPPORTS_SVE)
+        message(FATAL_ERROR "SVE required but compiler doesn't support it.")
+      endif()
+      # -march=armv8-a+sve
+      set(ARROW_ARMV8_MARCH "${ARROW_ARMV8_MARCH}+sve")
+      string(REGEX MATCH "[0-9]+" SVE_VECTOR_BITS ${ARROW_SIMD_LEVEL})
+      if(SVE_VECTOR_BITS)
+        set(ARROW_HAVE_SVE${SVE_VECTOR_BITS} ON)
+        add_definitions(-DARROW_HAVE_SVE${SVE_VECTOR_BITS})
+        # -msve-vector-bits=256
+        set(CXX_COMMON_FLAGS "${CXX_COMMON_FLAGS} -msve-vector-bits=${SVE_VECTOR_BITS}")
+      else()
+        set(ARROW_HAVE_SVE_SIZELESS ON)
+        add_definitions(-DARROW_HAVE_SVE_SIZELESS)
+      endif()
     endif()
+    set(CXX_COMMON_FLAGS "${CXX_COMMON_FLAGS} -march=${ARROW_ARMV8_MARCH}")
   elseif(NOT ARROW_SIMD_LEVEL STREQUAL "NONE")
     message(WARNING "ARROW_SIMD_LEVEL=${ARROW_SIMD_LEVEL} not supported by Arm.")
   endif()
diff --git a/cpp/src/arrow/io/memory_benchmark.cc b/cpp/src/arrow/io/memory_benchmark.cc
index 1b584d17e0..3084b5e79a 100644
--- a/cpp/src/arrow/io/memory_benchmark.cc
+++ b/cpp/src/arrow/io/memory_benchmark.cc
@@ -154,7 +154,7 @@ static void StreamReadWrite(void* src, void* dst, size_t size) {
 
 #endif  // ARROW_HAVE_SSE4_2
 
-#ifdef ARROW_HAVE_ARMV8_CRYPTO
+#ifdef ARROW_HAVE_NEON
 
 using VectorType = uint8x16_t;
 using VectorTypeDual = uint8x16x2_t;
@@ -237,7 +237,7 @@ static void StreamReadWrite(void* src, void* dst, size_t size) {
   }
 }
 
-#endif  // ARROW_HAVE_ARMV8_CRYPTO
+#endif  // ARROW_HAVE_NEON
 
 static void PlatformMemcpy(void* src, void* dst, size_t size) { memcpy(src, dst, size); }
 
diff --git a/cpp/src/arrow/util/simd.h b/cpp/src/arrow/util/simd.h
index 046c74cdcc..ee9105d5f4 100644
--- a/cpp/src/arrow/util/simd.h
+++ b/cpp/src/arrow/util/simd.h
@@ -41,8 +41,4 @@
 #include <arm_neon.h>
 #endif
 
-#ifdef ARROW_HAVE_ARMV8_CRC
-#include <arm_acle.h>
-#endif
-
 #endif
diff --git a/dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh b/dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh
index 60144d06b9..76fdbce0c8 100644
--- a/dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh
+++ b/dev/tasks/conda-recipes/arrow-cpp/build-pyarrow.sh
@@ -30,11 +30,6 @@ else
     export PYARROW_WITH_CUDA=0
 fi
 
-# Resolve: Make Error at cmake_modules/SetupCxxFlags.cmake:338 (message): Unsupported arch flag: -march=.
-if [[ "${target_platform}" == "linux-aarch64" ]]; then
-    export PYARROW_CMAKE_OPTIONS="-DARROW_ARMV8_ARCH=armv8-a ${PYARROW_CMAKE_OPTIONS}"
-fi
-
 if [[ "${target_platform}" == osx-* ]]; then
     # See https://conda-forge.org/docs/maintainer/knowledge_base.html#newer-c-features-with-old-sdk
     CXXFLAGS="${CXXFLAGS} -D_LIBCPP_DISABLE_AVAILABILITY"
diff --git a/python/CMakeLists.txt b/python/CMakeLists.txt
index 279ac076ac..bad5e926ab 100644
--- a/python/CMakeLists.txt
+++ b/python/CMakeLists.txt
@@ -126,11 +126,6 @@ if(NOT DEFINED ARROW_RUNTIME_SIMD_LEVEL)
       "MAX"
       CACHE STRING "Max runtime SIMD optimization level")
 endif()
-if(NOT DEFINED ARROW_ARMV8_ARCH)
-  set(ARROW_ARMV8_ARCH
-      "armv8-a"
-      CACHE STRING "Arm64 arch and extensions: armv8-a, armv8-a or armv8-a+crc+crypto")
-endif()
 include(SetupCxxFlags)
 
 # Add common flags


[arrow] 08/27: ARROW-18054: [Python][CI] Enable Cython tests on windows wheels (#13552)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 59809769f59d115847cc3fa5025dc9110d6cbb6b
Author: Raúl Cumplido <ra...@gmail.com>
AuthorDate: Tue Oct 25 09:39:25 2022 +0200

    ARROW-18054: [Python][CI] Enable Cython tests on windows wheels (#13552)
    
    Authored-by: Raúl Cumplido <ra...@gmail.com>
    Signed-off-by: Antoine Pitrou <an...@python.org>
---
 .../python-wheel-windows-test-vs2017.dockerfile    | 43 ++++++++++++++++++++++
 ci/scripts/python_wheel_windows_test.bat           |  2 +-
 docker-compose.yml                                 |  9 ++++-
 python/pyarrow/tests/test_cython.py                | 26 ++++++++++---
 4 files changed, 72 insertions(+), 8 deletions(-)

diff --git a/ci/docker/python-wheel-windows-test-vs2017.dockerfile b/ci/docker/python-wheel-windows-test-vs2017.dockerfile
new file mode 100644
index 0000000000..6013efcd46
--- /dev/null
+++ b/ci/docker/python-wheel-windows-test-vs2017.dockerfile
@@ -0,0 +1,43 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# NOTE: You must update PYTHON_WHEEL_WINDOWS_IMAGE_REVISION in .env
+# when you update this file.
+
+# based on mcr.microsoft.com/windows/servercore:ltsc2019
+# contains choco and vs2017 preinstalled
+FROM abrarov/msvc-2017:2.11.0
+
+# Add unix tools to path
+RUN setx path "%path%;C:\Program Files\Git\usr\bin"
+
+# Remove previous installations of python from the base image
+# NOTE: a more recent base image (tried with 2.12.1) comes with python 3.9.7
+# and the msi installers are failing to remove pip and tcl/tk "products" making
+# the subsequent choco python installation step failing for installing python
+# version 3.9.* due to existing python version
+RUN wmic product where "name like 'python%%'" call uninstall /nointeractive && \
+    rm -rf Python*
+
+# Define the full version number otherwise choco falls back to patch number 0 (3.7 => 3.7.0)
+ARG python=3.8
+RUN (if "%python%"=="3.7" setx PYTHON_VERSION "3.7.9" && setx PATH "%PATH%;C:\Python37;C:\Python37\Scripts") & \
+    (if "%python%"=="3.8" setx PYTHON_VERSION "3.8.10" && setx PATH "%PATH%;C:\Python38;C:\Python38\Scripts") & \
+    (if "%python%"=="3.9" setx PYTHON_VERSION "3.9.7" && setx PATH "%PATH%;C:\Python39;C:\Python39\Scripts") & \
+    (if "%python%"=="3.10" setx PYTHON_VERSION "3.10.2" && setx PATH "%PATH%;C:\Python310;C:\Python310\Scripts")
+RUN choco install -r -y --no-progress python --version=%PYTHON_VERSION%
+RUN python -m pip install -U pip setuptools
diff --git a/ci/scripts/python_wheel_windows_test.bat b/ci/scripts/python_wheel_windows_test.bat
index 2b7aad3abe..2abf8ca50f 100755
--- a/ci/scripts/python_wheel_windows_test.bat
+++ b/ci/scripts/python_wheel_windows_test.bat
@@ -17,7 +17,7 @@
 
 @echo on
 
-set PYARROW_TEST_CYTHON=OFF
+set PYARROW_TEST_CYTHON=ON
 set PYARROW_TEST_DATASET=ON
 set PYARROW_TEST_FLIGHT=ON
 set PYARROW_TEST_GANDIVA=OFF
diff --git a/docker-compose.yml b/docker-compose.yml
index 1c3813757f..57cd03d11d 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -981,10 +981,15 @@ services:
         target: "C:/arrow"
     command: arrow\\ci\\scripts\\python_wheel_windows_build.bat
 
-  # doesn't exit properly on fail
   python-wheel-windows-test:
-    image: python:${PYTHON}-windowsservercore-1809
+    image: ${REPO}:python-${PYTHON}-wheel-windows-test-vs2017-${PYTHON_WHEEL_WINDOWS_IMAGE_REVISION}
+    build:
+      args:
+        python: ${PYTHON}
+      context: .
+      dockerfile: ci/docker/python-wheel-windows-test-vs2017.dockerfile
     volumes:
+      - "${DOCKER_VOLUME_PREFIX}python-wheel-windows-clcache:C:/clcache"
       - type: bind
         source: .
         target: "C:/arrow"
diff --git a/python/pyarrow/tests/test_cython.py b/python/pyarrow/tests/test_cython.py
index 7152f3f1d4..42d8302afa 100644
--- a/python/pyarrow/tests/test_cython.py
+++ b/python/pyarrow/tests/test_cython.py
@@ -123,21 +123,37 @@ def test_cython_api(tmpdir):
         # pyarrow imported first.
         code = """if 1:
             import sys
+            import os
+
+            try:
+                # Add dll directory was added on python 3.8
+                # and is required in order to find extra DLLs
+                # only for win32
+                for dir in {library_dirs}:
+                    os.add_dll_directory(dir)
+            except AttributeError:
+                pass
 
             mod = __import__({mod_name!r})
             arr = mod.make_null_array(5)
             assert mod.get_array_length(arr) == 5
             assert arr.null_count == 5
-        """.format(mod_name='pyarrow_cython_example')
+        """.format(mod_name='pyarrow_cython_example',
+                   library_dirs=pa.get_library_dirs())
 
+        var = None
         if sys.platform == 'win32':
-            delim, var = ';', 'PATH'
+            if not hasattr(os, 'add_dll_directory'):
+                # Python 3.8 onwards don't check extension module DLLs on path
+                # we have to use os.add_dll_directory instead.
+                delim, var = ';', 'PATH'
         else:
             delim, var = ':', 'LD_LIBRARY_PATH'
 
-        subprocess_env[var] = delim.join(
-            pa.get_library_dirs() + [subprocess_env.get(var, '')]
-        )
+        if var:
+            subprocess_env[var] = delim.join(
+                pa.get_library_dirs() + [subprocess_env.get(var, '')]
+            )
         subprocess.check_call([sys.executable, '-c', code],
                               stdout=subprocess.PIPE,
                               env=subprocess_env)


[arrow] 22/27: ARROW-18310: [C++] Use atomic backpressure counter (#14622)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit e79870d46817fc6d52aad690570e2e457b8d9b92
Author: rtpsw <rt...@hotmail.com>
AuthorDate: Mon Nov 14 18:40:05 2022 +0200

    ARROW-18310: [C++] Use atomic backpressure counter (#14622)
    
    See https://issues.apache.org/jira/browse/ARROW-18310
    
    Authored-by: Yaron Gvili <rt...@hotmail.com>
    Signed-off-by: Weston Pace <we...@gmail.com>
---
 cpp/src/arrow/compute/exec/sink_node.cc   | 3 ++-
 cpp/src/arrow/compute/exec/source_node.cc | 3 ++-
 cpp/src/arrow/dataset/file_base.cc        | 3 ++-
 3 files changed, 6 insertions(+), 3 deletions(-)

diff --git a/cpp/src/arrow/compute/exec/sink_node.cc b/cpp/src/arrow/compute/exec/sink_node.cc
index 96a34bff43..1f518ef75d 100644
--- a/cpp/src/arrow/compute/exec/sink_node.cc
+++ b/cpp/src/arrow/compute/exec/sink_node.cc
@@ -16,6 +16,7 @@
 // specific language governing permissions and limitations
 // under the License.
 
+#include <atomic>
 #include <mutex>
 #include <optional>
 
@@ -386,7 +387,7 @@ class ConsumingSinkNode : public ExecNode, public BackpressureControl {
   AtomicCounter input_counter_;
   std::shared_ptr<SinkNodeConsumer> consumer_;
   std::vector<std::string> names_;
-  int32_t backpressure_counter_ = 0;
+  std::atomic<int32_t> backpressure_counter_ = 0;
 };
 static Result<ExecNode*> MakeTableConsumingSinkNode(
     compute::ExecPlan* plan, std::vector<compute::ExecNode*> inputs,
diff --git a/cpp/src/arrow/compute/exec/source_node.cc b/cpp/src/arrow/compute/exec/source_node.cc
index 1d51a5c1d2..e0534b1b39 100644
--- a/cpp/src/arrow/compute/exec/source_node.cc
+++ b/cpp/src/arrow/compute/exec/source_node.cc
@@ -15,6 +15,7 @@
 // specific language governing permissions and limitations
 // under the License.
 
+#include <atomic>
 #include <mutex>
 #include <optional>
 
@@ -216,7 +217,7 @@ struct SourceNode : ExecNode {
 
  private:
   std::mutex mutex_;
-  int32_t backpressure_counter_{0};
+  std::atomic<int32_t> backpressure_counter_{0};
   Future<> backpressure_future_ = Future<>::MakeFinished();
   bool stop_requested_{false};
   bool started_ = false;
diff --git a/cpp/src/arrow/dataset/file_base.cc b/cpp/src/arrow/dataset/file_base.cc
index bd19c99a52..10b9e82d5c 100644
--- a/cpp/src/arrow/dataset/file_base.cc
+++ b/cpp/src/arrow/dataset/file_base.cc
@@ -20,6 +20,7 @@
 #include <arrow/compute/exec/exec_plan.h>
 
 #include <algorithm>
+#include <atomic>
 #include <memory>
 #include <unordered_map>
 #include <variant>
@@ -582,7 +583,7 @@ class TeeNode : public compute::MapNode {
   // only returns an unfinished future when it needs backpressure.  Using a serial
   // scheduler here ensures we pause while we wait for backpressure to clear
   std::shared_ptr<util::AsyncTaskScheduler> serial_scheduler_;
-  int32_t backpressure_counter_ = 0;
+  std::atomic<int32_t> backpressure_counter_ = 0;
 };
 
 }  // namespace
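
The hunks above turn the plain `int32_t backpressure_counter_` members into
`std::atomic<int32_t>`, presumably because the counter can be updated from more
than one thread. A self-contained C++ sketch (illustrative only, not Arrow code)
of why an atomic is needed for such a counter:

    // Illustrative only: two threads bump a shared counter 100000 times each.
    // With a plain int32_t this is a data race and the final value is
    // unpredictable; with std::atomic<int32_t> it is always 200000.
    #include <atomic>
    #include <cstdint>
    #include <iostream>
    #include <thread>

    int main() {
      std::atomic<std::int32_t> backpressure_counter{0};
      auto bump = [&backpressure_counter] {
        for (int i = 0; i < 100000; ++i) {
          backpressure_counter.fetch_add(1, std::memory_order_relaxed);
        }
      };
      std::thread t1(bump);
      std::thread t2(bump);
      t1.join();
      t2.join();
      std::cout << backpressure_counter.load() << "\n";
      return 0;
    }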


[arrow] 01/27: ARROW-18078: [Docs][R] Fix broken link in R documentation (#14437)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 22ad639d31fdb24cb2a93211da39886169fdda92
Author: Benson Muite <bk...@users.noreply.github.com>
AuthorDate: Tue Oct 18 00:36:58 2022 +0300

    ARROW-18078: [Docs][R] Fix broken link in R documentation (#14437)
    
    Authored-by: Benson Muite <be...@emailplus.org>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 r/vignettes/developers/install_details.Rmd | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/r/vignettes/developers/install_details.Rmd b/r/vignettes/developers/install_details.Rmd
index 688c8632fa..6e32450d05 100644
--- a/r/vignettes/developers/install_details.Rmd
+++ b/r/vignettes/developers/install_details.Rmd
@@ -118,7 +118,7 @@ If you are authorized to install system packages and you're installing a CRAN re
 you may want to use the official Apache Arrow release packages corresponding to 
 the R package version via software distribution tools such as `apt` or `yum` 
 (though there are some drawbacks: see the 
-["Troubleshooting" section in the main installation docs]("../install.html)).
+["Troubleshooting" section in the main installation docs](../install.html#troubleshooting)).
 See the [Arrow project installation page](https://arrow.apache.org/install/)
 to find pre-compiled binary packages for some common Linux distributions,
 including Debian, Ubuntu, and CentOS.


[arrow] 24/27: ARROW-18302: [Python][Packaging] Update minimum vcpkg required version (#14634)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit fb95aee54339ba7dc8a7426a32625af5898e9631
Author: Raúl Cumplido <ra...@gmail.com>
AuthorDate: Mon Nov 14 21:21:25 2022 +0100

    ARROW-18302: [Python][Packaging] Update minimum vcpkg required version (#14634)
    
    Authored-by: Raúl Cumplido <ra...@gmail.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 .env                 |   3 +-
 ci/vcpkg/ports.patch | 113 ++++++++++++---------------------------------------
 2 files changed, 29 insertions(+), 87 deletions(-)

diff --git a/.env b/.env
index b18eb2c3f0..d93eab06ff 100644
--- a/.env
+++ b/.env
@@ -96,7 +96,8 @@ DEVTOOLSET_VERSION=
 # Please also update the crossbow configuration in order to keep the github
 # actions cache up to date for the macOS wheels:
 #   https://github.com/ursacomputing/crossbow/blob/master/.github/workflows/cache_vcpkg.yml
-VCPKG="38bb87c"
+# vcpkg minimum version "09adfdc8cdad76345b7cc7f3305899e1cbd66297" due to CVE-2022-3786
+VCPKG="2871ddd918cecb9cb642bcb9c56897f397283192"
 
 # This must be updated when we update
 # ci/docker/python-wheel-windows-vs2017.dockerfile.
diff --git a/ci/vcpkg/ports.patch b/ci/vcpkg/ports.patch
index b2eed47466..c873bfbb06 100644
--- a/ci/vcpkg/ports.patch
+++ b/ci/vcpkg/ports.patch
@@ -1,100 +1,28 @@
-diff --git a/ports/abseil/fix-universal2.patch b/ports/abseil/fix-universal2.patch
-new file mode 100644
-index 0000000000..c729e7ae48
---- /dev/null
-+++ b/ports/abseil/fix-universal2.patch
-@@ -0,0 +1,55 @@
-+diff --git a/absl/copts/AbseilConfigureCopts.cmake b/absl/copts/AbseilConfigureCopts.cmake
-+index 942ce90a4..15d6c895f 100644
-+--- a/absl/copts/AbseilConfigureCopts.cmake
-++++ b/absl/copts/AbseilConfigureCopts.cmake
-+@@ -12,7 +12,49 @@ else()
-+   set(ABSL_BUILD_DLL FALSE)
-+ endif()
-+
-+-if(CMAKE_SYSTEM_PROCESSOR MATCHES "x86_64|amd64|AMD64")
-++if(APPLE AND CMAKE_CXX_COMPILER_ID MATCHES [[Clang]])
-++  # Some CMake targets (not known at the moment of processing) could be set to
-++  # compile for multiple architectures as specified by the OSX_ARCHITECTURES
-++  # property, which is target-specific.  We should neither inspect nor rely on
-++  # any CMake property or variable to detect an architecture, in particular:
-++  #
-++  #   - CMAKE_OSX_ARCHITECTURES
-++  #     is just an initial value for OSX_ARCHITECTURES; set too early.
-++  #
-++  #   - OSX_ARCHITECTURES
-++  #     is a per-target property; targets could be defined later, and their
-++  #     properties could be modified any time later.
-++  #
-++  #   - CMAKE_SYSTEM_PROCESSOR
-++  #     does not reflect multiple architectures at all.
-++  #
-++  # When compiling for multiple architectures, a build system can invoke a
-++  # compiler either
-++  #
-++  #   - once: a single command line for multiple architectures (Ninja build)
-++  #   - twice: two command lines per each architecture (Xcode build system)
-++  #
-++  # If case of Xcode, it would be possible to set an Xcode-specific attributes
-++  # like XCODE_ATTRIBUTE_OTHER_CPLUSPLUSFLAGS[arch=arm64] or similar.
-++  #
-++  # In both cases, the viable strategy is to pass all arguments at once, allowing
-++  # the compiler to dispatch arch-specific arguments to a designated backend.
-++  set(ABSL_RANDOM_RANDEN_COPTS "")
-++  foreach(_arch IN ITEMS "x86_64" "arm64")
-++    string(TOUPPER "${_arch}" _arch_uppercase)
-++    string(REPLACE "X86_64" "X64" _arch_uppercase ${_arch_uppercase})
-++    foreach(_flag IN LISTS ABSL_RANDOM_HWAES_${_arch_uppercase}_FLAGS)
-++      list(APPEND ABSL_RANDOM_RANDEN_COPTS "-Xarch_${_arch}" "${_flag}")
-++    endforeach()
-++  endforeach()
-++  # If a compiler happens to deal with an argument for a currently unused
-++  # architecture, it will warn about an unused command line argument.
-++  option(ABSL_RANDOM_RANDEN_COPTS_WARNING OFF
-++         "Warn if one of ABSL_RANDOM_RANDEN_COPTS is unused")
-++  if(ABSL_RANDOM_RANDEN_COPTS AND NOT ABSL_RANDOM_RANDEN_COPTS_WARNING)
-++    list(APPEND ABSL_RANDOM_RANDEN_COPTS "-Wno-unused-command-line-argument")
-++  endif()
-++elseif(CMAKE_SYSTEM_PROCESSOR MATCHES "x86_64|amd64|AMD64")
-+   if (MSVC)
-+     set(ABSL_RANDOM_RANDEN_COPTS "${ABSL_RANDOM_HWAES_MSVC_X64_FLAGS}")
-+   else()
-diff --git a/ports/abseil/portfile.cmake b/ports/abseil/portfile.cmake
-index 1289eed36a..b010a69f13 100644
---- a/ports/abseil/portfile.cmake
-+++ b/ports/abseil/portfile.cmake
-@@ -15,6 +15,7 @@ vcpkg_from_github(
-         # detection can cause ABI issues depending on which compiler options
-         # are enabled for consuming user code
- 	    fix-cxx-standard.patch
-+        fix-universal2.patch
- )
- 
- vcpkg_check_features(OUT_FEATURE_OPTIONS FEATURE_OPTIONS
 diff --git a/ports/curl/portfile.cmake b/ports/curl/portfile.cmake
-index f81d0c491d..e5ea9cef57 100644
+index 5a14562..924b1b7 100644
 --- a/ports/curl/portfile.cmake
 +++ b/ports/curl/portfile.cmake
-@@ -88,6 +88,10 @@ vcpkg_cmake_configure(
-         -DCMAKE_DISABLE_FIND_PACKAGE_Perl=ON
-         -DENABLE_DEBUG=ON
+@@ -87,8 +87,11 @@ vcpkg_cmake_configure(
+         -DENABLE_MANUAL=OFF
          -DCURL_CA_FALLBACK=ON
+         -DCURL_USE_LIBPSL=OFF
 +        -DCURL_CA_PATH=none
 +        -DCURL_CA_BUNDLE=none
-+    OPTIONS_DEBUG
-+        ${EXTRA_ARGS_DEBUG}
-     OPTIONS_RELEASE
-         ${OPTIONS_RELEASE}
      OPTIONS_DEBUG
+         -DENABLE_DEBUG=ON
++        ${EXTRA_ARGS_DEBUG}
+ )
+ vcpkg_cmake_install()
+ vcpkg_copy_pdbs()
 diff --git a/ports/snappy/portfile.cmake b/ports/snappy/portfile.cmake
-index 45b8c706db..b409d8a7be 100644
+index df95a08..d740ce7 100644
 --- a/ports/snappy/portfile.cmake
 +++ b/ports/snappy/portfile.cmake
-@@ -4,6 +4,7 @@ vcpkg_from_github(
-     REF 1.1.9
-     SHA512 f1f8a90f5f7f23310423574b1d8c9acb84c66ea620f3999d1060395205e5760883476837aba02f0aa913af60819e34c625d8308c18a5d7a9c4e190f35968b024
+@@ -9,6 +9,7 @@ vcpkg_from_github(
      HEAD_REF master
-+    PATCHES "snappy-disable-bmi.patch"
+     PATCHES
+         fix_clang-cl_build.patch
++        "snappy-disable-bmi.patch"
  )
  
  vcpkg_cmake_configure(
@@ -123,3 +51,16 @@ index 0000000000..a57ce0c22f
 + }
 + 
 + static inline bool LeftShiftOverflows(uint8_t value, uint32_t shift) {
+diff --git a/scripts/cmake/vcpkg_find_acquire_program.cmake b/scripts/cmake/vcpkg_find_acquire_program.cmake
+index 4611af6..d11936f 100644
+--- a/scripts/cmake/vcpkg_find_acquire_program.cmake
++++ b/scripts/cmake/vcpkg_find_acquire_program.cmake
+@@ -239,7 +239,7 @@ function(vcpkg_find_acquire_program program)
+             set(paths_to_search "${DOWNLOADS}/tools/python/${tool_subdirectory}")
+             vcpkg_list(SET post_install_command "${CMAKE_COMMAND}" -E rm python310._pth)
+         else()
+-            set(program_name python3)
++            set(program_name python)
+             set(brew_package_name "python")
+             set(apt_package_name "python3")
+         endif()


[arrow] 15/27: ARROW-18080: [C++] Remove gcc <= 4.9 workarounds (#14441)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit b7b92fa88ea54123b4fd804835556c7763b3ea40
Author: Neal Richardson <ne...@gmail.com>
AuthorDate: Tue Oct 18 20:54:46 2022 -0400

    ARROW-18080: [C++] Remove gcc <= 4.9 workarounds (#14441)
    
    This removes most of the workarounds I could find. There's another one in `python/pyarrow/src/arrow/python/datetime.cc`, but removing it is more involved.
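    
    As a hypothetical C++ sketch (not taken from the Arrow tree) of the kind of
    guard this commit deletes, compare the `__GNUC__ < 5` block removed from
    file_parquet.cc below: a fallback branch kept only to sidestep old-gcc bugs,
    which collapses to the direct form on any supported compiler.
    
        // Hypothetical sketch of a gcc-version-gated workaround.
        #include <iostream>
    
        struct Result {
          bool ok;
          int value;
        };
    
        Result CountRows() { return {true, 42}; }
    
        int main() {
        #if defined(__GNUC__) && !defined(__clang__) && (__GNUC__ < 5)
          // Workaround path: unpack the result in explicit steps for old gcc.
          Result result = CountRows();
          if (!result.ok) {
            return 1;
          }
          int rows = result.value;
        #else
          // Direct form used on the compilers Arrow still supports.
          int rows = CountRows().value;
        #endif
          std::cout << "rows: " << rows << "\n";
          return 0;
        }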
    
    Authored-by: Neal Richardson <ne...@gmail.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 cpp/cmake_modules/SetupCxxFlags.cmake       | 39 +++++++++--------------------
 cpp/cmake_modules/ThirdpartyToolchain.cmake | 19 +++-----------
 cpp/src/arrow/dataset/file_parquet.cc       |  9 -------
 cpp/thirdparty/versions.txt                 |  3 ---
 4 files changed, 15 insertions(+), 55 deletions(-)

diff --git a/cpp/cmake_modules/SetupCxxFlags.cmake b/cpp/cmake_modules/SetupCxxFlags.cmake
index 51ef979a02..0bd5c3090e 100644
--- a/cpp/cmake_modules/SetupCxxFlags.cmake
+++ b/cpp/cmake_modules/SetupCxxFlags.cmake
@@ -400,22 +400,13 @@ elseif(CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
     set(CXX_ONLY_FLAGS "${CXX_ONLY_FLAGS} -Wno-noexcept-type")
   endif()
 
-  if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER "5.2")
-    # Disabling semantic interposition allows faster calling conventions
-    # when calling global functions internally, and can also help inlining.
-    # See https://stackoverflow.com/questions/35745543/new-option-in-gcc-5-3-fno-semantic-interposition
-    set(CXX_COMMON_FLAGS "${CXX_COMMON_FLAGS} -fno-semantic-interposition")
-  endif()
-
-  if(CMAKE_CXX_COMPILER_VERSION VERSION_GREATER "4.9")
-    # Add colors when paired with ninja
-    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fdiagnostics-color=always")
-  endif()
+  # Disabling semantic interposition allows faster calling conventions
+  # when calling global functions internally, and can also help inlining.
+  # See https://stackoverflow.com/questions/35745543/new-option-in-gcc-5-3-fno-semantic-interposition
+  set(CXX_COMMON_FLAGS "${CXX_COMMON_FLAGS} -fno-semantic-interposition")
 
-  if(CMAKE_CXX_COMPILER_VERSION VERSION_LESS "6.0")
-    # Work around https://gcc.gnu.org/bugzilla/show_bug.cgi?id=43407
-    set(CXX_COMMON_FLAGS "${CXX_COMMON_FLAGS} -Wno-attributes")
-  endif()
+  # Add colors when paired with ninja
+  set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fdiagnostics-color=always")
 
   if(CMAKE_UNITY_BUILD)
     # Work around issue similar to https://bugs.webkit.org/show_bug.cgi?id=176869
@@ -507,18 +498,12 @@ if(ARROW_CPU_FLAG STREQUAL "armv8")
 
     add_definitions(-DARROW_HAVE_NEON)
 
-    if(CMAKE_CXX_COMPILER_ID STREQUAL "GNU" AND CMAKE_CXX_COMPILER_VERSION VERSION_LESS
-                                                "5.4")
-      message(WARNING "Disable Armv8 CRC and Crypto as compiler doesn't support them well."
-      )
-    else()
-      if(ARROW_ARMV8_ARCH_FLAG MATCHES "\\+crypto")
-        add_definitions(-DARROW_HAVE_ARMV8_CRYPTO)
-      endif()
-      # armv8.1+ implies crc support
-      if(ARROW_ARMV8_ARCH_FLAG MATCHES "armv8\\.[1-9]|\\+crc")
-        add_definitions(-DARROW_HAVE_ARMV8_CRC)
-      endif()
+    if(ARROW_ARMV8_ARCH_FLAG MATCHES "\\+crypto")
+      add_definitions(-DARROW_HAVE_ARMV8_CRYPTO)
+    endif()
+    # armv8.1+ implies crc support
+    if(ARROW_ARMV8_ARCH_FLAG MATCHES "armv8\\.[1-9]|\\+crc")
+      add_definitions(-DARROW_HAVE_ARMV8_CRC)
     endif()
   elseif(NOT ARROW_SIMD_LEVEL STREQUAL "NONE")
     message(WARNING "ARROW_SIMD_LEVEL=${ARROW_SIMD_LEVEL} not supported by Arm.")
diff --git a/cpp/cmake_modules/ThirdpartyToolchain.cmake b/cpp/cmake_modules/ThirdpartyToolchain.cmake
index b1c3201894..b44521e0c1 100644
--- a/cpp/cmake_modules/ThirdpartyToolchain.cmake
+++ b/cpp/cmake_modules/ThirdpartyToolchain.cmake
@@ -652,18 +652,9 @@ endif()
 if(DEFINED ENV{ARROW_SNAPPY_URL})
   set(SNAPPY_SOURCE_URL "$ENV{ARROW_SNAPPY_URL}")
 else()
-  if(CMAKE_CXX_COMPILER_ID STREQUAL "GNU" AND CMAKE_CXX_COMPILER_VERSION VERSION_LESS
-                                              "4.9")
-    # There is a bug in GCC < 4.9 with Snappy 1.1.9, so revert to 1.1.8 "SNAPPY_OLD" for those (ARROW-14661)
-    set_urls(SNAPPY_SOURCE_URL
-             "https://github.com/google/snappy/archive/${ARROW_SNAPPY_OLD_BUILD_VERSION}.tar.gz"
-             "${THIRDPARTY_MIRROR_URL}/snappy-${ARROW_SNAPPY_OLD_BUILD_VERSION}.tar.gz")
-    set(ARROW_SNAPPY_BUILD_SHA256_CHECKSUM ${ARROW_SNAPPY_OLD_BUILD_SHA256_CHECKSUM})
-  else()
-    set_urls(SNAPPY_SOURCE_URL
-             "https://github.com/google/snappy/archive/${ARROW_SNAPPY_BUILD_VERSION}.tar.gz"
-             "${THIRDPARTY_MIRROR_URL}/snappy-${ARROW_SNAPPY_BUILD_VERSION}.tar.gz")
-  endif()
+  set_urls(SNAPPY_SOURCE_URL
+           "https://github.com/google/snappy/archive/${ARROW_SNAPPY_BUILD_VERSION}.tar.gz"
+           "${THIRDPARTY_MIRROR_URL}/snappy-${ARROW_SNAPPY_BUILD_VERSION}.tar.gz")
 endif()
 
 if(DEFINED ENV{ARROW_SUBSTRAIT_URL})
@@ -4584,10 +4575,6 @@ endif()
 
 macro(build_awssdk)
   message(STATUS "Building AWS C++ SDK from source")
-  if(CMAKE_CXX_COMPILER_ID STREQUAL "GNU" AND CMAKE_CXX_COMPILER_VERSION VERSION_LESS
-                                              "4.9")
-    message(FATAL_ERROR "AWS C++ SDK requires gcc >= 4.9")
-  endif()
   set(AWSSDK_PREFIX "${CMAKE_CURRENT_BINARY_DIR}/awssdk_ep-install")
   set(AWSSDK_INCLUDE_DIR "${AWSSDK_PREFIX}/include")
   set(AWSSDK_LIB_DIR "lib")
diff --git a/cpp/src/arrow/dataset/file_parquet.cc b/cpp/src/arrow/dataset/file_parquet.cc
index f07254e111..da9748de96 100644
--- a/cpp/src/arrow/dataset/file_parquet.cc
+++ b/cpp/src/arrow/dataset/file_parquet.cc
@@ -742,16 +742,7 @@ Result<std::optional<int64_t>> ParquetFileFragment::TryCountRows(
     compute::Expression predicate) {
   DCHECK_NE(metadata_, nullptr);
   if (ExpressionHasFieldRefs(predicate)) {
-#if defined(__GNUC__) && (__GNUC__ < 5)
-    // ARROW-12694: with GCC 4.9 (RTools 35) we sometimes segfault here if we move(result)
-    auto result = TestRowGroups(std::move(predicate));
-    if (!result.ok()) {
-      return result.status();
-    }
-    auto expressions = result.ValueUnsafe();
-#else
     ARROW_ASSIGN_OR_RAISE(auto expressions, TestRowGroups(std::move(predicate)));
-#endif
     int64_t rows = 0;
     for (size_t i = 0; i < row_groups_->size(); i++) {
       // If the row group is entirely excluded, exclude it from the row count
diff --git a/cpp/thirdparty/versions.txt b/cpp/thirdparty/versions.txt
index 767cb2c860..0cc496e938 100644
--- a/cpp/thirdparty/versions.txt
+++ b/cpp/thirdparty/versions.txt
@@ -81,9 +81,6 @@ ARROW_RE2_BUILD_SHA256_CHECKSUM=f89c61410a072e5cbcf8c27e3a778da7d6fd2f2b5b1445cd
 # 1.1.9 is patched to implement https://github.com/google/snappy/pull/148 if this is bumped, remove the patch
 ARROW_SNAPPY_BUILD_VERSION=1.1.9
 ARROW_SNAPPY_BUILD_SHA256_CHECKSUM=75c1fbb3d618dd3a0483bff0e26d0a92b495bbe5059c8b4f1c962b478b6e06e7
-# There is a bug in GCC < 4.9 with Snappy 1.1.9, so revert to 1.1.8 for those (ARROW-14661)
-ARROW_SNAPPY_OLD_BUILD_VERSION=1.1.8
-ARROW_SNAPPY_OLD_BUILD_SHA256_CHECKSUM=16b677f07832a612b0836178db7f374e414f94657c138e6993cbfc5dcc58651f
 ARROW_SUBSTRAIT_BUILD_VERSION=v0.6.0
 ARROW_SUBSTRAIT_BUILD_SHA256_CHECKSUM=7b8583b9684477e9027f417bbfb4febb8acfeb01923dcaa7cf0fd3f921d69c88
 ARROW_THRIFT_BUILD_VERSION=0.16.0


[arrow] 03/27: ARROW-18093: [CI][Conda][Windows] Disable ORC (#14454)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit f1343f1b78a6461c0d39a9ce16718cd5d8c835ee
Author: Sutou Kouhei <ko...@clear-code.com>
AuthorDate: Fri Oct 21 14:35:32 2022 +0900

    ARROW-18093: [CI][Conda][Windows] Disable ORC (#14454)
    
    https://anaconda.org/conda-forge/orc doesn't provide binaries for Windows.
    
    Error message:
    
    https://dev.azure.com/ursacomputing/crossbow/_build/results?buildId=37759&view=logs&j=4c86bc1b-1091-5192-4404-c74dfaad23e7&t=41795ef0-6501-5db4-3ad4-33c0cf085626&l=497
    
        CMake Error at cmake_modules/FindORC.cmake:56 (message):
          ORC library was required in toolchain and unable to locate
        Call Stack (most recent call first):
          cmake_modules/ThirdpartyToolchain.cmake:280 (find_package)
          cmake_modules/ThirdpartyToolchain.cmake:4362 (resolve_dependency)
          CMakeLists.txt:496 (include)
    
    Authored-by: Sutou Kouhei <ko...@clear-code.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat b/dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat
index 02de305eaa..21e2ae714e 100644
--- a/dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat
+++ b/dev/tasks/conda-recipes/arrow-cpp/bld-arrow.bat
@@ -31,7 +31,7 @@ cmake -G "Ninja" ^
       -DARROW_HDFS:BOOL=ON ^
       -DARROW_JSON:BOOL=ON ^
       -DARROW_MIMALLOC:BOOL=ON ^
-      -DARROW_ORC:BOOL=ON ^
+      -DARROW_ORC:BOOL=OFF ^
       -DARROW_PACKAGE_PREFIX="%LIBRARY_PREFIX%" ^
       -DARROW_PARQUET:BOOL=ON ^
       -DARROW_S3:BOOL=ON ^


[arrow] 23/27: ARROW-18251: [CI][Python] Fix AMD64 macOS 12 Python 3 job (#14614)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 7e505a90dc0c65ac5c5a10819c98f17177e81e98
Author: Raúl Cumplido <ra...@gmail.com>
AuthorDate: Mon Nov 14 21:15:54 2022 +0100

    ARROW-18251: [CI][Python] Fix AMD64 macOS 12 Python 3 job (#14614)
    
    Authored-by: Raúl Cumplido <ra...@gmail.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 .github/workflows/python.yml | 15 +++++++++++----
 1 file changed, 11 insertions(+), 4 deletions(-)

diff --git a/.github/workflows/python.yml b/.github/workflows/python.yml
index 884c2fc736..3bc4a75b24 100644
--- a/.github/workflows/python.yml
+++ b/.github/workflows/python.yml
@@ -120,7 +120,7 @@ jobs:
         run: archery docker push ${{ matrix.image }}
 
   macos:
-    name: AMD64 macOS 11 Python 3
+    name: AMD64 macOS 12 Python 3
     runs-on: macos-latest
     if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
     timeout-minutes: 60
@@ -146,14 +146,21 @@ jobs:
       ARROW_WITH_SNAPPY: ON
       ARROW_WITH_BROTLI: ON
       ARROW_BUILD_TESTS: OFF
-      CMAKE_ARGS: "-DPython3_EXECUTABLE=/usr/local/bin/python3"
+      PYARROW_BUNDLE_ARROW_CPP: ON
       PYARROW_TEST_LARGE_MEMORY: ON
+      # Current oldest supported version according to https://endoflife.date/macos
+      MACOSX_DEPLOYMENT_TARGET: 10.15
+      PYTEST_ADDOPTS: "-k 'not test_cython_api'"
     steps:
       - name: Checkout Arrow
         uses: actions/checkout@v3
         with:
           fetch-depth: 0
           submodules: recursive
+      - name: Setup Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: '3.11'
       - name: Install Dependencies
         shell: bash
         run: |
@@ -162,13 +169,13 @@ jobs:
           brew install --overwrite git
           brew bundle --file=cpp/Brewfile
           brew install coreutils
-          python3 -mpip install \
+          python -m pip install \
             -r python/requirements-build.txt \
             -r python/requirements-test.txt
       - name: Build
         shell: bash
         run: |
-          export PYTHON=python3
+          python -m pip install wheel
           ci/scripts/cpp_build.sh $(pwd) $(pwd)/build
           ci/scripts/python_build.sh $(pwd) $(pwd)/build
       - name: Test


[arrow] 14/27: ARROW-18260: [C++][CMake] Add support for x64 for CMAKE_SYSTEM_PROCESSOR (#14598)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit 8e8f6bf5adbf50091a6cdb76e7ae95c9b3c701d3
Author: Sutou Kouhei <ko...@clear-code.com>
AuthorDate: Tue Nov 8 13:58:37 2022 +0900

    ARROW-18260: [C++][CMake] Add support for x64 for CMAKE_SYSTEM_PROCESSOR (#14598)
    
    vcpkg uses x64:
    
    https://vcpkg.readthedocs.io/en/latest/users/triplets/
    
    > Valid options are x86, x64, arm, arm64 and wasm32.
    
    Authored-by: Sutou Kouhei <ko...@clear-code.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 cpp/cmake_modules/SetupCxxFlags.cmake | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/cpp/cmake_modules/SetupCxxFlags.cmake b/cpp/cmake_modules/SetupCxxFlags.cmake
index cef4eb0b16..51ef979a02 100644
--- a/cpp/cmake_modules/SetupCxxFlags.cmake
+++ b/cpp/cmake_modules/SetupCxxFlags.cmake
@@ -24,7 +24,7 @@ include(CheckCXXSourceCompiles)
 message(STATUS "System processor: ${CMAKE_SYSTEM_PROCESSOR}")
 
 if(NOT DEFINED ARROW_CPU_FLAG)
-  if(CMAKE_SYSTEM_PROCESSOR MATCHES "AMD64|X86|x86|i[3456]86")
+  if(CMAKE_SYSTEM_PROCESSOR MATCHES "AMD64|X86|x86|i[3456]86|x64")
     set(ARROW_CPU_FLAG "x86")
   elseif(CMAKE_SYSTEM_PROCESSOR MATCHES "aarch64|ARM64|arm64")
     set(ARROW_CPU_FLAG "armv8")


[arrow] 05/27: MINOR: [Release] Fix loop in universal2 wheel verification (#14479)

Posted by ko...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch maint-10.0.x
in repository https://gitbox.apache.org/repos/asf/arrow.git

commit d05dd5482dcb25a011f3fe01622c6b1ff4033f19
Author: David Li <li...@gmail.com>
AuthorDate: Sat Oct 22 16:57:27 2022 -0400

    MINOR: [Release] Fix loop in universal2 wheel verification (#14479)
    
    Authored-by: David Li <li...@gmail.com>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 dev/release/verify-release-candidate.sh | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/dev/release/verify-release-candidate.sh b/dev/release/verify-release-candidate.sh
index a9681af528..f874500c8f 100755
--- a/dev/release/verify-release-candidate.sh
+++ b/dev/release/verify-release-candidate.sh
@@ -1067,11 +1067,12 @@ test_macos_wheels() {
   # the interpreter should be installed from python.org:
   #   https://www.python.org/ftp/python/3.9.6/python-3.9.6-macosx10.9.pkg
   if [ "$(uname -m)" = "arm64" ]; then
-    for pyver in "3.9 3.10"; do
+    for pyver in 3.9 3.10; do
       local python="/Library/Frameworks/Python.framework/Versions/${pyver}/bin/python${pyver}"
 
       # create and activate a virtualenv for testing as arm64
       for arch in "arm64" "x86_64"; do
+        show_header "Testing Python ${pyver} universal2 wheel on ${arch}"
         VENV_ENV=wheel-${pyver}-universal2-${arch} PYTHON=${python} maybe_setup_virtualenv || continue
         # install pyarrow's universal2 wheel
         pip install pyarrow-${VERSION}-cp${pyver/.}-cp${pyver/.}-macosx_11_0_universal2.whl