Posted to issues@arrow.apache.org by "Balaji Veeramani (Jira)" <ji...@apache.org> on 2022/11/09 03:40:00 UTC

[jira] [Created] (ARROW-18290) `pyarrow.fs.copy_files` doesn't work if filenames contain special characters

Balaji Veeramani created ARROW-18290:
----------------------------------------

             Summary: `pyarrow.fs.copy_files` doesn't work if filenames contain special characters
                 Key: ARROW-18290
                 URL: https://issues.apache.org/jira/browse/ARROW-18290
             Project: Apache Arrow
          Issue Type: Bug
          Components: Python
    Affects Versions: 6.0.1
            Reporter: Balaji Veeramani


I can't upload a file called `spam=ham` to a filesystem that emulates an S3 API. I can work around the issue by renaming the file to `spam-ham`.

To reproduce, run a filesystem that emulates an S3 API:

```
docker run -p 9444:9000 scireum/s3-ninja:latest
```

Authenticate with the filesystem:

```
export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
```

Then run this Python script:

```
import os
import tempfile

import pyarrow.fs

# Create a local file whose name contains a special character ("=").
source = tempfile.mkdtemp()
file_path = os.path.join(source, "spam=ham")
open(file_path, "w").close()

filesystem, path = pyarrow.fs.FileSystem.from_uri(
    "s3://bucket?scheme=http&endpoint_override=localhost:9444"
)
pyarrow.fs.copy_files(source, path, destination_filesystem=filesystem)
```
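
As a stopgap, the rename workaround mentioned above can be sketched as a small pre-copy step: stage the files under sanitized names, then hand the staging directory to `pyarrow.fs.copy_files`. The `sanitize_name` / `sanitized_copy` helpers and the `SPECIAL` character set below are hypothetical illustrations, not part of pyarrow:

```python
import os
import shutil
import tempfile

# Hypothetical set of characters to avoid in object names; "=" is the one
# observed to fail here, the others are a guess at similarly risky characters.
SPECIAL = "=#?&%"


def sanitize_name(name: str) -> str:
    """Replace risky characters with '-' (e.g. 'spam=ham' -> 'spam-ham')."""
    return "".join("-" if c in SPECIAL else c for c in name)


def sanitized_copy(source_dir: str) -> str:
    """Copy source_dir into a fresh temp dir with sanitized filenames.

    Returns the staging directory path, suitable for passing to
    pyarrow.fs.copy_files in place of the original source.
    """
    staging = tempfile.mkdtemp()
    for entry in os.listdir(source_dir):
        shutil.copy(
            os.path.join(source_dir, entry),
            os.path.join(staging, sanitize_name(entry)),
        )
    return staging
```

With this, `pyarrow.fs.copy_files(sanitized_copy(source), path, destination_filesystem=filesystem)` uploads the renamed files, at the cost of losing the original names.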



--
This message was sent by Atlassian Jira
(v8.20.10#820010)