Posted to github@beam.apache.org by "chavdaparas (via GitHub)" <gi...@apache.org> on 2023/02/08 14:59:13 UTC
[GitHub] [beam] chavdaparas commented on issue #18390: Google Cloud Storage TextIO read fails with gz-files having Content-Encoding: gzip header
chavdaparas commented on issue #18390:
URL: https://github.com/apache/beam/issues/18390#issuecomment-1422729486
- As a best practice, you can upload the object to GCS with the `Content-Type` set to indicate compression and NO `Content-Encoding` at all:
`Content-Type: application/gzip`
`Content-Encoding:` (not set)
In this case the only thing immediately known about the object is that it is gzip-compressed, with no information about the underlying object type. Moreover, the object is not eligible for decompressive transcoding.
Reference: https://cloud.google.com/storage/docs/transcoding
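As an aside, "gzip-compressed" is the one thing you can always verify locally before (or after) upload: a gzip stream starts with the magic bytes `0x1f 0x8b`. A minimal sketch using only the standard library (the helper name and paths are illustrative, not part of Beam or GCS):

```python
import gzip

def is_gzip_file(path):
    """Return True if the file starts with the gzip magic bytes 0x1f 0x8b."""
    with open(path, "rb") as f:
        return f.read(2) == b"\x1f\x8b"

# Write a gzip-compressed file and verify the magic bytes are present.
with gzip.open("/tmp/sample.txt.gz", "wt") as f:
    f.write("header\nrow1\n")

print(is_gzip_file("/tmp/sample.txt.gz"))  # True
```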
Beam's `ReadFromText` with `compression_type=CompressionTypes.GZIP` works fine with the above option:

```python
p | "Read GCS File" >> beam.io.ReadFromText(
    file_pattern=file_path,
    compression_type=CompressionTypes.GZIP,
    skip_header_lines=int(skip_header))
```
Ways to compress the file
1. Implicitly, by uploading with `gsutil cp -Z <filename> <bucket>`, which gzip-compresses the file during upload and sets `Content-Encoding: gzip` on the object
2. Explicitly, by compressing the file first (e.g. `gzip <filename>`) and then uploading it to GCS
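Option 2 needs no external tooling; Python's standard `gzip` module does the same job as the `gzip` command (the file paths and contents below are illustrative):

```python
import gzip
import shutil

# Explicitly gzip-compress a file before uploading it to GCS,
# equivalent to running `gzip <filename>` on the command line.
src = "/tmp/data.csv"     # illustrative input path
dst = "/tmp/data.csv.gz"  # compressed output you would upload

with open(src, "w") as f:
    f.write("col_a,col_b\n1,2\n")

with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)

# Round-trip check: decompressing yields the original bytes.
with gzip.open(dst, "rb") as f:
    print(f.read().decode())
```

The resulting `.gz` file is what you would upload, with `Content-Type: application/gzip` and no `Content-Encoding` as described above.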
For more details on which combinations work, please see the table below:
<img width="550" alt="Screenshot 2023-02-08 at 8 26 22 PM" src="https://user-images.githubusercontent.com/27141543/217565746-58d45245-a890-4b43-805a-50a225de77cc.png">
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: github-unsubscribe@beam.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org