Posted to reviews@spark.apache.org by dongjinleekr <gi...@git.apache.org> on 2017/03/09 15:58:31 UTC

[GitHub] spark pull request #17225: [CORE] Support ZStandard Compression

GitHub user dongjinleekr opened a pull request:

    https://github.com/apache/spark/pull/17225

    [CORE] Support ZStandard Compression

    ## What changes were proposed in this pull request?
    
    Hadoop will support ZStandard compression starting with version 2.9.0. This update enables saving files to HDFS with the ZStandard codec by implementing ZStandardCompressionCodec.
    
    ## How was this patch tested?
    
    3 additional unit tests.
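    
    A minimal sketch of what such a codec could look like, assuming the zstd-jni library (com.github.luben.zstd) provides the underlying streams and that the class lives next to the existing codecs in org.apache.spark.io; the configuration key spark.io.compression.zstd.level is an assumption, not necessarily what this patch uses:
    
        import java.io.{InputStream, OutputStream}
        
        import com.github.luben.zstd.{ZstdInputStream, ZstdOutputStream}
        
        import org.apache.spark.SparkConf
        
        class ZStandardCompressionCodec(conf: SparkConf) extends CompressionCodec {
        
          override def compressedOutputStream(s: OutputStream): OutputStream = {
            // The level key and its default of 1 are assumptions; a real
            // implementation would document and validate this setting.
            val level = conf.getInt("spark.io.compression.zstd.level", 1)
            new ZstdOutputStream(s, level)
          }
        
          override def compressedInputStream(s: InputStream): InputStream = {
            new ZstdInputStream(s)
          }
        }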


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dongjinleekr/spark feature/z-compression

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/17225.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #17225
    
----
commit 01850446779b398bae281391de205cbda28d3efc
Author: Lee Dongjin <do...@apache.org>
Date:   2017-03-09T15:43:56Z

    Implement ZStandardCompressionCodec.

----




[GitHub] spark issue #17225: [CORE] Support ZStandard Compression

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on the issue:

    https://github.com/apache/spark/pull/17225
  
    Spark already supports custom codecs out of the box through configuration, so is it necessary to put this one into Spark itself?
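    
    For reference, the mechanism referred to here is spark.io.compression.codec, which accepts a fully qualified class name in addition to the built-in short names. A minimal sketch, where com.example.ZStandardCompressionCodec is a hypothetical class extending org.apache.spark.io.CompressionCodec with a (SparkConf) constructor:
    
        import org.apache.spark.SparkConf
        
        // Point the codec setting at a fully qualified class name instead of a
        // built-in short name such as "lz4" or "snappy"; the class only needs to
        // be on the classpath of the driver and executors.
        val conf = new SparkConf()
          .setAppName("zstd-example")
          .set("spark.io.compression.codec", "com.example.ZStandardCompressionCodec")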




[GitHub] spark pull request #17225: [CORE] Support ZStandard Compression

Posted by dongjinleekr <gi...@git.apache.org>.
Github user dongjinleekr closed the pull request at:

    https://github.com/apache/spark/pull/17225




[GitHub] spark issue #17225: [CORE] Support ZStandard Compression

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/17225
  
    Can one of the admins verify this patch?




[GitHub] spark pull request #17225: [CORE] Support ZStandard Compression

Posted by dongjinleekr <gi...@git.apache.org>.
Github user dongjinleekr commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17225#discussion_r105342714
  
    --- Diff: core/src/main/scala/org/apache/spark/io/CompressionCodec.scala ---
    @@ -49,13 +50,14 @@ private[spark] object CompressionCodec {
     
       private[spark] def supportsConcatenationOfSerializedStreams(codec: CompressionCodec): Boolean = {
         (codec.isInstanceOf[SnappyCompressionCodec] || codec.isInstanceOf[LZFCompressionCodec]
    -      || codec.isInstanceOf[LZ4CompressionCodec])
    +      || codec.isInstanceOf[LZ4CompressionCodec] || codec.isInstanceOf[ZStandardCompressionCodec])
       }
     
       private val shortCompressionCodecNames = Map(
         "lz4" -> classOf[LZ4CompressionCodec].getName,
         "lzf" -> classOf[LZFCompressionCodec].getName,
    -    "snappy" -> classOf[SnappyCompressionCodec].getName)
    +    "snappy" -> classOf[SnappyCompressionCodec].getName,
    +    "zstd" -> classOf[SnappyCompressionCodec].getName)
    --- End diff --
    
    OMG, it is a typo.
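    
    The corrected entry would presumably map the "zstd" short name to the new codec class instead of Snappy, along these lines:
    
        private val shortCompressionCodecNames = Map(
          "lz4" -> classOf[LZ4CompressionCodec].getName,
          "lzf" -> classOf[LZFCompressionCodec].getName,
          "snappy" -> classOf[SnappyCompressionCodec].getName,
          // Fixed: "zstd" now points at the new codec rather than Snappy.
          "zstd" -> classOf[ZStandardCompressionCodec].getName)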




[GitHub] spark issue #17225: [CORE] Support ZStandard Compression

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/17225
  
    @dongjinleekr, it seems there is a JIRA for this - https://issues.apache.org/jira/browse/SPARK-19112. Let's add it to the title of this PR if they are the same.





[GitHub] spark issue #17225: [CORE] Support ZStandard Compression

Posted by dongjinleekr <gi...@git.apache.org>.
Github user dongjinleekr commented on the issue:

    https://github.com/apache/spark/pull/17225
  
    @HyukjinKwon Thanks for the information. It seems both the JIRA issue and my PR are a bit messed up - I will re-create the PR with the JIRA issue in the title.




[GitHub] spark pull request #17225: [CORE] Support ZStandard Compression

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17225#discussion_r105321812
  
    --- Diff: core/src/main/scala/org/apache/spark/io/CompressionCodec.scala ---
    @@ -49,13 +50,14 @@ private[spark] object CompressionCodec {
     
       private[spark] def supportsConcatenationOfSerializedStreams(codec: CompressionCodec): Boolean = {
         (codec.isInstanceOf[SnappyCompressionCodec] || codec.isInstanceOf[LZFCompressionCodec]
    -      || codec.isInstanceOf[LZ4CompressionCodec])
    +      || codec.isInstanceOf[LZ4CompressionCodec] || codec.isInstanceOf[ZStandardCompressionCodec])
       }
     
       private val shortCompressionCodecNames = Map(
         "lz4" -> classOf[LZ4CompressionCodec].getName,
         "lzf" -> classOf[LZFCompressionCodec].getName,
    -    "snappy" -> classOf[SnappyCompressionCodec].getName)
    +    "snappy" -> classOf[SnappyCompressionCodec].getName,
    +    "zstd" -> classOf[SnappyCompressionCodec].getName)
    --- End diff --
    
    Is this right? It is still using the Snappy codec.




[GitHub] spark issue #17225: [CORE] Support ZStandard Compression

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/17225
  
    Hi @dongjinleekr, how about opening a JIRA and adding it to the title? It does not seem like a minor change that can go without a JIRA.

