Posted to reviews@spark.apache.org by fukuo33 <gi...@git.apache.org> on 2015/02/05 12:51:46 UTC

[GitHub] spark pull request: [SPARK-5618][Spark Core][Minor] Optimise utili...

GitHub user fukuo33 opened a pull request:

    https://github.com/apache/spark/pull/4396

    [SPARK-5618][Spark Core][Minor] Optimise utility code.

    
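    For context, the change replaces the `^scala` regex in `core/src/main/scala/org/apache/spark/util/Utils.scala` with a plain string prefix check. A minimal sketch of the before and after (standalone values here, using the names from the diff reviewed below):
    
    ```scala
    // Before: compile a regex and search for a match anchored at the start of the class name.
    val SCALA_CLASS_REGEX = """^scala""".r
    val viaRegex = SCALA_CLASS_REGEX.findFirstIn("scala.collection.immutable.List").isDefined
    
    // After: a plain prefix comparison, which skips regex compilation and matching entirely.
    val SCALA_CLASS = "scala"
    val viaPrefix = "scala.collection.immutable.List".startsWith(SCALA_CLASS)
    
    // Both checks yield true for a scala.* class name; only the cost differs.
    ```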

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/fukuo33/spark fix-unnecessary-regex

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/4396.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #4396
    
----
commit cd07fd6f040f1860022bd6611aede66672f3c4a1
Author: Makoto Fukuhara <fu...@gmail.com>
Date:   2015-01-29T11:48:18Z

    fix unnecessary regex.

----



[GitHub] spark pull request: [SPARK-5618][Spark Core][Minor] Optimise utili...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/4396#issuecomment-73034654
  
    Can one of the admins verify this patch?



[GitHub] spark pull request: [SPARK-5618][Spark Core][Minor] Optimise utili...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/4396#issuecomment-73294626
  
    Ok I'm merging this with Sean's comments. Thanks @fukuo33 



[GitHub] spark pull request: [SPARK-5618][Spark Core][Minor] Optimise utili...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/4396



[GitHub] spark pull request: [SPARK-5618][Spark Core][Minor] Optimise utili...

Posted by fukuo33 <gi...@git.apache.org>.
Github user fukuo33 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4396#discussion_r24572286
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1067,9 +1067,9 @@ private[spark] object Utils extends Logging {
         // finding the call site of a method.
         val SPARK_CORE_CLASS_REGEX =
           """^org\.apache\.spark(\.api\.java)?(\.util)?(\.rdd)?(\.broadcast)?\.[A-Z]""".r
    -    val SCALA_CLASS_REGEX = """^scala""".r
    +    val SCALA_CLASS = "scala"
    --- End diff --
    
    @srowen Thank you.
    
    The reason for using a `lazy val` is as follows.
    
    The `||` function takes two Boolean arguments, but only evaluates the second argument if the first is false:
    ```scala
    scala> true || { println("!!"); false }
    res0: Boolean = true
    ```
    
    If "isSparkCoreClass" is true, "className.startsWith (SCALA_CORE_CLASS_PREFIX)" does not need to be evaluated.




[GitHub] spark pull request: [SPARK-5618][Spark Core][Minor] Optimise utili...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4396#discussion_r24174192
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1067,9 +1067,9 @@ private[spark] object Utils extends Logging {
         // finding the call site of a method.
         val SPARK_CORE_CLASS_REGEX =
           """^org\.apache\.spark(\.api\.java)?(\.util)?(\.rdd)?(\.broadcast)?\.[A-Z]""".r
    -    val SCALA_CLASS_REGEX = """^scala""".r
    +    val SCALA_CLASS = "scala"
    --- End diff --
    
    LGTM, but could this be called `SCALA_CORE_CLASS_PREFIX`? Why is a `lazy val` needed? It doesn't seem worth it.



[GitHub] spark pull request: [SPARK-5618][Spark Core][Minor] Optimise utili...

Posted by fukuo33 <gi...@git.apache.org>.
Github user fukuo33 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4396#discussion_r24573174
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1067,9 +1067,9 @@ private[spark] object Utils extends Logging {
         // finding the call site of a method.
         val SPARK_CORE_CLASS_REGEX =
           """^org\.apache\.spark(\.api\.java)?(\.util)?(\.rdd)?(\.broadcast)?\.[A-Z]""".r
    -    val SCALA_CLASS_REGEX = """^scala""".r
    +    val SCALA_CLASS = "scala"
    --- End diff --
    
    I got it, thank you!




[GitHub] spark pull request: [SPARK-5618][Spark Core][Minor] Optimise utili...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4396#discussion_r24572582
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1067,9 +1067,9 @@ private[spark] object Utils extends Logging {
         // finding the call site of a method.
         val SPARK_CORE_CLASS_REGEX =
           """^org\.apache\.spark(\.api\.java)?(\.util)?(\.rdd)?(\.broadcast)?\.[A-Z]""".r
    -    val SCALA_CLASS_REGEX = """^scala""".r
    +    val SCALA_CLASS = "scala"
    --- End diff --
    
    Yes, I understand that effect, but it's a trivial condition to check, so I don't think it outweighs the overhead of the code generated to implement `lazy`. It doesn't matter; it's merged.
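    
    A short sketch of that trade-off (hypothetical method names; not the compiler's actual output): `||` alone already defers the startsWith call, so the `lazy val` only adds the generated guard machinery.
    
    ```scala
    def isCoreClassLazy(className: String, isSparkCoreClass: Boolean): Boolean = {
      // The compiler implements the lazy val with a holder plus an "already
      // initialized?" guard; that extra generated code is the overhead in question.
      lazy val isScalaCoreClass = className.startsWith("scala")
      isSparkCoreClass || isScalaCoreClass
    }
    
    def isCoreClassPlain(className: String, isSparkCoreClass: Boolean): Boolean = {
      // Inlining the check gives the same behaviour with no extra machinery:
      // startsWith still only runs when isSparkCoreClass is false.
      isSparkCoreClass || className.startsWith("scala")
    }
    ```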

