Posted to reviews@spark.apache.org by witgo <gi...@git.apache.org> on 2014/04/08 08:51:33 UTC

[GitHub] spark pull request: Fix:SPARK-1441 Compile Spark Core error with H...

GitHub user witgo opened a pull request:

    https://github.com/apache/spark/pull/357

    Fix:SPARK-1441 Compile Spark Core error with Hadoop 0.23.x

    

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/witgo/spark SPARK-1441

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/357.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #357
    
----
commit 7de20115cf699be623c4362377637422d7416289
Author: witgo <wi...@qq.com>
Date:   2014-04-08T06:38:20Z

    add avro dependency to core project

----
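
The commit message suggests a change along these lines in core/pom.xml (a sketch inferred from the commit summary and the discussion below, not the verbatim diff; the omitted version is assumed to be managed by the parent POM):

    <!-- Sketch only: version assumed to come from the parent POM's
         dependencyManagement section. -->
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
    </dependency>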


---

[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/357#issuecomment-40305906
  
    That's not quite what I mean. `hadoop.version` affects the version of the various artifacts in the build of course, like `hadoop-client`. You can express activations based on artifact versions, IIRC. So you might activate based on this *effect* of setting `hadoop.version` rather than the property itself. It might still not work, but worth a shot.


---

[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/357#issuecomment-40305799
  
    ```xml
    <activation>
       <property><name>hadoop.version</name><value>[0.23,0.24)</value></property>
    </activation>
    ```
    It doesn't work.

    See [PropertyProfileActivator.java](https://github.com/apache/maven/blob/master/maven-model-builder/src/main/java/org/apache/maven/model/profile/activation/PropertyProfileActivator.java): the activator compares the property value as a literal string, so a range like `[0.23,0.24)` is never parsed.
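
    A minimal sketch of the exact-value activation Maven does support (the version is illustrative):

    ```xml
    <!-- Literal string match only: fires for -Dhadoop.version=0.23.10 and
         nothing else; ranges and wildcards are not interpreted. -->
    <activation>
       <property><name>hadoop.version</name><value>0.23.10</value></property>
    </activation>
    ```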


---

[GitHub] spark pull request: Fix:SPARK-1441 Compile Spark Core error with H...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/357#issuecomment-39816909
  
    Can one of the admins verify this patch?


---

[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/357#issuecomment-40305010
  
    Right, but you can just write `-Pyarn-alpha` and set `hadoop.version` and `yarn.version` as you like. That gets what you need.
    
    A better change would be to activate the profile automatically based on the version, but that's not what this PR does.
    
    I agree that Maven does not support ranges on property values. It would support ranges on the artifact versions that a property like this controls. So it may work to 'query' the version of `hadoop-client`, for example.
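
    Concretely, something along these lines (versions are illustrative):

        $ mvn -Pyarn-alpha -Dhadoop.version=0.23.10 -Dyarn.version=0.23.10 -DskipTests package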


---

[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

Posted by witgo <gi...@git.apache.org>.
Github user witgo closed the pull request at:

    https://github.com/apache/spark/pull/357


---

[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/357#issuecomment-40301947
  
    @srowen mind reviewing the PR?


---

[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/357#issuecomment-40304913
  
    So, if someone compiles Spark with Hadoop 0.23.x, how can the profile below be activated automatically?
    ```xml
        <profile>
          <id>yarn-alpha</id>
          <dependencies>
            <dependency>
              <groupId>org.apache.avro</groupId>
              <artifactId>avro</artifactId>
            </dependency>
          </dependencies>
        </profile>
    ```
    Maven does not support such an activation:
    ```xml
    <activation>
       <property><name>hadoop.version</name><value>0.23.*</value></property>
    </activation>
    ```
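
    The closest supported form is a bare presence test, which fires for any value of the property and so cannot single out 0.23.x (a sketch):

    ```xml
    <!-- Fires whenever -Dhadoop.version is set at all, whatever its value;
         it cannot distinguish 0.23.x from any other version. -->
    <activation>
       <property><name>hadoop.version</name></property>
    </activation>
    ```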



---

[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/357#issuecomment-40304539
  
    I myself don't agree with this change, no. See the discussion in https://issues.apache.org/jira/browse/SPARK-1441 . For example, I think you can merely build with the yarn-alpha profile to get the artifacts you want.


---

[GitHub] spark pull request: Fix:SPARK-1441 Compile Spark Core error with H...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/357#issuecomment-39817720
  
    You're not enabling the yarn-alpha build profile here, and Hadoop 0.23.x is what it's for. Similarly for SBT, although the "profile" is enabled by the SPARK_YARN env variable. Your change just forced the config intended for yarn-alpha into the rest of the build, which seems incorrect.
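
    For the SBT build that would be something like (version illustrative):

        $ SPARK_HADOOP_VERSION=0.23.10 SPARK_YARN=true sbt/sbt assembly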


---