Posted to user@spark.apache.org by Kanagha Kumar <kp...@salesforce.com> on 2017/06/16 21:59:27 UTC

Error while doing mvn release for spark 2.0.2 using scala 2.10

Hey all,


I'm trying to use Spark 2.0.2 with Scala 2.10 by following this guide:
https://spark.apache.org/docs/2.0.2/building-spark.html#building-for-scala-210

./dev/change-scala-version.sh 2.10
./build/mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package

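As far as I can tell, the script essentially rewrites the Scala suffix on
every artifactId across the pom.xml files; roughly this (a sketch of its
effect, not the script verbatim):

find . -name 'pom.xml' -not -path '*target*' -print0 \
  | xargs -0 sed -i -e 's/\(artifactId.*\)_2\.11/\1_2.10/g'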

I could build the distribution successfully using
bash -xv dev/make-distribution.sh --tgz  -Dscala-2.10 -DskipTests

But when I try to do a Maven release using the command below, it keeps
failing with this error:


Executing Maven:  -B -f pom.xml  -DscmCommentPrefix=[maven-release-plugin]
-e  -Dscala-2.10 -Pyarn -Phadoop-2.7 -Phadoop-provided -DskipTests
-Dresume=false -U -X release:prepare release:perform

Failed to execute goal on project spark-sketch_2.10: Could not resolve
dependencies for project
org.apache.spark:spark-sketch_2.10:jar:2.0.2-sfdc-3.0.0: Failure to find
org.apache.spark:spark-tags_2.11:jar:2.0.2-sfdc-3.0.0 in <a .. nexus
repo...> was cached in the local repository, resolution will not be
reattempted until the update interval of nexus has elapsed or updates are
forced -> [Help 1]


Why does spark-sketch depend on spark-tags_2.11 when I have already
compiled against Scala 2.10? Any pointers would be helpful.
Thanks
Kanagha

Re: Error while doing mvn release for spark 2.0.2 using scala 2.10

Posted by Kanagha Kumar <kp...@salesforce.com>.
The problem I see is that the <scala.version> and <scala.binary.version>
properties defined in the scala-2.10 profile are not picked up by the
submodules during the Maven release (Maven 3.3.9). They are picked up
correctly during mvn package, though.

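One way to check what a submodule actually resolves is help:evaluate from
the standard maven-help-plugin; a diagnostic sketch (it should print 2.10
if the profile's properties are being applied):

./build/mvn -Dscala-2.10 -pl common/sketch help:evaluate -Dexpression=scala.binary.version
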
I also changed the pom.xml default properties to the 2.10 Scala versions
and tried the Maven release again. Is this related to a Maven issue where
properties are not substituted correctly? Any insights into why this is
occurring would be very helpful.

We also tried mvn dependency:tree. For common/sketch, I see the following
output using the Nexus within my company:

[WARNING] The POM for org.apache.spark:spark-tags_2.11:jar:2.0.2 is
missing, no dependency information available

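The invocation was along these lines (a sketch; the -pl flag scoping it
to common/sketch is illustrative, not the exact command we ran):

./build/mvn -Dscala-2.10 -pl common/sketch dependency:tree
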
When I hardcode the properties within <dependencies>, it works correctly
(see the sketch after the snippets below).

pom.xml:

<profile>
  <id>scala-2.10</id>
  <activation>
    <property><name>scala-2.10</name></property>
  </activation>
  <properties>
    <scala.version>2.10.6</scala.version>
    <scala.binary.version>2.10</scala.binary.version>
    <jline.version>${scala.version}</jline.version>
    <jline.groupid>org.scala-lang</jline.groupid>
  </properties>
  <build>

common/sketch/pom.xml:

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-tags_${scala.binary.version}</artifactId>
  </dependency>
</dependencies>

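For completeness, the hardcoded variant that worked looks roughly like
this (an illustrative sketch, not our exact diff):

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-tags_2.10</artifactId>
  </dependency>
</dependencies>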

Re: Error while doing mvn release for spark 2.0.2 using scala 2.10

Posted by Kanagha Kumar <kp...@salesforce.com>.
Thanks. But I am required to do a Maven release to Nexus of Spark 2.0.2
built against Scala 2.10. How can I go about this? Is this a bug that I
need to file in the Spark JIRA?


Re: Error while doing mvn release for spark 2.0.2 using scala 2.10

Posted by "Shixiong(Ryan) Zhu" <sh...@databricks.com>.
Some of the projects (such as spark-tags) are Java projects. Spark doesn't
rewrite their artifact names and just hard-codes the 2.11 suffix.

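In other words, a dependency on such a module keeps the 2.11 suffix no
matter which Scala profile is active; illustratively (a sketch, not the
actual Spark pom):

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-tags_2.11</artifactId>
</dependency>
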
For your issue, try to use `install` rather than `package`.

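That is, install everything into the local repository first so that
downstream modules resolve the freshly built artifacts; roughly (a sketch
reusing the flags from earlier in the thread):

./dev/change-scala-version.sh 2.10
./build/mvn -Pyarn -Phadoop-2.7 -Dscala-2.10 -DskipTests clean install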

Re: Error while doing mvn release for spark 2.0.2 using scala 2.10

Posted by Kanagha Kumar <kp...@salesforce.com>.
Hi,

Bumping this up again! Why do the Spark modules depend on Scala 2.11
artifacts in spite of changing the pom.xml files with
./dev/change-scala-version.sh 2.10? I'd appreciate any quick help!

Thanks
