Posted to dev@spark.apache.org by Marcelo Vanzin <va...@cloudera.com> on 2015/06/04 01:33:42 UTC

Ivy support in Spark vs. sbt

Hey all,

I've been bitten by something really weird lately, and I'm starting to think
it's related to the Ivy support we have in Spark and to running unit tests
that use that code.

The first thing that happens is that, after running unit tests, my sbt
builds sometimes start failing with an error saying something like
"dependency path must be relative" (sorry, I don't have the exact message
around). The dependency path it prints is a "file:" URL.

I have a feeling this is because Spark uses Ivy 2.4 while sbt uses Ivy 2.3,
and the two might write incompatible cache metadata. So if they end up
sharing the same cache, things can break.

The second is that unit tests sometimes fail with a weird error while
downloading dependencies. When I check the Ivy metadata in ~/.ivy2/cache,
the offending dependencies point to my local Maven repo (I have
"maven-local" as one of the entries in my ~/.sbt/repositories).

My feeling here is that Spark's version of Ivy somehow doesn't handle that
setup.

So, long story short:

- Has anyone run into either of these problems?
- Is it possible to set an environment variable or system property during
tests to force them to use their own directory instead of messing up and
breaking my ~/.ivy2?


-- 
Marcelo

Re: Ivy support in Spark vs. sbt

Posted by Marcelo Vanzin <va...@cloudera.com>.
They're my local builds, so I wouldn't be able to send you any links... and
the error is generally from sbt, not the unit tests. But if there's any
info I can collect when I see the error, let me know.

I'll try "spark.jars.ivy". I wonder if we should just set that to the
system properties in Spark's root pom.
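
Roughly what I'm picturing, as an untested sketch (only the "spark.jars.ivy"
name comes from this thread; the property value and the pom wiring are
hypothetical): if the build forked the test JVMs with something like
-Dspark.jars.ivy=/tmp/spark-test-ivy, a default-constructed SparkConf would
pick it up, because SparkConf loads spark.* system properties.

import org.apache.spark.SparkConf

// Assumes the test JVM was started with -Dspark.jars.ivy=/tmp/spark-test-ivy
// (hypothetical value); new SparkConf() reads spark.* system properties.
val conf = new SparkConf()
val ivyDir = conf.getOption("spark.jars.ivy")
// If set, Spark's Ivy-based resolution would keep its cache under ivyDir
// instead of under ~/.ivy2.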

On Thu, Jun 4, 2015 at 9:47 AM, Burak Yavuz <br...@gmail.com> wrote:

> Hi Marcelo,
>
> This is interesting. Can you please send me links to any failing builds if
> you see that problem please. For now you can set a conf: `spark.jars.ivy`
> to use a path except `~/.ivy2` for Spark.
>
> Thanks,
> Burak
>
> On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen <so...@cloudera.com> wrote:
>
>> I've definitely seen the "dependency path must be relative" problem,
>> and fixed it by deleting the ivy cache, but I don't know more than
>> this.
>>
>> On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com>
>> wrote:
>> > Hey all,
>> >
>> > I've been bit by something really weird lately and I'm starting to think
>> > it's related to the ivy support we have in Spark, and running unit tests
>> > that use that code.
>> >
>> > The first thing that happens is that after running unit tests,
>> sometimes my
>> > sbt builds start failing with error saying something about "dependency
>> path
>> > must be relative" (sorry, don't have the exact error around). The
>> dependency
>> > path it prints is a "file:" URL.
>> >
>> > I have a feeling that this is because Spark uses Ivy 2.4 while sbt uses
>> Ivy
>> > 2.3, and those might be incompatible. So if they get mixed up, things
>> can
>> > break.
>> >
>> > The second is that sometimes unit tests fail with some weird error
>> > downloading dependencies. When checking the ivy metadata in
>> ~/.ivy2/cache,
>> > the offending dependencies are pointing to my local maven repo (I have
>> > "maven-local" as one of the entries in my ~/.sbt/repositories).
>> >
>> > My feeling in this case is that Spark's version of Ivy somehow doesn't
>> > handle that case.
>> >
>> > So, long story short:
>> >
>> > - Has anyone run into either of these problems?
>> > - Is it possible to set some env variable or something during tests to
>> force
>> > them to use their own directory instead of messing up and breaking my
>> > ~/.ivy2?
>> >
>> >
>> > --
>> > Marcelo
>>
>


-- 
Marcelo

Re: Ivy support in Spark vs. sbt

Posted by Burak Yavuz <br...@gmail.com>.
Hi Marcelo,

This is interesting. Could you please send me links to any failing builds
when you see that problem? For now, you can set the conf `spark.jars.ivy`
to make Spark use a path other than `~/.ivy2`.
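
A minimal sketch of that (untested; the scratch directory is just an example
location, not anything Spark picks for you):

import java.nio.file.Files
import org.apache.spark.SparkConf

// Keep Spark's Ivy resolution in a throwaway directory so it never
// touches ~/.ivy2. The temp dir here is only an illustration.
val scratchIvy = Files.createTempDirectory("spark-ivy-scratch").toString
val conf = new SparkConf()
  .setAppName("ivy-isolation-example")
  .set("spark.jars.ivy", scratchIvy)

The same value can also be passed on the command line with
--conf spark.jars.ivy=<dir> when using spark-submit.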

Thanks,
Burak

On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen <so...@cloudera.com> wrote:

> I've definitely seen the "dependency path must be relative" problem,
> and fixed it by deleting the ivy cache, but I don't know more than
> this.
>
> On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com>
> wrote:
> > Hey all,
> >
> > I've been bit by something really weird lately and I'm starting to think
> > it's related to the ivy support we have in Spark, and running unit tests
> > that use that code.
> >
> > The first thing that happens is that after running unit tests, sometimes
> my
> > sbt builds start failing with error saying something about "dependency
> path
> > must be relative" (sorry, don't have the exact error around). The
> dependency
> > path it prints is a "file:" URL.
> >
> > I have a feeling that this is because Spark uses Ivy 2.4 while sbt uses
> Ivy
> > 2.3, and those might be incompatible. So if they get mixed up, things can
> > break.
> >
> > The second is that sometimes unit tests fail with some weird error
> > downloading dependencies. When checking the ivy metadata in
> ~/.ivy2/cache,
> > the offending dependencies are pointing to my local maven repo (I have
> > "maven-local" as one of the entries in my ~/.sbt/repositories).
> >
> > My feeling in this case is that Spark's version of Ivy somehow doesn't
> > handle that case.
> >
> > So, long story short:
> >
> > - Has anyone run into either of these problems?
> > - Is it possible to set some env variable or something during tests to
> force
> > them to use their own directory instead of messing up and breaking my
> > ~/.ivy2?
> >
> >
> > --
> > Marcelo
>

Re: Ivy support in Spark vs. sbt

Posted by Igor Costa <ig...@apache.org>.
Marcelo

I've run into this problem once, back when I was starting with Spark, just
like you mentioned. I found that Ivy gets messy across different sbt
versions.

My solution was to use an earlier version that was compatible with sbt, so
the two versions didn't cross.


Best
Igor

On Thu, Jun 4, 2015 at 8:08 PM, Eron Wright <ew...@live.com> wrote:

> I saw something like this last night, with a similar message.  Is this
> what you’re referring to?
>
> [error]
> org.deeplearning4j#dl4j-spark-ml;0.0.3.3.4.alpha1-SNAPSHOT!dl4j-spark-ml.jar
> origin location must be absolute:
> file:/Users/eron/.m2/repository/org/deeplearning4j/dl4j-spark-ml/0.0.3.3.4.alpha1-SNAPSHOT/dl4j-spark-ml-0.0.3.3.4.alpha1-SNAPSHOT.jar
> java.lang.IllegalArgumentException:
> org.deeplearning4j#dl4j-spark-ml;0.0.3.3.4.alpha1-SNAPSHOT!dl4j-spark-ml.jar
> origin location must be absolute:
> file:/Users/eron/.m2/repository/org/deeplearning4j/dl4j-spark-ml/0.0.3.3.4.alpha1-SNAPSHOT/dl4j-spark-ml-0.0.3.3.4.alpha1-SNAPSHOT.jar
>         at org.apache.ivy.util.Checks.checkAbsolute(Checks.java:57)
>         at
> org.apache.ivy.core.cache.DefaultRepositoryCacheManager.getArchiveFileInCache(DefaultRepositoryCacheManager.java:385)
>         at
> org.apache.ivy.core.cache.DefaultRepositoryCacheManager.download(DefaultRepositoryCacheManager.java:849)
>         at
> org.apache.ivy.plugins.resolver.BasicResolver.download(BasicResolver.java:835)
>         at
> org.apache.ivy.plugins.resolver.RepositoryResolver.download(RepositoryResolver.java:282)
>         at
> org.apache.ivy.plugins.resolver.ChainResolver.download(ChainResolver.java:219)
>         at
> org.apache.ivy.plugins.resolver.ChainResolver.download(ChainResolver.java:219)
>         at
> org.apache.ivy.core.resolve.ResolveEngine.downloadArtifacts(ResolveEngine.java:388)
>         at
> org.apache.ivy.core.resolve.ResolveEngine.resolve(ResolveEngine.java:331)
>         at org.apache.ivy.Ivy.resolve(Ivy.java:517)
>         at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:266)
>         at
> sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)
>         at
> sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)
>         at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
>         at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
>         at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)
>         at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)
>         at sbt.IvySbt$$anon$4.call(Ivy.scala:64)
>         at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
>         at
> xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
>         at
> xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
>         at xsbt.boot.Using$.withResource(Using.scala:10)
>         at xsbt.boot.Using$.apply(Using.scala:9)
>         at
> xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
>         at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
>         at xsbt.boot.Locks$.apply0(Locks.scala:31)
>         at xsbt.boot.Locks$.apply(Locks.scala:28)
>         at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)
>         at sbt.IvySbt.withIvy(Ivy.scala:123)
>         at sbt.IvySbt.withIvy(Ivy.scala:120)
>         at sbt.IvySbt$Module.withModule(Ivy.scala:151)
>         at sbt.IvyActions$.updateEither(IvyActions.scala:157)
>         at
> sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1318)
>         at
> sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1315)
>         at
> sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1345)
>         at
> sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1343)
>         at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
>         at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1348)
>         at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1342)
>         at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
>         at sbt.Classpaths$.cachedUpdate(Defaults.scala:1360)
>         at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1300)
>         at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1275)
>         at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
>         at
> sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
>         at sbt.std.Transform$$anon$4.work(System.scala:63)
>         at
> sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
>         at
> sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
>         at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
>         at sbt.Execute.work(Execute.scala:235)
>         at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
>         at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
>         at
> sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
>         at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>         at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>         at java.lang.Thread.run(Thread.java:745)
>
>
>
>
> On 6/4/15, 4:29 AM, "Sean Owen" <so...@cloudera.com> wrote:
>
> >I've definitely seen the "dependency path must be relative" problem,
> >and fixed it by deleting the ivy cache, but I don't know more than
> >this.
> >
> >On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com>
> wrote:
> >> Hey all,
> >>
> >> I've been bit by something really weird lately and I'm starting to think
> >> it's related to the ivy support we have in Spark, and running unit tests
> >> that use that code.
> >>
> >> The first thing that happens is that after running unit tests,
> sometimes my
> >> sbt builds start failing with error saying something about "dependency
> path
> >> must be relative" (sorry, don't have the exact error around). The
> dependency
> >> path it prints is a "file:" URL.
> >>
> >> I have a feeling that this is because Spark uses Ivy 2.4 while sbt uses
> Ivy
> >> 2.3, and those might be incompatible. So if they get mixed up, things
> can
> >> break.
> >>
> >> The second is that sometimes unit tests fail with some weird error
> >> downloading dependencies. When checking the ivy metadata in
> ~/.ivy2/cache,
> >> the offending dependencies are pointing to my local maven repo (I have
> >> "maven-local" as one of the entries in my ~/.sbt/repositories).
> >>
> >> My feeling in this case is that Spark's version of Ivy somehow doesn't
> >> handle that case.
> >>
> >> So, long story short:
> >>
> >> - Has anyone run into either of these problems?
> >> - Is it possible to set some env variable or something during tests to
> force
> >> them to use their own directory instead of messing up and breaking my
> >> ~/.ivy2?
> >>
> >>
> >> --
> >> Marcelo
> >

Re: Ivy support in Spark vs. sbt

Posted by Eron Wright <ew...@live.com>.
I saw something like this last night, with a similar message.  Is this what you’re referring to?

[error] org.deeplearning4j#dl4j-spark-ml;0.0.3.3.4.alpha1-SNAPSHOT!dl4j-spark-ml.jar origin location must be absolute: file:/Users/eron/.m2/repository/org/deeplearning4j/dl4j-spark-ml/0.0.3.3.4.alpha1-SNAPSHOT/dl4j-spark-ml-0.0.3.3.4.alpha1-SNAPSHOT.jar
java.lang.IllegalArgumentException: org.deeplearning4j#dl4j-spark-ml;0.0.3.3.4.alpha1-SNAPSHOT!dl4j-spark-ml.jar origin location must be absolute: file:/Users/eron/.m2/repository/org/deeplearning4j/dl4j-spark-ml/0.0.3.3.4.alpha1-SNAPSHOT/dl4j-spark-ml-0.0.3.3.4.alpha1-SNAPSHOT.jar
	at org.apache.ivy.util.Checks.checkAbsolute(Checks.java:57)
	at org.apache.ivy.core.cache.DefaultRepositoryCacheManager.getArchiveFileInCache(DefaultRepositoryCacheManager.java:385)
	at org.apache.ivy.core.cache.DefaultRepositoryCacheManager.download(DefaultRepositoryCacheManager.java:849)
	at org.apache.ivy.plugins.resolver.BasicResolver.download(BasicResolver.java:835)
	at org.apache.ivy.plugins.resolver.RepositoryResolver.download(RepositoryResolver.java:282)
	at org.apache.ivy.plugins.resolver.ChainResolver.download(ChainResolver.java:219)
	at org.apache.ivy.plugins.resolver.ChainResolver.download(ChainResolver.java:219)
	at org.apache.ivy.core.resolve.ResolveEngine.downloadArtifacts(ResolveEngine.java:388)
	at org.apache.ivy.core.resolve.ResolveEngine.resolve(ResolveEngine.java:331)
	at org.apache.ivy.Ivy.resolve(Ivy.java:517)
	at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:266)
	at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)
	at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)
	at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
	at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
	at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)
	at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)
	at sbt.IvySbt$$anon$4.call(Ivy.scala:64)
	at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
	at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
	at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
	at xsbt.boot.Using$.withResource(Using.scala:10)
	at xsbt.boot.Using$.apply(Using.scala:9)
	at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
	at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
	at xsbt.boot.Locks$.apply0(Locks.scala:31)
	at xsbt.boot.Locks$.apply(Locks.scala:28)
	at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)
	at sbt.IvySbt.withIvy(Ivy.scala:123)
	at sbt.IvySbt.withIvy(Ivy.scala:120)
	at sbt.IvySbt$Module.withModule(Ivy.scala:151)
	at sbt.IvyActions$.updateEither(IvyActions.scala:157)
	at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1318)
	at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1315)
	at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1345)
	at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1343)
	at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
	at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1348)
	at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1342)
	at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
	at sbt.Classpaths$.cachedUpdate(Defaults.scala:1360)
	at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1300)
	at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1275)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:235)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)




On 6/4/15, 4:29 AM, "Sean Owen" <so...@cloudera.com> wrote:

>I've definitely seen the "dependency path must be relative" problem,
>and fixed it by deleting the ivy cache, but I don't know more than
>this.
>
>On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
>> Hey all,
>>
>> I've been bit by something really weird lately and I'm starting to think
>> it's related to the ivy support we have in Spark, and running unit tests
>> that use that code.
>>
>> The first thing that happens is that after running unit tests, sometimes my
>> sbt builds start failing with error saying something about "dependency path
>> must be relative" (sorry, don't have the exact error around). The dependency
>> path it prints is a "file:" URL.
>>
>> I have a feeling that this is because Spark uses Ivy 2.4 while sbt uses Ivy
>> 2.3, and those might be incompatible. So if they get mixed up, things can
>> break.
>>
>> The second is that sometimes unit tests fail with some weird error
>> downloading dependencies. When checking the ivy metadata in ~/.ivy2/cache,
>> the offending dependencies are pointing to my local maven repo (I have
>> "maven-local" as one of the entries in my ~/.sbt/repositories).
>>
>> My feeling in this case is that Spark's version of Ivy somehow doesn't
>> handle that case.
>>
>> So, long story short:
>>
>> - Has anyone run into either of these problems?
>> - Is it possible to set some env variable or something during tests to force
>> them to use their own directory instead of messing up and breaking my
>> ~/.ivy2?
>>
>>
>> --
>> Marcelo
>

Re: Ivy support in Spark vs. sbt

Posted by Marcelo Vanzin <va...@cloudera.com>.
Here's one of the types of exceptions I get (this one when running
VersionsSuite from sql/hive):

[info] - 13: create client *** FAILED *** (1 second, 946 milliseconds)
[info]   java.lang.RuntimeException: [download failed:
org.apache.httpcomponents#httpclient;4.2.5!httpclient.jar, download failed:
commons-codec#commons-codec;1.4!commons-codec.jar]
[info]   at
org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)

This is the content of the Ivy metadata file for that component:

#ivy cached data file for org.apache.httpcomponents#httpclient;4.2.5
#Thu Jun 04 13:26:10 PDT 2015
artifact\:ivy\#ivy\#xml\#1855381640.is-local=true
artifact\:ivy\#ivy\#xml\#1855381640.location=file\:/home/vanzin/.m2/repository/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.pom
artifact\:ivy\#ivy\#xml\#1855381640.exists=true
resolver=local-m2-cache
artifact\:httpclient\#pom.original\#pom\#-365933676.original=artifact\:httpclient\#pom.original\#pom\#-365933676
artifact\:ivy\#ivy\#xml\#1855381640.original=artifact\:httpclient\#pom.original\#pom\#-365933676
artifact.resolver=local-m2-cache
artifact\:httpclient\#pom.original\#pom\#-365933676.is-local=true
artifact\:httpclient\#pom.original\#pom\#-365933676.location=file\:/home/vanzin/.m2/repository/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.pom
artifact\:httpclient\#pom.original\#pom\#-365933676.exists=true


If I delete that file *and* the Maven copies of those artifacts, then the
tests pass. But that's really annoying, since I have to use sbt and Maven
for different things, and I really like the fact that sbt can read the
Maven cache directly.
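
For reference, here's a sketch of that manual cleanup (hedged: it wipes the
whole per-module directories rather than the individual files, assumes
commons-io is on the classpath, and hard-codes the coordinates from the
failure above):

import java.io.File
import org.apache.commons.io.FileUtils

val home = sys.props("user.home")
// The poisoned Ivy cache entry and the local Maven artifacts it points at.
val ivyEntry  = new File(s"$home/.ivy2/cache/org.apache.httpcomponents/httpclient")
val mavenCopy = new File(s"$home/.m2/repository/org/apache/httpcomponents/httpclient/4.2.5")
// Delete both so the next resolution starts from a clean slate.
Seq(ivyEntry, mavenCopy).filter(_.exists()).foreach(d => FileUtils.deleteDirectory(d))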


On Thu, Jun 4, 2015 at 10:23 AM, shane knapp <sk...@berkeley.edu> wrote:

> interesting...  i definitely haven't seen it happen that often in our
> build system, and when it has happened, i wasn't able to determine the
> cause.
>
> On Thu, Jun 4, 2015 at 10:16 AM, Marcelo Vanzin <va...@cloudera.com>
> wrote:
>
>> On Thu, Jun 4, 2015 at 10:04 AM, shane knapp <sk...@berkeley.edu> wrote:
>>
>>> this has occasionally happened on our jenkins as well (twice since last
>>> august), and deleting the cache fixes it right up.
>>>
>>
>> Yes deleting the cache fixes things, but it's kinda annoying to have to
>> do that. And yesterday when I was testing a patch that actually used the
>> ivy feature, I had to do that multiple times... that slows things down a
>> lot.
>>
>>
>>>
>>> On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen <so...@cloudera.com> wrote:
>>>
>>>> I've definitely seen the "dependency path must be relative" problem,
>>>> and fixed it by deleting the ivy cache, but I don't know more than
>>>> this.
>>>>
>>>> On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com>
>>>> wrote:
>>>> > Hey all,
>>>> >
>>>> > I've been bit by something really weird lately and I'm starting to
>>>> think
>>>> > it's related to the ivy support we have in Spark, and running unit
>>>> tests
>>>> > that use that code.
>>>> >
>>>> > The first thing that happens is that after running unit tests,
>>>> sometimes my
>>>> > sbt builds start failing with error saying something about
>>>> "dependency path
>>>> > must be relative" (sorry, don't have the exact error around). The
>>>> dependency
>>>> > path it prints is a "file:" URL.
>>>> >
>>>> > I have a feeling that this is because Spark uses Ivy 2.4 while sbt
>>>> uses Ivy
>>>> > 2.3, and those might be incompatible. So if they get mixed up, things
>>>> can
>>>> > break.
>>>> >
>>>> > The second is that sometimes unit tests fail with some weird error
>>>> > downloading dependencies. When checking the ivy metadata in
>>>> ~/.ivy2/cache,
>>>> > the offending dependencies are pointing to my local maven repo (I have
>>>> > "maven-local" as one of the entries in my ~/.sbt/repositories).
>>>> >
>>>> > My feeling in this case is that Spark's version of Ivy somehow doesn't
>>>> > handle that case.
>>>> >
>>>> > So, long story short:
>>>> >
>>>> > - Has anyone run into either of these problems?
>>>> > - Is it possible to set some env variable or something during tests
>>>> to force
>>>> > them to use their own directory instead of messing up and breaking my
>>>> > ~/.ivy2?
>>>> >
>>>> >
>>>> > --
>>>> > Marcelo
>>>>
>>>
>>
>>
>> --
>> Marcelo
>>
>
>


-- 
Marcelo

Re: Ivy support in Spark vs. sbt

Posted by shane knapp <sk...@berkeley.edu>.
interesting...  i definitely haven't seen it happen that often in our build
system, and when it has happened, i wasn't able to determine the cause.

On Thu, Jun 4, 2015 at 10:16 AM, Marcelo Vanzin <va...@cloudera.com> wrote:

> On Thu, Jun 4, 2015 at 10:04 AM, shane knapp <sk...@berkeley.edu> wrote:
>
>> this has occasionally happened on our jenkins as well (twice since last
>> august), and deleting the cache fixes it right up.
>>
>
> Yes deleting the cache fixes things, but it's kinda annoying to have to do
> that. And yesterday when I was testing a patch that actually used the ivy
> feature, I had to do that multiple times... that slows things down a lot.
>
>
>>
>> On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>>> I've definitely seen the "dependency path must be relative" problem,
>>> and fixed it by deleting the ivy cache, but I don't know more than
>>> this.
>>>
>>> On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com>
>>> wrote:
>>> > Hey all,
>>> >
>>> > I've been bit by something really weird lately and I'm starting to
>>> think
>>> > it's related to the ivy support we have in Spark, and running unit
>>> tests
>>> > that use that code.
>>> >
>>> > The first thing that happens is that after running unit tests,
>>> sometimes my
>>> > sbt builds start failing with error saying something about "dependency
>>> path
>>> > must be relative" (sorry, don't have the exact error around). The
>>> dependency
>>> > path it prints is a "file:" URL.
>>> >
>>> > I have a feeling that this is because Spark uses Ivy 2.4 while sbt
>>> uses Ivy
>>> > 2.3, and those might be incompatible. So if they get mixed up, things
>>> can
>>> > break.
>>> >
>>> > The second is that sometimes unit tests fail with some weird error
>>> > downloading dependencies. When checking the ivy metadata in
>>> ~/.ivy2/cache,
>>> > the offending dependencies are pointing to my local maven repo (I have
>>> > "maven-local" as one of the entries in my ~/.sbt/repositories).
>>> >
>>> > My feeling in this case is that Spark's version of Ivy somehow doesn't
>>> > handle that case.
>>> >
>>> > So, long story short:
>>> >
>>> > - Has anyone run into either of these problems?
>>> > - Is it possible to set some env variable or something during tests to
>>> force
>>> > them to use their own directory instead of messing up and breaking my
>>> > ~/.ivy2?
>>> >
>>> >
>>> > --
>>> > Marcelo
>>>
>>
>
>
> --
> Marcelo
>

Re: Ivy support in Spark vs. sbt

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Thu, Jun 4, 2015 at 10:04 AM, shane knapp <sk...@berkeley.edu> wrote:

> this has occasionally happened on our jenkins as well (twice since last
> august), and deleting the cache fixes it right up.
>

Yes, deleting the cache fixes things, but it's kinda annoying to have to do
that. And yesterday, when I was testing a patch that actually used the Ivy
feature, I had to do that multiple times... that slows things down a lot.


>
> On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen <so...@cloudera.com> wrote:
>
>> I've definitely seen the "dependency path must be relative" problem,
>> and fixed it by deleting the ivy cache, but I don't know more than
>> this.
>>
>> On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com>
>> wrote:
>> > Hey all,
>> >
>> > I've been bit by something really weird lately and I'm starting to think
>> > it's related to the ivy support we have in Spark, and running unit tests
>> > that use that code.
>> >
>> > The first thing that happens is that after running unit tests,
>> sometimes my
>> > sbt builds start failing with error saying something about "dependency
>> path
>> > must be relative" (sorry, don't have the exact error around). The
>> dependency
>> > path it prints is a "file:" URL.
>> >
>> > I have a feeling that this is because Spark uses Ivy 2.4 while sbt uses
>> Ivy
>> > 2.3, and those might be incompatible. So if they get mixed up, things
>> can
>> > break.
>> >
>> > The second is that sometimes unit tests fail with some weird error
>> > downloading dependencies. When checking the ivy metadata in
>> ~/.ivy2/cache,
>> > the offending dependencies are pointing to my local maven repo (I have
>> > "maven-local" as one of the entries in my ~/.sbt/repositories).
>> >
>> > My feeling in this case is that Spark's version of Ivy somehow doesn't
>> > handle that case.
>> >
>> > So, long story short:
>> >
>> > - Has anyone run into either of these problems?
>> > - Is it possible to set some env variable or something during tests to
>> force
>> > them to use their own directory instead of messing up and breaking my
>> > ~/.ivy2?
>> >
>> >
>> > --
>> > Marcelo
>>
>


-- 
Marcelo

Re: Ivy support in Spark vs. sbt

Posted by shane knapp <sk...@berkeley.edu>.
this has occasionally happened on our jenkins as well (twice since last
august), and deleting the cache fixes it right up.

On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen <so...@cloudera.com> wrote:

> I've definitely seen the "dependency path must be relative" problem,
> and fixed it by deleting the ivy cache, but I don't know more than
> this.
>
> On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com>
> wrote:
> > Hey all,
> >
> > I've been bit by something really weird lately and I'm starting to think
> > it's related to the ivy support we have in Spark, and running unit tests
> > that use that code.
> >
> > The first thing that happens is that after running unit tests, sometimes
> my
> > sbt builds start failing with error saying something about "dependency
> path
> > must be relative" (sorry, don't have the exact error around). The
> dependency
> > path it prints is a "file:" URL.
> >
> > I have a feeling that this is because Spark uses Ivy 2.4 while sbt uses
> Ivy
> > 2.3, and those might be incompatible. So if they get mixed up, things can
> > break.
> >
> > The second is that sometimes unit tests fail with some weird error
> > downloading dependencies. When checking the ivy metadata in
> ~/.ivy2/cache,
> > the offending dependencies are pointing to my local maven repo (I have
> > "maven-local" as one of the entries in my ~/.sbt/repositories).
> >
> > My feeling in this case is that Spark's version of Ivy somehow doesn't
> > handle that case.
> >
> > So, long story short:
> >
> > - Has anyone run into either of these problems?
> > - Is it possible to set some env variable or something during tests to
> force
> > them to use their own directory instead of messing up and breaking my
> > ~/.ivy2?
> >
> >
> > --
> > Marcelo
>

Re: Ivy support in Spark vs. sbt

Posted by Sean Owen <so...@cloudera.com>.
I've definitely seen the "dependency path must be relative" problem,
and fixed it by deleting the ivy cache, but I don't know more than
this.

On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
> Hey all,
>
> I've been bit by something really weird lately and I'm starting to think
> it's related to the ivy support we have in Spark, and running unit tests
> that use that code.
>
> The first thing that happens is that after running unit tests, sometimes my
> sbt builds start failing with error saying something about "dependency path
> must be relative" (sorry, don't have the exact error around). The dependency
> path it prints is a "file:" URL.
>
> I have a feeling that this is because Spark uses Ivy 2.4 while sbt uses Ivy
> 2.3, and those might be incompatible. So if they get mixed up, things can
> break.
>
> The second is that sometimes unit tests fail with some weird error
> downloading dependencies. When checking the ivy metadata in ~/.ivy2/cache,
> the offending dependencies are pointing to my local maven repo (I have
> "maven-local" as one of the entries in my ~/.sbt/repositories).
>
> My feeling in this case is that Spark's version of Ivy somehow doesn't
> handle that case.
>
> So, long story short:
>
> - Has anyone run into either of these problems?
> - Is it possible to set some env variable or something during tests to force
> them to use their own directory instead of messing up and breaking my
> ~/.ivy2?
>
>
> --
> Marcelo
