Posted to dev@spark.apache.org by Holden Karau <ho...@pigscanfly.ca> on 2015/08/26 23:23:34 UTC

Building with sbt "impossible to get artifacts when data has not been loaded"

Has anyone else run into "impossible to get artifacts when data has not
been loaded. IvyNode = org.scala-lang#scala-library;2.10.3" during
hive/update when building with sbt? Working around it is pretty simple
(just add scala-library as an explicit dependency), but I'm wondering if
it's impacting anyone else and I should make a PR for it, or if it's
something funky with my local build setup.
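
For reference, this is roughly what the workaround looks like; a minimal
sketch against a plain build.sbt rather than Spark's actual build
definition, with the version number just as an example:

  // Hypothetical standalone build.sbt, not Spark's real build definition.
  // Declaring scala-library explicitly sidesteps the unresolved IvyNode.
  scalaVersion := "2.10.4"

  libraryDependencies +=
    "org.scala-lang" % "scala-library" % scalaVersion.value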

-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau
Linked In: https://www.linkedin.com/in/holdenkarau

Re: Building with sbt "impossible to get artifacts when data has not been loaded"

Posted by Jacek Laskowski <ja...@japila.pl>.
On Wed, Aug 26, 2015 at 11:23 PM, Holden Karau <ho...@pigscanfly.ca> wrote:
> Has anyone else run into "impossible to get artifacts when data has not been
> loaded. IvyNode = org.scala-lang#scala-library;2.10.3" during hive/update
> when building with sbt? Working around it is pretty simple (just add
> scala-library as an explicit dependency), but I'm wondering if it's impacting
> anyone else and I should make a PR for it, or if it's something funky with my
> local build setup.

Hi,

Since it's sbt...

I've just tried it out myself and didn't hit any errors. How do you
run the command that leads to the issue? Could you hide ~/.m2/repository,
~/.ivy2 and ~/.sbt/0.13 by renaming them to some other names and give it
another shot? There are a few shared local caches in play that might get
in the way of reproducing it.
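
If you'd rather not rename anything, an alternative for the Ivy cache
(it doesn't cover ~/.m2/repository) is to point sbt at a throwaway Ivy
home just for the experiment. A rough sketch you could drop into a local
.sbt file in the checkout; the /tmp path is arbitrary:

  // Hypothetical local.sbt: resolve into a fresh Ivy cache instead of ~/.ivy2,
  // so stale metadata in the shared cache can't affect the result.
  ivyPaths := new IvyPaths(baseDirectory.value, Some(file("/tmp/ivy-clean")))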

BTW you could use the Docker image
https://hub.docker.com/r/jaceklaskowski/docker-builds-sbt/ to work in
a very clean and isolated build environment.

[spark]> hive/update
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.google.code.findbugs:jsr305:1.3.9 -> 2.0.1
[warn] * com.google.guava:guava:11.0.2 -> 14.0.1
[warn] * commons-net:commons-net:2.2 -> 3.1
[warn] Run 'evicted' to see detailed eviction warnings
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.google.guava:guava:11.0.2 -> 14.0.1
[warn] Run 'evicted' to see detailed eviction warnings
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.google.guava:guava:11.0.2 -> 14.0.1
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating {file:/Users/jacek/dev/oss/spark/}hive...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.google.code.findbugs:jsr305:1.3.9 -> 2.0.1
[warn] * com.google.guava:guava:11.0.2 -> 14.0.1
[warn] Run 'evicted' to see detailed eviction warnings
[success] Total time: 3 s, completed Aug 27, 2015 11:58:18 AM

Pozdrawiam,
Jacek

--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski



Re: Building with sbt "impossible to get artifacts when data has not been loaded"

Posted by Josh Rosen <ro...@gmail.com>.
I ran into a similar problem while working on the spark-redshift library
and was able to fix it by bumping that library's ScalaTest version. I'm
still fighting some mysterious Scala issues while trying to test the
spark-csv library against 1.5.0-RC1, so it's possible that a build or
dependency change in Spark might be responsible for this.
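
For concreteness, the fix there was just a test-scope version bump along
these lines (the version number is illustrative, not the exact one used):

  // Hypothetical fragment from the library's build.sbt; version is a placeholder.
  libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.5" % "test"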

On 8/26/15 2:27 PM, Marcelo Vanzin wrote:
> I ran into the same error (different dependency) earlier today. In my
> case, the maven pom files and the sbt dependencies had a conflict
> (different versions of the same artifact) and ivy got confused. Not
> sure whether that will help in your case or not...
>
> On Wed, Aug 26, 2015 at 2:23 PM, Holden Karau <ho...@pigscanfly.ca> wrote:
>> Has anyone else run into "impossible to get artifacts when data has not been
>> loaded. IvyNode = org.scala-lang#scala-library;2.10.3" during hive/update
>> when building with sbt? Working around it is pretty simple (just add
>> scala-library as an explicit dependency), but I'm wondering if it's impacting
>> anyone else and I should make a PR for it, or if it's something funky with my
>> local build setup.
>>
>> --
>> Cell : 425-233-8271
>> Twitter: https://twitter.com/holdenkarau
>> Linked In: https://www.linkedin.com/in/holdenkarau
>
>




Re: Building with sbt "impossible to get artifacts when data has not been loaded"

Posted by Marcelo Vanzin <va...@cloudera.com>.
I ran into the same error (different dependency) earlier today. In my
case, the maven pom files and the sbt dependencies had a conflict
(different versions of the same artifact) and ivy got confused. Not
sure whether that will help in your case or not...
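
If it's the same kind of mismatch, one thing to try on the sbt side is
forcing a single version with an override; just a sketch, and the version
shown is only an example of whatever your POMs actually agree on:

  // Hypothetical build fragment: force one scala-library version so Ivy
  // doesn't end up holding a node whose data was never resolved.
  dependencyOverrides ++= Set(
    "org.scala-lang" % "scala-library" % "2.10.4"
  )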

On Wed, Aug 26, 2015 at 2:23 PM, Holden Karau <ho...@pigscanfly.ca> wrote:
> Has anyone else run into "impossible to get artifacts when data has not been
> loaded. IvyNode = org.scala-lang#scala-library;2.10.3" during hive/update
> when building with sbt? Working around it is pretty simple (just add
> scala-library as an explicit dependency), but I'm wondering if it's impacting
> anyone else and I should make a PR for it, or if it's something funky with my
> local build setup.
>
> --
> Cell : 425-233-8271
> Twitter: https://twitter.com/holdenkarau
> Linked In: https://www.linkedin.com/in/holdenkarau



-- 
Marcelo
