Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2019/03/20 22:57:00 UTC

[jira] [Resolved] (SPARK-27205) spark-shell with packages option fails to load transitive dependencies even ivy successfully pulls jars

     [ https://issues.apache.org/jira/browse/SPARK-27205?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-27205.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 3.0.0

Issue resolved by pull request 24147
[https://github.com/apache/spark/pull/24147]

> spark-shell with packages option fails to load transitive dependencies even ivy successfully pulls jars
> -------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-27205
>                 URL: https://issues.apache.org/jira/browse/SPARK-27205
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Jungtaek Lim
>            Assignee: Jungtaek Lim
>            Priority: Major
>             Fix For: 3.0.0
>
>
> I found this bug while testing my patch for the Spark SQL Kafka module - I tend to open spark-shell and link the Kafka module via `--packages`.
> When we run
> {code:bash}
> ./bin/spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0
> {code}
> we should be able to import "org.apache.kafka" classes in spark-shell, but it doesn't work on the current master branch.
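> For example, a minimal check in the REPL (a sketch; the class below is just one arbitrary class from the transitive kafka-clients dependency):
> {code:scala}
> // Run inside spark-shell after the --packages resolution above.
> // ConsumerRecord lives in kafka-clients, which spark-sql-kafka-0-10
> // pulls in transitively, so this import should succeed whenever
> // transitive jars actually make it onto the classpath.
> import org.apache.kafka.clients.consumer.ConsumerRecord
> {code}
> On an affected build the import fails even though Ivy reports the jars as successfully resolved.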
> There's not much evidence to go on, and even the `--verbose` option gave me no idea what was happening, so I had to spend a couple of hours running git bisect.
> It turned out that the commit introducing the bug was SPARK-26977 ([81dd21fda99da48ed76adb739a07d1dabf1ffb51|https://github.com/apache/spark/commit/81dd21fda99da48ed76adb739a07d1dabf1ffb51]).
>  


