Posted to user@spark.apache.org by JayKay <ju...@gmail.com> on 2016/10/13 08:24:23 UTC

Want to test spark-sql-kafka but get unresolved dependency error

I want to work with the Kafka integration for structured streaming. I use
Spark version 2.0.0, and I start the spark-shell with:

spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.0.0

As described here:
https://github.com/apache/spark/blob/master/docs/structured-streaming-kafka-integration.md

But I get an unresolved dependency error ("unresolved dependency:
org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found"). So it seems
not to be available via Maven or spark-packages.

How can I access this package? Or am I doing something wrong or missing
something?

Thank you for your help.



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Want-to-test-spark-sql-kafka-but-get-unresolved-dependency-error-tp27891.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Want to test spark-sql-kafka but get unresolved dependency error

Posted by Sean Owen <so...@cloudera.com>.
I don't believe that's been released yet. It looks like it was merged into
branches about a week ago. You're looking at unreleased docs too - have a
look at http://spark.apache.org/docs/latest/ for the latest released docs.


Re: Want to test spark-sql-kafka but get unresolved dependency error

Posted by Mich Talebzadeh <mi...@gmail.com>.
Add --jars <LOCATION>/spark-streaming-kafka_2.10-1.5.1.jar to spark-shell
(you may need to download the jar file, or any newer version).

I also have spark-streaming-kafka-assembly_2.10-1.6.1.jar on the --jars
list.
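As a concrete sketch of that suggestion (the jar paths below are placeholders; substitute wherever you downloaded the jars):

```shell
# Hypothetical invocation: supply locally downloaded Kafka jars to
# spark-shell via --jars (a comma-separated list of paths).
spark-shell \
  --jars /path/to/spark-streaming-kafka_2.10-1.5.1.jar,/path/to/spark-streaming-kafka-assembly_2.10-1.6.1.jar
```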

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




Re: Want to test spark-sql-kafka but get unresolved dependency error

Posted by Cody Koeninger <co...@koeninger.org>.
I can't be sure, no.

On Fri, Oct 14, 2016 at 3:06 AM, Julian Keppel
<ju...@gmail.com> wrote:
> Okay, thank you! Can you say when this feature will be released?



Re: Want to test spark-sql-kafka but get unresolved dependency error

Posted by Julian Keppel <ju...@gmail.com>.
Okay, thank you! Can you say when this feature will be released?


Re: Want to test spark-sql-kafka but get unresolved dependency error

Posted by Cody Koeninger <co...@koeninger.org>.
As Sean said, it's unreleased. If you want to try it out, build Spark
yourself:

http://spark.apache.org/docs/latest/building-spark.html

The easiest way to include the jar is probably to run mvn install to
put it in your local repository, then link it in your application's
Maven or sbt build file as described in the docs you linked.
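The steps described above might look something like the following sketch (the branch and snapshot version string are assumptions; check the POMs in your checkout for the actual version):

```shell
# Hypothetical sketch: build Spark from source and install its artifacts
# into the local Maven repository (~/.m2), so other builds can resolve them.
git clone https://github.com/apache/spark.git
cd spark
./build/mvn -DskipTests clean install
```

Then, in an sbt build for example, the dependency line would look something like `libraryDependencies += "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.1.0-SNAPSHOT"` (hypothetical version, matching whatever the installed POMs declare).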


