Posted to user@spark.apache.org by Nathan Kronenfeld <nk...@oculusinfo.com> on 2013/11/28 16:18:38 UTC

build problem

Hi, folks.

I'm trying to build a Spark distribution from the latest code.

I started out this morning with:

./make-distribution.sh


and that worked fine. But then I realized I'd forgotten to set the Hadoop
version I needed, so I reran it with

./make-distribution.sh --hadoop 2.0.0-cdh4.4.0


That failed with a whole bunch of error messages (43, to be exact) in
streaming, along the lines of:

...streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:51:
type mismatch
found: org.apache.spark.streaming.DStream[(K, V)]
expected: org.apache.spark.streaming.api.java.JavaPairDStream[K, V]
Note: implicit method fromPairDStream is not applicable here because it
comes after the application point and it lacks an explicit return type.
dstream.filter(x => f(x).booleanValue())


(42 more like that in different places).  So I went back and tried

./make-distribution.sh


again - now it failed with the same errors, though it had just worked a
moment ago. I cleaned up the dist directory - same thing. Logged out and
back in to reset my environment - same thing.
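For what it's worth, the compiler note quoted above reflects a general Scala rule that can be reproduced outside Spark. A minimal sketch, with hypothetical names (this is not Spark's actual code):

```scala
import scala.language.implicitConversions

// Sketch of the rule behind the "comes after the application point" note:
// an implicit conversion defined later in the same scope is only
// applicable at an earlier point if it declares an explicit return type.
object ImplicitOrderDemo {
  case class Wrapped(value: String)

  // The implicit conversion from String is applied here, *before* the
  // definition of `wrap` below. This compiles only because `wrap` spells
  // out its return type; drop the `: Wrapped` annotation on `wrap` and
  // scalac reports the same note as above.
  def demo(): Wrapped = "hello"

  implicit def wrap(s: String): Wrapped = Wrapped(s)
}
```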

So though it built fine once, now it refuses to build again.

Does anyone have a clue what is going on here?

Any help very much appreciated,
                -Nathan


-- 
Nathan Kronenfeld
Senior Visualization Developer
Oculus Info Inc
2 Berkeley Street, Suite 600,
Toronto, Ontario M5A 4J5
Phone:  +1-416-203-3003 x 238
Email:  nkronenfeld@oculusinfo.com

Re: build problem

Posted by Цвигун Евгений <ut...@gmail.com>.
Hi Nathan,

try 'sbt clean' and then re-run make-distribution.sh.

To clean out every trace of previous builds, try also removing ~/.m2,
~/.ivy2, and ~/.sbt. That's somewhat time-consuming, because all external
artifacts will be downloaded again, but it does the trick.

Kind regards,
Eugene


On Thu, Nov 28, 2013 at 7:18 PM, Nathan Kronenfeld <
nkronenfeld@oculusinfo.com> wrote:

> [original message quoted in full; snipped]

Re: build problem

Posted by Prashant Sharma <sc...@gmail.com>.
IMHO, cleaning and rebuilding might help.
To do so, run `sbt/sbt clean` on the command line. Additionally, on
Linux you can (I prefer doing this):

find . -name "target" -type d -exec rm -r {} +



On Thu, Nov 28, 2013 at 8:48 PM, Nathan Kronenfeld <
nkronenfeld@oculusinfo.com> wrote:

> [original message quoted in full; snipped]


