Posted to user@spark.apache.org by Rahul Palamuttam <ra...@gmail.com> on 2015/07/27 20:38:29 UTC

Spark build/sbt assembly

Hi All,

I hope this is the right place to post troubleshooting questions.
I've been following the install instructions, and I get the following error
when running this command from the Spark home directory:

$./build/sbt
Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
Attempting to fetch sbt
Launching sbt from build/sbt-launch-0.13.7.jar
Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

However, when I run sbt assembly it compiles with a couple of warnings, but
it works nonetheless.
Is the build/sbt script deprecated? I notice that it works on one node but
fails with the above error on another.
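
In case it helps with diagnosis, here is a quick way to check whether the
launcher jar is actually a valid archive rather than, say, an HTML error
page saved during a failed download. Just a sketch using standard tools,
run from the Spark home directory:

$ file build/sbt-launch-0.13.7.jar      # a good launcher reports Zip/Java archive data
$ unzip -t build/sbt-launch-0.13.7.jar  # a corrupt or partial download fails this test
$ ls -l build/sbt-launch-0.13.7.jar     # a suspiciously small file suggests a truncated fetch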

Thanks,

Rahul P



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-build-sbt-assembly-tp24012.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Spark build/sbt assembly

Posted by Rahul Palamuttam <ra...@gmail.com>.
So just to clarify: I have 4 nodes, all of which use Java 8.
Only one of them is able to successfully execute the build/sbt assembly
command; on the other 3 I get the error.

If I run sbt assembly from the Spark home directory, it works and I'm able
to launch the master and worker processes.
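
Since one node works, it may be worth comparing the launcher jar across
nodes. A sketch, assuming SSH access and that SPARK_HOME points at the
Spark checkout on each node (node1 through node4 are placeholder hostnames):

$ for host in node1 node2 node3 node4; do
    ssh "$host" 'md5sum "$SPARK_HOME/build/sbt-launch-0.13.7.jar"'
  done

If the checksums differ, the failing nodes most likely have a corrupted
download rather than a deprecated script.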


Re: Spark build/sbt assembly

Posted by Rahul Palamuttam <ra...@gmail.com>.
All nodes are using Java 8.
I've tried to mimic the environments as closely as possible across all nodes.



Re: Spark build/sbt assembly

Posted by Ted Yu <yu...@gmail.com>.
bq. on one node it works but on the other it gives me the above error.

Can you tell us the difference between the environments on the two nodes?
Does the other node use Java 8?
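
If it helps, something like the following run on each node would surface
obvious differences (just a sketch; output will vary per node):

$ java -version
$ echo "$JAVA_HOME"
$ ls -l build/sbt-launch-0.13.7.jar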

Cheers


Re: Spark build/sbt assembly

Posted by Rahul Palamuttam <ra...@gmail.com>.
Hi Akhil,

Yes, I did try removing it and building again.
However, that jar keeps getting recreated whenever I run ./build/sbt
assembly.
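
That is consistent with the build/sbt script re-fetching the launcher when
it is missing (the "Attempting to fetch sbt" line in the output). One
workaround sketch is to copy the known-good jar from the working node
instead of letting the failing nodes download it again (goodnode and
/path/to/spark are placeholders):

$ rm build/sbt-launch-0.13.7.jar
$ scp goodnode:/path/to/spark/build/sbt-launch-0.13.7.jar build/
$ ./build/sbt assembly

If the copied jar works, the downloads on the failing nodes are being
corrupted in transit; a proxy or firewall is a common culprit.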

Thanks,

Rahul P


Re: Spark build/sbt assembly

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Did you try removing this jar? build/sbt-launch-0.13.7.jar
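
Something like this, from the Spark home directory (a minimal sketch; the
script should then attempt a fresh fetch of the launcher):

$ rm build/sbt-launch-0.13.7.jar
$ ./build/sbt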

Thanks
Best Regards

>