Posted to user@spark.apache.org by Ascot Moss <as...@gmail.com> on 2016/07/24 23:33:25 UTC
Spark 1.6.2 version displayed as 1.6.1
Hi,
I am trying to upgrade Spark from 1.6.1 to 1.6.2, but in the 1.6.2 spark-shell the
version is still displayed as 1.6.1.
Is this a minor typo/bug?
Regards
###
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/
Re: Spark 1.6.2 version displayed as 1.6.1
Posted by Krishna Sankar <ks...@gmail.com>.
This intrigued me as well.
- Just to be sure, I downloaded the 1.6.2 code and recompiled.
- spark-shell and pyspark both show 1.6.2 as expected.
Cheers
Re: Spark 1.6.2 version displayed as 1.6.1
Posted by Daniel Darabos <da...@lynxanalytics.com>.
Another possible explanation is that you are accidentally still running
Spark 1.6.1. Which download are you using? This is what I see:
$ ~/spark-1.6.2-bin-hadoop2.6/bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.2
      /_/
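The "accidentally still running the old install" failure mode is easy to reproduce: whichever directory comes first on PATH wins. The sketch below demonstrates it with two stub launchers standing in for real Spark installs (the /tmp paths and echoed version strings are fabricated for illustration, not taken from the thread):

```shell
# Create two fake Spark installs; each stub just reports its version.
mkdir -p /tmp/spark-1.6.1/bin /tmp/spark-1.6.2/bin
printf '#!/bin/sh\necho version 1.6.1\n' > /tmp/spark-1.6.1/bin/spark-shell
printf '#!/bin/sh\necho version 1.6.2\n' > /tmp/spark-1.6.2/bin/spark-shell
chmod +x /tmp/spark-1.6.1/bin/spark-shell /tmp/spark-1.6.2/bin/spark-shell

# With the 1.6.1 directory listed first, its launcher shadows the 1.6.2 one.
PATH="/tmp/spark-1.6.1/bin:/tmp/spark-1.6.2/bin" command -v spark-shell
PATH="/tmp/spark-1.6.1/bin:/tmp/spark-1.6.2/bin" spark-shell
```

On a real machine, `command -v spark-shell` shows which launcher a bare `spark-shell` actually resolves to; invoking the new install by its full path, as in the transcript above, sidesteps the problem entirely.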
Re: Spark 1.6.2 version displayed as 1.6.1
Posted by Sean Owen <so...@cloudera.com>.
Are you certain? It looks like it was correct in the release:
https://github.com/apache/spark/blob/v1.6.2/core/src/main/scala/org/apache/spark/package.scala
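The check Sean points at can be done locally by grepping the version constant in a source checkout. The sketch below fabricates a stand-in copy of the file so the command is demonstrable without a checkout; the constant name SPARK_VERSION matches the linked package.scala, but the directory under /tmp is purely illustrative:

```shell
# Stand-in for a Spark source tree (assumption: real checkout has this layout).
mkdir -p /tmp/spark-v1.6.2/core/src/main/scala/org/apache/spark
echo 'val SPARK_VERSION = "1.6.2"' \
  > /tmp/spark-v1.6.2/core/src/main/scala/org/apache/spark/package.scala

# In a real checkout, this grep confirms what version the build will report.
grep SPARK_VERSION /tmp/spark-v1.6.2/core/src/main/scala/org/apache/spark/package.scala
```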
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org