Posted to dev@spark.apache.org by Hyukjin Kwon <gu...@gmail.com> on 2019/12/27 04:14:28 UTC

Re: Fail to use SparkR of 3.0 preview 2

I was randomly googling out of curiosity, and it seems that is indeed the
problem (
https://r.789695.n4.nabble.com/Error-in-rbind-info-getNamespaceInfo-env-quot-S3methods-quot-td4755490.html
).
Yes, it seems we should make sure we build SparkR with an older R version.
Since support for R prior to version 3.4 is deprecated as of Spark
3.0.0, we could use either R 3.4 or the version matching Jenkins's (R 3.1
IIRC) for the Spark 3.0 release.
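
To confirm the mismatch on an affected machine, one quick check, using base
R only, is to compare the running R version against the version recorded in
the installed package's "Built" field (this sketch assumes SparkR is
installed in the default library path):

# R version currently running
R.version.string
# e.g. "R version 3.5.2 (2018-12-20)"

# R version the package was built under, recorded by R CMD INSTALL
packageDescription("SparkR", fields = "Built")
# e.g. "R 3.6.2; ; 2019-12-26 ...; unix" - newer than the running R

If the "Built" version is newer than the running R, the rbind() failure
quoted below is the expected symptom.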

Redirecting to the dev list and to Yuming as well for visibility.

On Fri, Dec 27, 2019 at 12:02 PM, Jeff Zhang <zj...@gmail.com> wrote:

> Yes, I guess so. But R 3.6.2 was just released this month; I think we
> should use an older version to build SparkR.
>
>> On Fri, Dec 27, 2019 at 10:43 AM, Felix Cheung <fe...@hotmail.com> wrote:
>
>> Maybe it’s the reverse - the package was built to run on the latest R but
>> is not compatible with slightly older versions (3.5.2 was released in Dec
>> 2018).
>>
>> ------------------------------
>> *From:* Jeff Zhang <zj...@gmail.com>
>> *Sent:* Thursday, December 26, 2019 5:36:50 PM
>> *To:* Felix Cheung <fe...@hotmail.com>
>> *Cc:* user.spark <us...@spark.apache.org>
>> *Subject:* Re: Fail to use SparkR of 3.0 preview 2
>>
>> I use R 3.5.2
>>
>> On Fri, Dec 27, 2019 at 4:32 AM, Felix Cheung <fe...@hotmail.com> wrote:
>>
>> It looks like a change in the method signature in R base packages.
>>
>> Which version of R are you running on?
>>
>> ------------------------------
>> *From:* Jeff Zhang <zj...@gmail.com>
>> *Sent:* Thursday, December 26, 2019 12:46:12 AM
>> *To:* user.spark <us...@spark.apache.org>
>> *Subject:* Fail to use SparkR of 3.0 preview 2
>>
>> I tried SparkR from Spark 3.0 preview 2, but hit the following issue.
>>
>> Error in rbind(info, getNamespaceInfo(env, "S3methods")) :
>>   number of columns of matrices must match (see arg 2)
>> Error: package or namespace load failed for ‘SparkR’ in rbind(info,
>> getNamespaceInfo(env, "S3methods")):
>>  number of columns of matrices must match (see arg 2)
>> During startup - Warning messages:
>> 1: package ‘SparkR’ was built under R version 3.6.2
>> 2: package ‘SparkR’ in options("defaultPackages") was not found
>>
>> Does anyone know what might be wrong? Thanks
>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>
>
> --
> Best Regards
>
> Jeff Zhang
>
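
For reference, the "number of columns of matrices must match" failure quoted
above is an ordinary base-R rbind() error, reproducible without SparkR. Per
the thread linked at the top, a newer R records a namespace's registered
S3-methods metadata with a different number of columns than an older R
expects at load time, so the rbind() of the two tables fails. A minimal
illustration (the matrix shapes are made up for the example):

# Plain R, no SparkR involved: rbind() refuses matrices whose column
# counts differ, which is exactly the load-time failure above.
newer <- matrix(1:8, ncol = 4)  # stands in for metadata written by a newer R
older <- matrix(1:3, ncol = 3)  # stands in for what an older R constructs
rbind(older, newer)
# Error in rbind(older, newer) :
#   number of columns of matrices must match (see arg 2)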

Re: Fail to use SparkR of 3.0 preview 2

Posted by Xiao Li <li...@databricks.com>.
We can use R version 3.6.1 if we have a concern about the quality of 3.6.2?
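
Whichever version is settled on, a small pre-build guard could make the
build fail fast when the wrong R is picked up. A hypothetical sketch (the
version ceiling here is illustrative, not a decided value):

# Hypothetical guard for the release build: refuse to build the SparkR
# artifact with an R newer than the agreed ceiling (value illustrative).
ceiling <- "3.4.4"
if (getRversion() > ceiling) {
  stop("Building SparkR with R ", getRversion(),
       "; use R <= ", ceiling, " so the package loads on older R.")
}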

On Thu, Dec 26, 2019 at 8:14 PM Hyukjin Kwon <gu...@gmail.com> wrote:

> I was randomly googling out of curiosity, and it seems that is indeed the
> problem (
> https://r.789695.n4.nabble.com/Error-in-rbind-info-getNamespaceInfo-env-quot-S3methods-quot-td4755490.html
> ).
> Yes, it seems we should make sure we build SparkR with an older R version.
> Since support for R prior to version 3.4 is deprecated as of Spark
> 3.0.0, we could use either R 3.4 or the version matching Jenkins's (R 3.1
> IIRC) for the Spark 3.0 release.
>
> Redirecting to the dev list and to Yuming as well for visibility.
>
> On Fri, Dec 27, 2019 at 12:02 PM, Jeff Zhang <zj...@gmail.com> wrote:
>
>> Yes, I guess so. But R 3.6.2 was just released this month; I think we
>> should use an older version to build SparkR.
>>
>> On Fri, Dec 27, 2019 at 10:43 AM, Felix Cheung <fe...@hotmail.com> wrote:
>>
>>> Maybe it’s the reverse - the package was built to run on the latest R but
>>> is not compatible with slightly older versions (3.5.2 was released in Dec
>>> 2018).
>>>
>>> ------------------------------
>>> *From:* Jeff Zhang <zj...@gmail.com>
>>> *Sent:* Thursday, December 26, 2019 5:36:50 PM
>>> *To:* Felix Cheung <fe...@hotmail.com>
>>> *Cc:* user.spark <us...@spark.apache.org>
>>> *Subject:* Re: Fail to use SparkR of 3.0 preview 2
>>>
>>> I use R 3.5.2
>>>
>>> On Fri, Dec 27, 2019 at 4:32 AM, Felix Cheung <fe...@hotmail.com> wrote:
>>>
>>> It looks like a change in the method signature in R base packages.
>>>
>>> Which version of R are you running on?
>>>
>>> ------------------------------
>>> *From:* Jeff Zhang <zj...@gmail.com>
>>> *Sent:* Thursday, December 26, 2019 12:46:12 AM
>>> *To:* user.spark <us...@spark.apache.org>
>>> *Subject:* Fail to use SparkR of 3.0 preview 2
>>>
>>> I tried SparkR from Spark 3.0 preview 2, but hit the following issue.
>>>
>>> Error in rbind(info, getNamespaceInfo(env, "S3methods")) :
>>>   number of columns of matrices must match (see arg 2)
>>> Error: package or namespace load failed for ‘SparkR’ in rbind(info,
>>> getNamespaceInfo(env, "S3methods")):
>>>  number of columns of matrices must match (see arg 2)
>>> During startup - Warning messages:
>>> 1: package ‘SparkR’ was built under R version 3.6.2
>>> 2: package ‘SparkR’ in options("defaultPackages") was not found
>>>
>>> Does anyone know what might be wrong? Thanks
>>>
>>>
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>>>
>>>
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>
