Posted to dev@spark.apache.org by bo zhao <zh...@gmail.com> on 2021/12/13 03:24:23 UTC

Hi Team, I put a UDF-Utils jar on Google Cloud Storage, but I can't run it

Hi Team,
I'm migrating from on-prem to Google Cloud Platform. I have a UDF-Utils
jar containing several UDF functions. On-prem, code running on Spark just
needed "add jar hdfs://x/home/path/UDF-Utils.jar" in Spark SQL. Now I have
put this jar on Cloud Storage, but when I run Spark SQL with
"add jar gs://x/home/path/UDF-Utils.jar", it fails with: not supported url
schema, only support hdfs | file | ivy. Is there a solution for this? I
can't open an HDFS file system on Google Cloud Platform or put this jar on
Artifactory.
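
A common workaround for Spark versions whose ADD JAR rejects gs:// URIs (a
sketch, not from this thread; the bucket and local paths below are
illustrative) is to stage the jar on the driver's local filesystem with
gsutil and then add it via the supported file:// scheme:

```shell
# Illustrative workaround: stage the jar locally, then add it with a
# scheme that older Spark versions accept. Bucket and paths are made up.
gsutil cp gs://my-bucket/jars/UDF-Utils.jar /tmp/UDF-Utils.jar

# Inside the spark-sql shell (or via -e), use the file:// scheme:
spark-sql -e "ADD JAR file:///tmp/UDF-Utils.jar; SHOW FUNCTIONS;"
```

This only stages the jar for that one session, so the copy step has to be
repeated (or scripted) wherever the job runs.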

Re: Hi Team, I put a UDF-Utils jar on Google Cloud Storage, but I can't run it

Posted by Mich Talebzadeh <mi...@gmail.com>.
Probably it is because you are using an older version of Spark.

This works with version 3.1.1:

gsutil ls gs://etcbucket/ojdbc8.jar
gs://etcbucket/ojdbc8.jar

spark-sql add jar gs://etcbucket/ojdbc8.jar

21/12/13 08:56:00 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
setLogLevel(newLevel).
21/12/13 08:56:05 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout
does not exist
21/12/13 08:56:05 WARN HiveConf: HiveConf of name hive.stats.retries.wait
does not exist
21/12/13 08:56:09 WARN ObjectStore: Version information not found in
metastore. hive.metastore.schema.verification is not enabled so recording
the schema version 2.3.0
21/12/13 08:56:09 WARN ObjectStore: setMetaStoreSchemaVersion called but
recording version is disabled: version = 2.3.0, comment = Set by MetaStore
hduser@10.154.15.201
Spark master: local[*], Application Id: local-1639385763388
spark-sql>
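
To double-check that the jar was actually registered in the session, Spark
SQL's LIST JAR statement can be used (a hypothetical continuation of the
session above, not output from this thread):

```shell
# Hypothetical verification step: add the jar and list the session's jars.
# The bucket name matches the example above but is illustrative.
spark-sql -e "ADD JAR gs://etcbucket/ojdbc8.jar; LIST JAR;"
```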

HTH







On Mon, 13 Dec 2021 at 03:25, bo zhao <zh...@gmail.com> wrote:

> Update. My Spark version is 2.3.
>
> bo zhao <zh...@gmail.com> wrote on Monday, 13 December 2021 at 11:24:
>
>> [original question quoted above]
>

Re: Hi Team, I put a UDF-Utils jar on Google Cloud Storage, but I can't run it

Posted by bo zhao <zh...@gmail.com>.
Update. My Spark version is 2.3.
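
If upgrading past 2.3 isn't immediately possible, one hedged alternative
(a sketch, not verified on 2.3; it assumes the GCS connector is installed
on the cluster, as on Dataproc, and the paths are illustrative) is to
attach the jar at launch time instead of with ADD JAR, since --jars paths
are resolved through the Hadoop FileSystem API:

```shell
# Sketch only, not verified on Spark 2.3. Attach the jar when starting
# the shell rather than with ADD JAR; --jars goes through Hadoop's
# FileSystem layer, which the GCS connector plugs into.
spark-sql --jars gs://my-bucket/jars/UDF-Utils.jar

# Fallback: stage the jar locally first and reference it directly.
gsutil cp gs://my-bucket/jars/UDF-Utils.jar /tmp/UDF-Utils.jar
spark-sql --jars /tmp/UDF-Utils.jar
```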

bo zhao <zh...@gmail.com> wrote on Monday, 13 December 2021 at 11:24:

> [original question quoted above]
>