Posted to user@spark.apache.org by jerrytim <je...@126.com> on 2017/02/13 22:23:04 UTC

Re: Spark 2.1.0 issue with spark-shell and pyspark

I ran into the same problem when my code reached "model.save(sc, path)".

Error info:
IllegalArgumentException: u"Error while instantiating
'org.apache.spark.sql.hive.HiveSessionState':"

I'm on macOS, running a Spark build prebuilt for Hadoop, with PySpark
integrated into Jupyter.
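
For reference, here is a rough sketch of the kind of code that hits this
for me (the model type and output path below are placeholders, not my
actual job):

# Rough repro sketch; model type and output path are placeholders.
from pyspark.sql import SparkSession
from pyspark.mllib.classification import LogisticRegressionWithLBFGS
from pyspark.mllib.regression import LabeledPoint

spark = SparkSession.builder.appName("model-save-repro").getOrCreate()
sc = spark.sparkContext

data = sc.parallelize([LabeledPoint(0.0, [0.0, 1.0]),
                       LabeledPoint(1.0, [1.0, 0.0])])
model = LogisticRegressionWithLBFGS.train(data)

# The IllegalArgumentException about HiveSessionState is raised here.
model.save(sc, "/tmp/lr_model")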

Does anyone have any ideas?



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-2-1-0-issue-with-spark-shell-and-pyspark-tp28339p28385.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
