Posted to dev@spark.apache.org by sririshindra <sr...@gmail.com> on 2016/09/14 09:57:40 UTC

sqlContext.registerDataFrameAsTable is not working properly in pyspark 2.0

Hi,

I have a production job that registers four different dataframes as
tables in PySpark 1.6.2. When we upgraded to Spark 2.0, only three of the
four dataframes get registered; the fourth one does not. There are no code
changes whatsoever. The only change is the Spark version. When I revert to
Spark 1.6.2, the fourth dataframe is registered correctly. Did anyone face a
similar issue? Is this a bug in Spark 2.0 or is it just a compatibility issue?
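For reference, here is a minimal sketch of the pattern I mean. The dataframe
and table names below are placeholders (the real job uses different data),
and the listTables() call is just a quick way to check what actually got
registered in the current session. Note that Spark 2.0 also offers
df.createOrReplaceTempView as the newer alternative to
sqlContext.registerDataFrameAsTable:

    from pyspark.sql import SparkSession, SQLContext

    spark = SparkSession.builder.appName("register-tables-check").getOrCreate()
    sqlContext = SQLContext(spark.sparkContext)

    # Stand-ins for the four production dataframes.
    df1 = spark.createDataFrame([(1, "a")], ["id", "val"])
    df2 = spark.createDataFrame([(2, "b")], ["id", "val"])
    df3 = spark.createDataFrame([(3, "c")], ["id", "val"])
    df4 = spark.createDataFrame([(4, "d")], ["id", "val"])

    # The 1.6-era API used by the job (still available, but deprecated in 2.0).
    for name, df in [("t1", df1), ("t2", df2), ("t3", df3), ("t4", df4)]:
        sqlContext.registerDataFrameAsTable(df, name)

    # Check which temp tables are visible in this session.
    print([t.name for t in spark.catalog.listTables()])

    # Spark 2.0 replacement for registering a dataframe as a temp view:
    df4.createOrReplaceTempView("t4")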



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/sqlContext-registerDataFrameAsTable-is-not-working-properly-in-pyspark-2-0-tp18938.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org