Posted to issues@spark.apache.org by "Srinivas Rishindra Pothireddi (JIRA)" <ji...@apache.org> on 2016/09/14 15:36:21 UTC

[jira] [Updated] (SPARK-17538) sqlContext.registerDataFrameAsTable is not working sometimes in pyspark 2.0.0

     [ https://issues.apache.org/jira/browse/SPARK-17538?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Srinivas Rishindra Pothireddi updated SPARK-17538:
--------------------------------------------------
    Summary: sqlContext.registerDataFrameAsTable is not working sometimes in pyspark 2.0.0  (was: sqlContext.registerDataFrameAsTable is not working sometimes in spark 2.0)

> sqlContext.registerDataFrameAsTable is not working sometimes in pyspark 2.0.0
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-17538
>                 URL: https://issues.apache.org/jira/browse/SPARK-17538
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0, 2.0.1, 2.1.0
>         Environment: OS - Linux
> cluster - YARN and local
>            Reporter: Srinivas Rishindra Pothireddi
>            Priority: Critical
>             Fix For: 2.0.1, 2.1.0
>
>
> I have a production job in Spark 1.6.2 that registers four DataFrames as tables. After running the same job on Spark 2.0.0, one of the DataFrames is no longer registered as a table.
> The output of sqlContext.tableNames() immediately after registering the fourth DataFrame in Spark 1.6.2 is
> temp1,temp2,temp3,temp4
> The output of sqlContext.tableNames() immediately after registering the fourth DataFrame in Spark 2.0.0 is
> temp1,temp2,temp3
> So when the table temp4 is used by the job at a later stage, an AnalysisException is raised in Spark 2.0.0.
> There are no changes in the code whatsoever.
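
A minimal PySpark sketch of the flow described above (the data, column names, and app name are placeholders; the job's actual code is not included in the report):

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="spark-17538-repro")  # hypothetical app name
    sqlContext = SQLContext(sc)

    # Stand-in DataFrames; the report does not show the job's real data.
    for name in ("temp1", "temp2", "temp3", "temp4"):
        df = sqlContext.createDataFrame([(1, "a")], ["id", "val"])
        sqlContext.registerDataFrameAsTable(df, name)

    # Per the report: Spark 1.6.2 lists all four names here, while
    # Spark 2.0.0 lists only temp1, temp2, temp3.
    print(sorted(sqlContext.tableNames()))

    # Any later reference to the missing table then raises AnalysisException
    # on Spark 2.0.0:
    sqlContext.sql("SELECT * FROM temp4").show()

Note that in Spark 2.0, registerDataFrameAsTable is deprecated in favor of df.createOrReplaceTempView(name), so checking whether the newer API shows the same behavior may help isolate whether the registration call or the catalog listing is at fault.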



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org