Posted to user@spark.apache.org by KhajaAsmath Mohammed <md...@gmail.com> on 2016/05/09 17:33:34 UTC

DataFrame cannot find temporary table

Hi,

I have created a dataframe with the code below and I was able to print the
schema, but unfortunately I cannot pull any data from the temporary table.
The query always fails saying the table is not found:

    val df = convertRDDToDF(records, mapper, errorRecords, sparkContext)
    import sqlContext._
    df.printSchema()
    df.registerTempTable("person")
    val personRecords = sqlContext.sql("select * from person")
    personRecords.foreach(println)

Schema Output:
root
 |-- address: struct (nullable = true)
 |    |-- city: string (nullable = true)
 |    |-- line1: string (nullable = true)
 |    |-- state: string (nullable = true)
 |    |-- zip: string (nullable = true)
 |-- first: string (nullable = true)
 |-- last: string (nullable = true)

Error while accessing the table:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Table
not found: person;

Does anyone have a solution for this?

Thanks,
Asmath

Re: DataFrame cannot find temporary table

Posted by Takeshi Yamamuro <li...@gmail.com>.
Hi,

What's `convertRDDToDF`?
It looks like you are using a different `SQLContext` for the table
registration than for the query.
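
A temp table registered through one `SQLContext` is only visible to that same
instance. A minimal sketch of keeping a single context for both steps (Spark
1.x API, with hypothetical sample data standing in for the poster's
`convertRDDToDF` helper):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(
  new SparkConf().setAppName("person-demo").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)  // the one context used everywhere
import sqlContext.implicits._

// Hypothetical stand-in for convertRDDToDF: build the DataFrame with the
// same sqlContext that will later run the SQL query.
val df = sc.parallelize(Seq(("John", "Doe"), ("Jane", "Roe")))
           .toDF("first", "last")

df.registerTempTable("person")
sqlContext.sql("select * from person").collect().foreach(println)
```

If `convertRDDToDF` creates its own `SQLContext` internally, the table ends up
in that context's catalog and the outer `sqlContext.sql(...)` cannot see it.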

//maropu


On Tue, May 10, 2016 at 2:46 AM, Mich Talebzadeh <mi...@gmail.com>
wrote:

> Have you created sqlContext based on HiveContext?
> ...


-- 
---
Takeshi Yamamuro

Re: DataFrame cannot find temporary table

Posted by Mich Talebzadeh <mi...@gmail.com>.
Have you created sqlContext based on HiveContext?


  val sc = new SparkContext(conf)
  // Create sqlContext based on HiveContext
  val sqlContext = new HiveContext(sc)
  import sqlContext.implicits._

df.registerTempTable("person")
...............
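
Whether a plain SQLContext or a HiveContext is used, the essential point is
that registration and query must go through the same instance: in Spark 1.x
each context keeps its own temp-table catalog. A hedged sketch of the failure
mode (assuming `sc` is an existing SparkContext):

```scala
import org.apache.spark.sql.SQLContext

// Two separate contexts on the same SparkContext each keep their own
// temp-table catalog in Spark 1.x.
val ctxA = new SQLContext(sc)
val ctxB = new SQLContext(sc)

import ctxA.implicits._
val df = sc.parallelize(Seq(("John", "Doe"))).toDF("first", "last")

df.registerTempTable("person")           // registered in ctxA's catalog only

ctxA.sql("select * from person").show()  // finds the table
// ctxB.sql("select * from person")      // AnalysisException: Table not found: person
```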






Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com



On 9 May 2016 at 18:33, KhajaAsmath Mohammed <md...@gmail.com>
wrote:

> Hi,
> ...