Posted to user@spark.apache.org by ayan guha <gu...@gmail.com> on 2016/05/31 05:02:31 UTC

Spark SQL Errors

Hi

While running the Spark Thrift Server, we are getting two issues.

1. 16/05/31 14:36:18 WARN ThriftCLIService: Error executing statement:
org.apache.hive.service.cli.HiveSQLException:
org.apache.spark.sql.AnalysisException: Table not found:
sds.unhealthy_om_delta;
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:246)

The table does exist, though, and I can see it from beeline.

This error happens when querying from a front end, where the front-end
service is launched by a different user. However, we do not restrict read
access for anybody.
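
For reference, a minimal sketch of checking from the Spark 1.6 side whether
the HiveContext behind the Thrift Server can resolve the table at all. The
context setup and app name below are illustrative assumptions, not code from
this thread:

    // Illustration only: list what the metastore-backed context sees in sds,
    // then describe the table with database and table qualified separately.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val sc = new SparkContext(new SparkConf().setAppName("table-visibility-check"))
    val sqlContext = new HiveContext(sc)

    sqlContext.tableNames("sds").foreach(println)
    sqlContext.sql("DESCRIBE EXTENDED sds.unhealthy_om_delta").collect().foreach(println)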

2. org.apache.hive.service.cli.HiveSQLException:
java.lang.RuntimeException: [1.20] failure: end of input expected

SHOW TABLES IN sds LIKE '.*'
                   ^
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:246)

It seems to be a pure Hive error, and it looks like wrong syntax. Any
suggestions on what the correct syntax is?
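
Given where the caret points (just after the database name), the 1.6 parser
appears to accept the statement once the LIKE clause is dropped. A hedged
workaround sketch, assuming a HiveContext named sqlContext as in the snippet
above; the filter pattern is only an example:

    // List the tables without the LIKE clause and filter on the client side.
    sqlContext.sql("SHOW TABLES IN sds").collect().foreach(println)

    // Or skip the SQL parser entirely; the regex here is illustrative.
    sqlContext.tableNames("sds").filter(_.matches("unhealthy.*")).foreach(println)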

Both issues come up while running a 3rd-party tool (Datameer) connecting
to the Spark Thrift Server. Spark version 1.6 on HDP 2.4.


TIA...

-- 
Best Regards,
Ayan Guha

Re: Spark SQL Errors

Posted by ayan guha <gu...@gmail.com>.
Unfortunately, I do not have it, as it is 3rd party code :(

But essentially I am trying to overwrite data in a Hive table from a source.
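
For context only, a minimal Spark 1.6 sketch of what overwriting a Hive table
from a source typically looks like; the source path, temporary table name, and
target table below are made-up placeholders, not details of the actual job:

    // Illustration only; paths and table names are placeholders.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val sc = new SparkContext(new SparkConf().setAppName("overwrite-hive-table"))
    val sqlContext = new HiveContext(sc)

    // Read some source data and expose it to SQL as a temporary table.
    val source = sqlContext.read.parquet("/placeholder/source/path")
    source.registerTempTable("source_data")

    // Overwrite the Hive table's contents from the source.
    sqlContext.sql("INSERT OVERWRITE TABLE sds.some_target_table SELECT * FROM source_data")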

On Tue, May 31, 2016 at 4:01 PM, Mich Talebzadeh <mi...@gmail.com>
wrote:

> OK, what is the exact Spark code that is causing the issue?
>
> Can you show it in its entirety?
>
> HTH


-- 
Best Regards,
Ayan Guha

Re: Spark SQL Errors

Posted by Mich Talebzadeh <mi...@gmail.com>.
OK, what is the exact Spark code that is causing the issue?

Can you show it in its entirety?

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 31 May 2016 at 06:31, ayan guha <gu...@gmail.com> wrote:

> No, there is no semicolon.
>
> This is the query:
>
> 16/05/31 14:34:29 INFO SparkExecuteStatementOperation: Running query
> 'DESCRIBE EXTENDED `sds.unhealthy_om_delta`' with
> e24282a8-43d1-4c3a-a3f3-2645761ed40f
>

Re: Spark SQL Errors

Posted by ayan guha <gu...@gmail.com>.
No, there is no semicolon.

This is the query:

16/05/31 14:34:29 INFO SparkExecuteStatementOperation: Running query
'DESCRIBE EXTENDED `sds.unhealthy_om_delta`' with
e24282a8-43d1-4c3a-a3f3-2645761ed40f
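
One detail worth noting in that log line: the backticks wrap the whole
sds.unhealthy_om_delta string. Depending on the parser, that can be read as a
single table name containing a dot rather than as database plus table, which
would line up with the "Table not found" message. A hedged comparison,
assuming a HiveContext named sqlContext:

    // Database and table quoted separately: resolves against the sds database.
    sqlContext.sql("DESCRIBE EXTENDED `sds`.`unhealthy_om_delta`").collect().foreach(println)

    // Whole qualified name inside one pair of backticks, as in the logged query;
    // whether this resolves may depend on the Spark/Hive parser version in use.
    sqlContext.sql("DESCRIBE EXTENDED `sds.unhealthy_om_delta`").collect().foreach(println)

If the first form works against the Thrift Server while the second fails, the
client tool's quoting, rather than permissions, would be the thing to chase.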


On Tue, May 31, 2016 at 3:10 PM, Raju Bairishetti <ra...@gmail.com>
wrote:

>
>
> On Tue, May 31, 2016 at 1:02 PM, ayan guha <gu...@gmail.com> wrote:
>
>> Hi
>>
>> While running the Spark Thrift Server, we are getting two issues.
>>
>> 1. 16/05/31 14:36:18 WARN ThriftCLIService: Error executing statement:
>> org.apache.hive.service.cli.HiveSQLException:
>> org.apache.spark.sql.AnalysisException: Table not found:
>> sds.unhealthy_om_delta;
>>
>
> Are you using a *;* (semicolon) at the end of the query, like
> *sqlContext.sql("query;")*? You should not put a *;* at the end of the
> query.


-- 
Best Regards,
Ayan Guha

Re: Spark SQL Errors

Posted by Raju Bairishetti <ra...@gmail.com>.
On Tue, May 31, 2016 at 1:02 PM, ayan guha <gu...@gmail.com> wrote:

> Hi
>
> While running the Spark Thrift Server, we are getting two issues.
>
> 1. 16/05/31 14:36:18 WARN ThriftCLIService: Error executing statement:
> org.apache.hive.service.cli.HiveSQLException:
> org.apache.spark.sql.AnalysisException: Table not found:
> sds.unhealthy_om_delta;
>

Are you using a *;* (semicolon) at the end of the query, like
*sqlContext.sql("query;")*? You should not put a *;* at the end of the query.
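
For illustration, a minimal sketch of that difference (the HiveContext name is
an assumption, not from this thread):

    // Without a trailing semicolon the statement parses as expected.
    sqlContext.sql("DESCRIBE EXTENDED sds.unhealthy_om_delta")

    // A trailing semicolon inside the string can trip the parser, so avoid:
    // sqlContext.sql("DESCRIBE EXTENDED sds.unhealthy_om_delta;")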


-- 
Thanks,
Raju Bairishetti,

www.lazada.com