Posted to user@spark.apache.org by sagi <zh...@gmail.com> on 2014/11/07 04:28:01 UTC

Is there a way to limit the sql query result size?

Hi spark-users,

When I use spark-sql or beeline to query a large dataset, the query result
can sometimes cause the driver to OOM.

So I wonder: is there a config property in Spark SQL to limit the maximum
result size returned (without adding a LIMIT clause to the SQL query)?

For example, before running the select query, I would run this in the SQL client:

set maxRows = 100;

And then all queries would have a default return limit of 100 rows.
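
One related knob that might help, though it is a byte cap rather than the
row limit described above: Spark 1.2 and later have spark.driver.maxResultSize,
which aborts a job whose collected results would exceed the limit instead of
letting the driver OOM. A minimal sketch, assuming Spark 1.2+ and the 512m
value purely as an illustration:

    # Cap the total serialized result size collected to the driver at 512 MB.
    # A query whose results exceed this fails with an error instead of
    # OOMing the driver process.
    $SPARK_HOME/bin/spark-sql --conf spark.driver.maxResultSize=512m

Note this fails the oversized query rather than truncating it to N rows, so
it is a safety guard, not the maxRows behavior sketched above.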

---------------------------------
Best Regards

Re: Is there a way to limit the sql query result size?

Posted by neeravsalaria <ne...@gmail.com>.
Hi Eric, 

We are also running into the same issue. Were you able to find a suitable
solution to this problem?
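
In case it helps anyone else on this thread: for the beeline / Thrift Server
path, one possible mitigation (a sketch, assuming a Spark build that supports
spark.sql.thriftServer.incrementalCollect) is to have the server fetch results
partition by partition instead of materializing the whole result set on the
driver at once:

    # Start the JDBC/ODBC server so query results are pulled incrementally,
    # one partition at a time, rather than collected on the driver in full.
    $SPARK_HOME/sbin/start-thriftserver.sh \
      --conf spark.sql.thriftServer.incrementalCollect=true

This trades some fetch speed for a much smaller driver memory footprint; it
still does not impose a row limit on the query itself.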

Best Regards


