Posted to user@spark.apache.org by darin <li...@foxmail.com> on 2017/03/14 02:44:53 UTC

Re: [SparkSQL] too many open files although ulimit set to 1048576

I think your ulimit setting is not taking effect for the Spark processes.
Try adding `ulimit -n 10240` to spark-env.sh
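A minimal sketch of what that addition could look like, assuming the standard Spark configuration directory ($SPARK_HOME/conf) and using the 10240 value suggested above:

```shell
# $SPARK_HOME/conf/spark-env.sh (sketch)
# Raise the per-process open-file limit for the Spark daemons launched
# through this script. Note the soft limit cannot be raised above the
# hard limit configured by the OS (e.g. in /etc/security/limits.conf).
ulimit -n 10240
```

To confirm the new limit actually applies to a running executor or worker, you can check `cat /proc/<pid>/limits` for that process, since a limit set in an interactive shell does not carry over to daemons started elsewhere.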




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-too-many-open-files-although-ulimit-set-to-1048576-tp28490p28491.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org