Posted to user@spark.apache.org by klrmowse <kl...@gmail.com> on 2018/04/16 21:09:35 UTC
[Spark 2.x Core] Job writing out an extra empty part-0000* file
The Spark job succeeds (and produces correct output), except there is always an
extra part-0000* file, and it is empty...
I even set the number of partitions to only 2 via spark-submit, but a third,
empty part-file still shows up.
Why does it do that, and how do I fix it?
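For context on what maps to what: each partition of the final RDD/DataFrame produces exactly one part-* file on write, even when that partition holds no rows, so an empty partition yields an empty file. A minimal way to check this (a sketch in spark-shell; the names `inputPath` and `outputPath` are placeholders, not from the original job):

```scala
// Sketch, assuming a spark-shell session where `spark` is the SparkSession
// and `inputPath` / `outputPath` are placeholder paths.
val rdd = spark.sparkContext.textFile(inputPath)

// The partition count can exceed the number of partitions that actually
// hold data (e.g. when parallelism is higher than the number of splits).
println(rdd.getNumPartitions)

// saveAsTextFile writes one part-* file per partition -- including one
// (empty) file for each empty partition.
rdd.saveAsTextFile(outputPath)
```

If `getNumPartitions` is higher than the number of non-empty part files you expect, the extra empty file is almost certainly an empty partition being written out.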
Thank you
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org
Re: [Spark 2.x Core] Job writing out an extra empty part-0000* file
Posted by klrmowse <kl...@gmail.com>.
Well... it turns out that the extra part-0000* file goes away when I limit
--num-executors to 1 or 2 (leaving it at the default maxes it out, which in
turn produces an extra empty part-file).
I guess the test data I'm using only requires that many executors.
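That matches the usual explanation: with more parallelism than data, one task ends up with an empty partition, and the writer still emits a part-* file for it. Rather than tuning executor count to the data, you can coalesce before the write so the result is the same regardless of cluster size. A hedged sketch, assuming a DataFrame named `results` and a placeholder `outputPath`:

```scala
// Sketch; `results` (a DataFrame) and `outputPath` are assumed names.
// coalesce() merges partitions without a full shuffle, so any empty
// partition is folded into a non-empty one before the write.
results
  .coalesce(2)        // at most 2 part-* files, none empty (given >= 2 rows)
  .write
  .text(outputPath)
```

Note that `coalesce(n)` only reduces the partition count; if you instead need to increase it or rebalance skewed data, `repartition(n)` does a full shuffle.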