Posted to user@spark.apache.org by Vinoth Sankar <vi...@gmail.com> on 2015/10/28 10:35:32 UTC

How to check whether my Spark jobs are parallelized or not

Hi,

I'm reading N (mostly in the thousands) files and filtering them through
Spark based on some criteria. I'm running the Spark application with two
workers (4 cores each). I enforced parallelism by calling
*sparkContext.parallelize(fileList)* in my Java code, but didn't see any
performance improvement. And I'm always getting "Active Jobs" as 1 in the
Spark UI. Am I missing anything? How do I check whether my Spark jobs are
parallelized or not?
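
For reference, here is a minimal sketch of the pattern I'm describing. The
file list and the filter predicate are placeholders for the real ones, and
the class name is made up. Passing an explicit slice count to parallelize()
and printing partitions().size() is one way to see how many tasks the
filter stage could run at once:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ParallelizeCheck {
    public static void main(String[] args) {
        // local[*] is just for testing; on the cluster, spark-submit
        // supplies the master URL instead.
        SparkConf conf = new SparkConf()
                .setAppName("ParallelizeCheck")
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Placeholder list; the real job holds thousands of file paths.
        List<String> fileList =
                Arrays.asList("/data/a.log", "/data/b.log", "/data/c.log");

        // Without the second argument, parallelize() falls back to
        // spark.default.parallelism, which may be small.
        // 8 = 2 workers x 4 cores each.
        JavaRDD<String> paths = sc.parallelize(fileList, 8);

        // The partition count is the upper bound on concurrent tasks.
        System.out.println("Partitions: " + paths.partitions().size());

        // Placeholder predicate standing in for the real criteria.
        long matches = paths.filter(path -> path.endsWith(".log")).count();
        System.out.println("Matching files: " + matches);

        sc.stop();
    }
}

As I understand it, a single action such as count() is one job, so one
"Active Job" in the UI would not by itself mean the work is serial; the
number of tasks inside that job's stage is what should reflect the
parallelism. Is that reading correct?
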

Regards
Vinoth Sankar