Posted to user@spark.apache.org by Maximilien Belinga <ma...@wouri.co> on 2017/03/06 17:15:22 UTC

Spark application does not work with only one core

I am currently working to deploy two Spark applications, and I want to
restrict the cores and executors per application. My configuration is as
follows:

spark.executor.cores=1
spark.driver.cores=1
spark.cores.max=1
spark.executor.instances=1
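
For context, this is roughly how the limits are applied in the driver
(a simplified sketch; the app name is a placeholder, and the 5-second
batch interval is inferred from the log timestamps below):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Build the configuration with the per-application limits listed above.
val conf = new SparkConf()
  .setAppName("streaming-app-1")          // placeholder name
  .set("spark.executor.cores", "1")
  .set("spark.driver.cores", "1")
  .set("spark.cores.max", "1")
  .set("spark.executor.instances", "1")

// 5-second batches, matching the job timestamps in the logs below.
val ssc = new StreamingContext(conf, Seconds(5))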

Now the issue is that with this exact configuration, one streaming
application works while the other doesn't. The application that doesn't
work remains in the RUNNING state and continuously prints the following
messages in its logs:

17/03/06 10:31:50 INFO JobScheduler: Added jobs for time 1488814310000 ms
17/03/06 10:31:55 INFO JobScheduler: Added jobs for time 1488814315000 ms
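
For reference, here is a minimal sketch of the kind of job involved,
continuing from the sketch above (the socket source and the trivial
per-batch processing are stand-ins for the real ones):

// Illustrative input: socketTextStream creates a receiver, and a
// receiver runs as a long-lived task that occupies one executor core
// for the lifetime of the application.
val lines = ssc.socketTextStream("localhost", 9999)  // placeholder source

// Placeholder processing: just count each batch.
lines.foreachRDD { rdd =>
  println(s"batch count: ${rdd.count()}")
}

ssc.start()
ssc.awaitTermination()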

Surprisingly, if I change the configuration to the following, the same
application that was not working now proceeds without any problem.

spark.executor.cores=3
spark.driver.cores=1
spark.cores.max=3
spark.executor.instances=3

*Note:* The application does not work with a value of 2 either, which is
why I settled on 3.

It thus appears that some streaming applications need more cores than
others. My question is: what determines how many resources an application
needs? Why is one application unable to run with a single core when it
can run with 3 cores?
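
In case it helps with diagnosis, this is the quick check I run from the
driver to see what was actually granted (a diagnostic sketch; sc is the
SparkContext underlying the streaming context above):

// List the block managers the driver knows about (this includes the
// driver itself, so a healthy one-executor app prints two entries),
// then the default parallelism derived from the granted cores.
val sc = ssc.sparkContext
sc.getExecutorMemoryStatus.keys.foreach(e => println(s"executor: $e"))
println(s"defaultParallelism = ${sc.defaultParallelism}")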


Regards,
Maximilien.
