Posted to user@spark.apache.org by Dhruv Singla <dv...@gmail.com> on 2023/04/17 16:52:29 UTC

[Spark on SBT] Executor just keeps running

Hi Team
       I was trying to run Spark using `sbt console` in the terminal. I am
able to build the project successfully with build.sbt, and the following
piece of code runs fine in IntelliJ. The only issue I face when running
the same code in the terminal is that the executor keeps running and never
completes the task. I don't know whether it can't get the resources it
needs or something else is blocking it.
Here's the Code
```

import org.apache.spark._

val sc = new SparkContext("local[1]", "SimpleProg")
val nums = sc.parallelize(List(1, 2, 3, 4))
println(nums.reduce((a, b) => a - b))

```

I've attached a file that contains the errors that show up when I manually
stop the program using `Ctrl+C`.

Re: [Spark on SBT] Executor just keeps running

Posted by Dhruv Singla <dv...@gmail.com>.
You can reproduce the behavior in ordinary Scala code if you keep the
reduce function in an object outside the main method. Hope it might help.
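A rough, self-contained sketch of what I mean (all names here - `CaptureDemo`, `Outer`, `offset`, `addOffset` - are made up, and plain Java serialization stands in for what Spark does when it ships a task closure): a function value defined as a member of an enclosing object closes over that object, so serializing just the function drags the whole object along with it.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Illustrative sketch only: Java serialization stands in for Spark
// serializing a task closure.
object CaptureDemo {
  class Outer { // deliberately NOT Serializable
    val offset = 10
    // This function value reads an instance field, so it closes over
    // the enclosing Outer instance, not just the number 10.
    val addOffset: Int => Int = x => x + offset
  }

  def serialize(obj: AnyRef): Unit = {
    val out = new ObjectOutputStream(new ByteArrayOutputStream())
    try out.writeObject(obj) finally out.close()
  }

  def main(args: Array[String]): Unit = {
    val f = new Outer().addOffset
    try {
      serialize(f)
      println("closure serialized on its own")
    } catch {
      case e: NotSerializableException =>
        println(s"closure dragged in the enclosing object: ${e.getMessage}")
    }
  }
}
```

Keeping the function inside main (or inside a method) instead of at object level avoids capturing the enclosing instance.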

On Mon, Apr 17, 2023 at 10:22 PM Dhruv Singla <dv...@gmail.com> wrote:

> Hi Team
>        I was trying to run Spark using `sbt console` in the terminal. I am
> able to build the project successfully with build.sbt, and the following
> piece of code runs fine in IntelliJ. The only issue I face when running
> the same code in the terminal is that the executor keeps running and never
> completes the task. I don't know whether it can't get the resources it
> needs or something else is blocking it.
> Here's the Code
> ```
>
> import org.apache.spark._
>
> val sc = new SparkContext("local[1]", "SimpleProg")
> val nums = sc.parallelize(List(1, 2, 3, 4))
> println(nums.reduce((a, b) => a - b))
>
> ```
>
> I've attached a file that contains the errors that show up when I manually
> stop the program using `Ctrl+C`.
>