Posted to user@spark.apache.org by Travis Athougies <tr...@f-omega.com.INVALID> on 2023/03/08 21:50:25 UTC

spark-submit: No "driver-" id printed in standalone mode

Hello,

I'm trying to get Airflow to work with Spark in standalone cluster mode.
I can submit jobs via spark-submit and watch them complete
successfully.

However, 'spark-submit' doesn't seem to print any driver- ID to the
console. The drivers clearly have IDs, since they are listed with them
in the Spark master UI, but spark-submit's output is rather terse.

For example, here is everything it prints:

23/03/08 13:33:42 WARN Utils: Your hostname, epsilon resolves to a
loopback address: 127.0.0.2; using 192.168.1.189 instead (on interface
enp58s0)
23/03/08 13:33:42 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
another address
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform
(file:/nix/store/vgs0ipwwkl9szmqpjawg5p08y57nq57m-spark-
3.2.2/lib/spark-3.2.2/jars/spark-unsafe_2.12-3.2.2.jar) to constructor
java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of
org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further
illegal reflective access operations
WARNING: All illegal access operations will be denied in a future
release
23/03/08 13:33:42 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where
applicable

If I enable verbose mode, I get more output on stderr, but still no
driver- ID.

The full logs are here:
https://gist.github.com/tathougies/c97a02d7e44379bd03086a9fa4ab39b7
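In the meantime, the only workaround I've found is to ask the master
directly for its driver list. This is just a sketch on my end: it
assumes the master's JSON status endpoint at http://<master>:8080/json
and an "activedrivers" field in the response, so please correct me if
those assumptions are wrong:

```python
import json
from urllib.request import urlopen

def active_driver_ids(master_url="http://localhost:8080"):
    """Fetch the standalone master's JSON status page and return the
    IDs of currently running drivers. Assumes the master exposes /json
    with an 'activedrivers' list (as the web UI at :8080 suggests)."""
    with urlopen(master_url + "/json") as resp:
        status = json.load(resp)
    return [d["id"] for d in status.get("activedrivers", [])]

# Offline demonstration with a payload shaped like what I see from my
# master (field names are my assumption, not from any docs):
sample = {"activedrivers": [{"id": "driver-20230308133342-0001",
                             "state": "RUNNING"}]}
print([d["id"] for d in sample.get("activedrivers", [])])
```

The obvious problem is that this tells me which drivers exist, not
which one corresponds to the submission I just made.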

Any ideas?

It seems to me that org.apache.spark.deploy.SparkSubmit ought to print
the driver ID as a matter of course, but from scouring various answers
online it seems the ID only ever appears in the logs.
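So for now I'm resigned to grepping the driver ID out of whatever
spark-submit writes to stderr. A sketch of what I mean: the
driver-<timestamp>-<sequence> pattern below is just what I observe in
my master UI, not anything I found documented, so it may not be
stable across versions:

```python
import re

# Driver IDs in my standalone master look like driver-<14-digit
# timestamp>-<4-digit sequence>, e.g. driver-20230308133342-0000.
DRIVER_ID_RE = re.compile(r"driver-\d{14}-\d{4}")

def extract_driver_id(log_text):
    """Return the first driver ID found in spark-submit's output,
    or None if no such pattern appears."""
    m = DRIVER_ID_RE.search(log_text)
    return m.group(0) if m else None

print(extract_driver_id("submitted as driver-20230308133342-0000"))
# → driver-20230308133342-0000
```

Scraping logs for this feels fragile, which is why a stable,
machine-readable line from spark-submit itself would be so much nicer.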

Travis

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org