Posted to user@spark.apache.org by Zahid Rahman <za...@gmail.com> on 2020/02/29 17:23:38 UTC

configuration error

Hi,

I am running it on linux.
Is there a programmatic way to get rid of this error, or is it a
configuration file error?
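On Linux this BindException usually means the machine's hostname does not
resolve to a usable address. A quick check, plus a hedged workaround for
local runs (SPARK_LOCAL_IP is Spark's standard environment override for the
bind address):

```shell
# Check whether this host's name resolves; "Cannot assign requested address"
# from the Spark driver often means it does not (e.g. no /etc/hosts entry).
getent hosts "$(hostname)" || echo "hostname '$(hostname)' does not resolve"

# Workaround for local runs: bind the driver to loopback before launching.
export SPARK_LOCAL_IP=127.0.0.1
```

Adding a line for the machine's hostname to /etc/hosts fixes the same
problem without any environment change.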

zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
mvn exec:java -Dexec.mainClass=com.javacodegeek.examples.SparkExampleRDD
-Dexec.args="input.txt"
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model
for javacodegeek:examples:jar:1.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for
org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 12,
column 21
[WARNING]
[WARNING] It is highly recommended to fix these problems because they
threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support
building such malformed projects.
[WARNING]
[INFO]
[INFO] -----------------------< javacodegeek:examples
>------------------------
[INFO] Building examples 1.0-SNAPSHOT
[INFO] --------------------------------[ jar
]---------------------------------
[INFO]
[INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ examples ---
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform
(file:/home/zahid/.m2/repository/org/apache/spark/spark-unsafe_2.12/2.4.5/spark-unsafe_2.12-2.4.5.jar)
to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of
org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal
reflective access operations
WARNING: All illegal access operations will be denied in a future release
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
20/02/29 17:20:40 INFO SparkContext: Running Spark version 2.4.5
20/02/29 17:20:40 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
20/02/29 17:20:41 INFO SparkContext: Submitted application: Word Count
20/02/29 17:20:41 INFO SecurityManager: Changing view acls to: zahid
20/02/29 17:20:41 INFO SecurityManager: Changing modify acls to: zahid
20/02/29 17:20:41 INFO SecurityManager: Changing view acls groups to:
20/02/29 17:20:41 INFO SecurityManager: Changing modify acls groups to:
20/02/29 17:20:41 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users  with view permissions: Set(zahid);
groups with view permissions: Set(); users  with modify permissions:
Set(zahid); groups with modify permissions: Set()
20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
[warning repeated 16 times as the driver retried ports; repeats trimmed]
20/02/29 17:20:41 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service
'sparkDriver' failed after 16 retries (on a random free port)! Consider
explicitly setting the appropriate binding address for the service
'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
correct binding address.
        at java.base/sun.nio.ch.Net.bind0(Native Method)
        at java.base/sun.nio.ch.Net.bind(Net.java:469)
        at java.base/sun.nio.ch.Net.bind(Net.java:458)
        at
java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
        at
io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:132)
        at
io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:551)
        at
io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1346)
        at
io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:503)
        at
io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:488)
        at
io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:985)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:247)
        at
io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:344)
        at
io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
        at
io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:510)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:518)
        at
io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
        at
io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at
io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:830)
20/02/29 17:20:41 INFO SparkContext: Successfully stopped SparkContext
[WARNING]
java.net.BindException: Cannot assign requested address: Service
'sparkDriver' failed after 16 retries (on a random free port)! Consider
explicitly setting the appropriate binding address for the service
'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
correct binding address.
    at sun.nio.ch.Net.bind0 (Native Method)
    at sun.nio.ch.Net.bind (Net.java:469)
    at sun.nio.ch.Net.bind (Net.java:458)
    at sun.nio.ch.ServerSocketChannelImpl.bind
(ServerSocketChannelImpl.java:220)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind
(NioServerSocketChannel.java:132)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind
(AbstractChannel.java:551)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind
(DefaultChannelPipeline.java:1346)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind
(AbstractChannelHandlerContext.java:503)
    at io.netty.channel.AbstractChannelHandlerContext.bind
(AbstractChannelHandlerContext.java:488)
    at io.netty.channel.DefaultChannelPipeline.bind
(DefaultChannelPipeline.java:985)
    at io.netty.channel.AbstractChannel.bind (AbstractChannel.java:247)
    at io.netty.bootstrap.AbstractBootstrap$2.run
(AbstractBootstrap.java:344)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute
(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks
(SingleThreadEventExecutor.java:510)
    at io.netty.channel.nio.NioEventLoop.run (NioEventLoop.java:518)
    at io.netty.util.concurrent.SingleThreadEventExecutor$6.run
(SingleThreadEventExecutor.java:1044)
    at io.netty.util.internal.ThreadExecutorMap$2.run
(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run
(FastThreadLocalRunnable.java:30)
    at java.lang.Thread.run (Thread.java:830)
[INFO]
------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO]
------------------------------------------------------------------------
[INFO] Total time:  3.977 s
[INFO] Finished at: 2020-02-29T17:20:43Z
[INFO]
------------------------------------------------------------------------
[ERROR] Failed to execute goal
org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project
examples: An exception occured while executing the Java class. Cannot
assign requested address: Service 'sparkDriver' failed after 16 retries (on
a random free port)! Consider explicitly setting the appropriate binding
address for the service 'sparkDriver' (for example spark.driver.bindAddress
for SparkDriver) to the correct binding address. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
mvn exec:java -Dexec.mainClass=com.javacodegeek.examples.SparkExampleRDD
-Dexec.args="input.txt" > log.out
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform
(file:/home/zahid/.m2/repository/org/apache/spark/spark-unsafe_2.12/2.4.5/spark-unsafe_2.12-2.4.5.jar)
to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of
org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal
reflective access operations
WARNING: All illegal access operations will be denied in a future release
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
20/02/29 17:21:12 INFO SparkContext: Running Spark version 2.4.5
20/02/29 17:21:12 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
20/02/29 17:21:12 INFO SparkContext: Submitted application: Word Count
20/02/29 17:21:12 INFO SecurityManager: Changing view acls to: zahid
20/02/29 17:21:12 INFO SecurityManager: Changing modify acls to: zahid
20/02/29 17:21:12 INFO SecurityManager: Changing view acls groups to:
20/02/29 17:21:12 INFO SecurityManager: Changing modify acls groups to:
20/02/29 17:21:12 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users  with view permissions: Set(zahid);
groups with view permissions: Set(); users  with modify permissions:
Set(zahid); groups with modify permissions: Set()
20/02/29 17:21:12 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
[warning repeated 16 times between 17:21:12 and 17:21:13; repeats trimmed]
20/02/29 17:21:13 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service
'sparkDriver' failed after 16 retries (on a random free port)! Consider
explicitly setting the appropriate binding address for the service
'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
correct binding address.
        at java.base/sun.nio.ch.Net.bind0(Native Method)
        at java.base/sun.nio.ch.Net.bind(Net.java:469)
        at java.base/sun.nio.ch.Net.bind(Net.java:458)
        at
java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
        at
io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:132)
        at
io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:551)
        at
io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1346)
        at
io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:503)
        at
io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:488)
        at
io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:985)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:247)
        at
io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:344)
        at
io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
        at
io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:510)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:518)
        at
io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
        at
io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at
io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:830)
20/02/29 17:21:13 INFO SparkContext: Successfully stopped SparkContext
zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
ls
input.txt  log.out  pom.xml  src  target
zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
cat log.out
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model
for javacodegeek:examples:jar:1.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for
org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 12,
column 21
[WARNING]
[WARNING] It is highly recommended to fix these problems because they
threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support
building such malformed projects.
[WARNING]
[INFO]
[INFO] -----------------------< javacodegeek:examples
>------------------------
[INFO] Building examples 1.0-SNAPSHOT
[INFO] --------------------------------[ jar
]---------------------------------
[INFO]
[INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ examples ---
[WARNING]
java.net.BindException: Cannot assign requested address: Service
'sparkDriver' failed after 16 retries (on a random free port)! Consider
explicitly setting the appropriate binding address for the service
'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
correct binding address.
    at sun.nio.ch.Net.bind0 (Native Method)
    at sun.nio.ch.Net.bind (Net.java:469)
    at sun.nio.ch.Net.bind (Net.java:458)
    at sun.nio.ch.ServerSocketChannelImpl.bind
(ServerSocketChannelImpl.java:220)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind
(NioServerSocketChannel.java:132)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind
(AbstractChannel.java:551)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind
(DefaultChannelPipeline.java:1346)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind
(AbstractChannelHandlerContext.java:503)
    at io.netty.channel.AbstractChannelHandlerContext.bind
(AbstractChannelHandlerContext.java:488)
    at io.netty.channel.DefaultChannelPipeline.bind
(DefaultChannelPipeline.java:985)
    at io.netty.channel.AbstractChannel.bind (AbstractChannel.java:247)
    at io.netty.bootstrap.AbstractBootstrap$2.run
(AbstractBootstrap.java:344)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute
(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks
(SingleThreadEventExecutor.java:510)
    at io.netty.channel.nio.NioEventLoop.run (NioEventLoop.java:518)
    at io.netty.util.concurrent.SingleThreadEventExecutor$6.run
(SingleThreadEventExecutor.java:1044)
    at io.netty.util.internal.ThreadExecutorMap$2.run
(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run
(FastThreadLocalRunnable.java:30)
    at java.lang.Thread.run (Thread.java:830)
[INFO]
------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO]
------------------------------------------------------------------------
[INFO] Total time:  4.949 s
[INFO] Finished at: 2020-02-29T17:21:16Z
[INFO]
------------------------------------------------------------------------
[ERROR] Failed to execute goal
org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project
examples: An exception occured while executing the Java class. Cannot
assign requested address: Service 'sparkDriver' failed after 16 retries (on
a random free port)! Consider explicitly setting the appropriate binding
address for the service 'sparkDriver' (for example spark.driver.bindAddress
for SparkDriver) to the correct binding address. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>





<http://www.backbutton.co.uk>

Re: configuration error

Posted by Zahid Rahman <za...@gmail.com>.
SOLVED
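
The thread does not record what the fix was, but the exception itself names
spark.driver.bindAddress. A minimal sketch of setting it programmatically in
a Java driver (class name hypothetical, assuming a local[*] master):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class BindAddressSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("Word Count")
                .setMaster("local[*]")
                // Bind the driver to loopback so startup does not depend on
                // the machine's hostname resolving (the usual cause of this
                // "Cannot assign requested address" BindException).
                .set("spark.driver.bindAddress", "127.0.0.1")
                .set("spark.driver.host", "127.0.0.1");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... word-count logic from the example goes here ...
        sc.stop();
    }
}
```

The same properties can also be passed on the command line, e.g.
-Dspark.driver.bindAddress=127.0.0.1, without touching the code.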


On Sat, 29 Feb 2020 at 17:23, Zahid Rahman <za...@gmail.com> wrote:

>
> Hi,
>
> I am running it on linux.
> Is there a programmatic way to get rid of this security error or is it
> configuration file error.
>
> zahid@localhost
> :~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
> mvn exec:java -Dexec.mainClass=com.javacodegeek.examples.SparkExampleRDD
> -Dexec.args="input.txt"
> [INFO] Scanning for projects...
> [WARNING]
> [WARNING] Some problems were encountered while building the effective
> model for javacodegeek:examples:jar:1.0-SNAPSHOT
> [WARNING] 'build.plugins.plugin.version' for
> org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 12,
> column 21
> [WARNING]
> [WARNING] It is highly recommended to fix these problems because they
> threaten the stability of your build.
> [WARNING]
> [WARNING] For this reason, future Maven versions might no longer support
> building such malformed projects.
> [WARNING]
> [INFO]
> [INFO] -----------------------< javacodegeek:examples
> >------------------------
> [INFO] Building examples 1.0-SNAPSHOT
> [INFO] --------------------------------[ jar
> ]---------------------------------
> [INFO]
> [INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ examples ---
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform
> (file:/home/zahid/.m2/repository/org/apache/spark/spark-unsafe_2.12/2.4.5/spark-unsafe_2.12-2.4.5.jar)
> to method java.nio.Bits.unaligned()
> WARNING: Please consider reporting this to the maintainers of
> org.apache.spark.unsafe.Platform
> WARNING: Use --illegal-access=warn to enable warnings of further illegal
> reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 20/02/29 17:20:40 INFO SparkContext: Running Spark version 2.4.5
> 20/02/29 17:20:40 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 20/02/29 17:20:41 INFO SparkContext: Submitted application: Word Count
> 20/02/29 17:20:41 INFO SecurityManager: Changing view acls to: zahid
> 20/02/29 17:20:41 INFO SecurityManager: Changing modify acls to: zahid
> 20/02/29 17:20:41 INFO SecurityManager: Changing view acls groups to:
> 20/02/29 17:20:41 INFO SecurityManager: Changing modify acls groups to:
> 20/02/29 17:20:41 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users  with view permissions: Set(zahid);
> groups with view permissions: Set(); users  with modify permissions:
> Set(zahid); groups with modify permissions: Set()
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> 20/02/29 17:20:41 ERROR SparkContext: Error initializing SparkContext.
> java.net.BindException: Cannot assign requested address: Service
> 'sparkDriver' failed after 16 retries (on a random free port)! Consider
> explicitly setting the appropriate binding address for the service
> 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
> correct binding address.
>         at java.base/sun.nio.ch.Net.bind0(Native Method)
>         at java.base/sun.nio.ch.Net.bind(Net.java:469)
>         at java.base/sun.nio.ch.Net.bind(Net.java:458)
>         at
> java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
>         at
> io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:132)
>         at
> io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:551)
>         at
> io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1346)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:503)
>         at
> io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:488)
>         at
> io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:985)
>         at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:247)
>         at
> io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:344)
>         at
> io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
>         at
> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:510)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:518)
>         at
> io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
>         at
> io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
>         at
> io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>         at java.base/java.lang.Thread.run(Thread.java:830)
> 20/02/29 17:20:41 INFO SparkContext: Successfully stopped SparkContext
> [WARNING]
> java.net.BindException: Cannot assign requested address: Service
> 'sparkDriver' failed after 16 retries (on a random free port)! Consider
> explicitly setting the appropriate binding address for the service
> 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
> correct binding address.
>     at sun.nio.ch.Net.bind0 (Native Method)
>     at sun.nio.ch.Net.bind (Net.java:469)
>     at sun.nio.ch.Net.bind (Net.java:458)
>     at sun.nio.ch.ServerSocketChannelImpl.bind
> (ServerSocketChannelImpl.java:220)
>     at io.netty.channel.socket.nio.NioServerSocketChannel.doBind
> (NioServerSocketChannel.java:132)
>     at io.netty.channel.AbstractChannel$AbstractUnsafe.bind
> (AbstractChannel.java:551)
>     at io.netty.channel.DefaultChannelPipeline$HeadContext.bind
> (DefaultChannelPipeline.java:1346)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeBind
> (AbstractChannelHandlerContext.java:503)
>     at io.netty.channel.AbstractChannelHandlerContext.bind
> (AbstractChannelHandlerContext.java:488)
>     at io.netty.channel.DefaultChannelPipeline.bind
> (DefaultChannelPipeline.java:985)
>     at io.netty.channel.AbstractChannel.bind (AbstractChannel.java:247)
>     at io.netty.bootstrap.AbstractBootstrap$2.run
> (AbstractBootstrap.java:344)
>     at io.netty.util.concurrent.AbstractEventExecutor.safeExecute
> (AbstractEventExecutor.java:163)
>     at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks
> (SingleThreadEventExecutor.java:510)
>     at io.netty.channel.nio.NioEventLoop.run (NioEventLoop.java:518)
>     at io.netty.util.concurrent.SingleThreadEventExecutor$6.run
> (SingleThreadEventExecutor.java:1044)
>     at io.netty.util.internal.ThreadExecutorMap$2.run
> (ThreadExecutorMap.java:74)
>     at io.netty.util.concurrent.FastThreadLocalRunnable.run
> (FastThreadLocalRunnable.java:30)
>     at java.lang.Thread.run (Thread.java:830)
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time:  3.977 s
> [INFO] Finished at: 2020-02-29T17:20:43Z
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal
> org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project
> examples: An exception occured while executing the Java class. Cannot
> assign requested address: Service 'sparkDriver' failed after 16 retries (on
> a random free port)! Consider explicitly setting the appropriate binding
> address for the service 'sparkDriver' (for example spark.driver.bindAddress
> for SparkDriver) to the correct binding address. -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
> mvn exec:java -Dexec.mainClass=com.javacodegeek.examples.SparkExampleRDD
> -Dexec.args="input.txt" > log.out
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform
> (file:/home/zahid/.m2/repository/org/apache/spark/spark-unsafe_2.12/2.4.5/spark-unsafe_2.12-2.4.5.jar)
> to method java.nio.Bits.unaligned()
> WARNING: Please consider reporting this to the maintainers of
> org.apache.spark.unsafe.Platform
> WARNING: Use --illegal-access=warn to enable warnings of further illegal
> reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 20/02/29 17:21:12 INFO SparkContext: Running Spark version 2.4.5
> 20/02/29 17:21:12 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 20/02/29 17:21:12 INFO SparkContext: Submitted application: Word Count
> 20/02/29 17:21:12 INFO SecurityManager: Changing view acls to: zahid
> 20/02/29 17:21:12 INFO SecurityManager: Changing modify acls to: zahid
> 20/02/29 17:21:12 INFO SecurityManager: Changing view acls groups to:
> 20/02/29 17:21:12 INFO SecurityManager: Changing modify acls groups to:
> 20/02/29 17:21:12 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users  with view permissions: Set(zahid);
> groups with view permissions: Set(); users  with modify permissions:
> Set(zahid); groups with modify permissions: Set()
> 20/02/29 17:21:12 WARN Utils: Service 'sparkDriver' could not bind on a
> random free port. You may check whether configuring an appropriate binding
> address.
> [... the same three-line WARN repeated 15 more times, once per bind retry ...]
> 20/02/29 17:21:13 ERROR SparkContext: Error initializing SparkContext.
> java.net.BindException: Cannot assign requested address: Service
> 'sparkDriver' failed after 16 retries (on a random free port)! Consider
> explicitly setting the appropriate binding address for the service
> 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
> correct binding address.
>         at java.base/sun.nio.ch.Net.bind0(Native Method)
>         at java.base/sun.nio.ch.Net.bind(Net.java:469)
>         at java.base/sun.nio.ch.Net.bind(Net.java:458)
>         at
> java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
>         at
> io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:132)
>         at
> io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:551)
>         at
> io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1346)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:503)
>         at
> io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:488)
>         at
> io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:985)
>         at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:247)
>         at
> io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:344)
>         at
> io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
>         at
> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:510)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:518)
>         at
> io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
>         at
> io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
>         at
> io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>         at java.base/java.lang.Thread.run(Thread.java:830)
> 20/02/29 17:21:13 INFO SparkContext: Successfully stopped SparkContext
> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
> ls
> input.txt  log.out  pom.xml  src  target
> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
> cat log.out
> [INFO] Scanning for projects...
> [WARNING]
> [WARNING] Some problems were encountered while building the effective
> model for javacodegeek:examples:jar:1.0-SNAPSHOT
> [WARNING] 'build.plugins.plugin.version' for
> org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 12,
> column 21
> [WARNING]
> [WARNING] It is highly recommended to fix these problems because they
> threaten the stability of your build.
> [WARNING]
> [WARNING] For this reason, future Maven versions might no longer support
> building such malformed projects.
> [WARNING]
> [INFO]
> [INFO] -----------------------< javacodegeek:examples
> >------------------------
> [INFO] Building examples 1.0-SNAPSHOT
> [INFO] --------------------------------[ jar
> ]---------------------------------
> [INFO]
> [INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ examples ---
> [WARNING]
> java.net.BindException: Cannot assign requested address: Service
> 'sparkDriver' failed after 16 retries (on a random free port)! Consider
> explicitly setting the appropriate binding address for the service
> 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
> correct binding address.
>     at sun.nio.ch.Net.bind0 (Native Method)
>     at sun.nio.ch.Net.bind (Net.java:469)
>     at sun.nio.ch.Net.bind (Net.java:458)
>     at sun.nio.ch.ServerSocketChannelImpl.bind
> (ServerSocketChannelImpl.java:220)
>     at io.netty.channel.socket.nio.NioServerSocketChannel.doBind
> (NioServerSocketChannel.java:132)
>     at io.netty.channel.AbstractChannel$AbstractUnsafe.bind
> (AbstractChannel.java:551)
>     at io.netty.channel.DefaultChannelPipeline$HeadContext.bind
> (DefaultChannelPipeline.java:1346)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeBind
> (AbstractChannelHandlerContext.java:503)
>     at io.netty.channel.AbstractChannelHandlerContext.bind
> (AbstractChannelHandlerContext.java:488)
>     at io.netty.channel.DefaultChannelPipeline.bind
> (DefaultChannelPipeline.java:985)
>     at io.netty.channel.AbstractChannel.bind (AbstractChannel.java:247)
>     at io.netty.bootstrap.AbstractBootstrap$2.run
> (AbstractBootstrap.java:344)
>     at io.netty.util.concurrent.AbstractEventExecutor.safeExecute
> (AbstractEventExecutor.java:163)
>     at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks
> (SingleThreadEventExecutor.java:510)
>     at io.netty.channel.nio.NioEventLoop.run (NioEventLoop.java:518)
>     at io.netty.util.concurrent.SingleThreadEventExecutor$6.run
> (SingleThreadEventExecutor.java:1044)
>     at io.netty.util.internal.ThreadExecutorMap$2.run
> (ThreadExecutorMap.java:74)
>     at io.netty.util.concurrent.FastThreadLocalRunnable.run
> (FastThreadLocalRunnable.java:30)
>     at java.lang.Thread.run (Thread.java:830)
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time:  4.949 s
> [INFO] Finished at: 2020-02-29T17:21:16Z
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal
> org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project
> examples: An exception occured while executing the Java class. Cannot
> assign requested address: Service 'sparkDriver' failed after 16 retries (on
> a random free port)! Consider explicitly setting the appropriate binding
> address for the service 'sparkDriver' (for example spark.driver.bindAddress
> for SparkDriver) to the correct binding address. -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
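
To answer the programmatic-vs-configuration question: this BindException is not a security error. It usually means the driver cannot bind to the address Spark derives from the machine's hostname (e.g. the hostname resolves to an address that no local interface holds). You can fix it either in configuration (`export SPARK_LOCAL_IP=127.0.0.1` before running mvn, or a correct hostname entry in /etc/hosts) or programmatically on the SparkConf. A minimal sketch of the programmatic route, assuming the example builds its SparkConf in SparkExampleRDD (class and app names are taken from the log above; the word-count body is elided):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkExampleRDD {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("Word Count")
                .setMaster("local[*]")
                // Bind the driver to loopback explicitly so SparkContext
                // does not depend on the machine hostname resolving to a
                // locally bindable address.
                .set("spark.driver.bindAddress", "127.0.0.1")
                .set("spark.driver.host", "127.0.0.1");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... word-count logic unchanged ...
        sc.stop();
    }
}
```

Both `spark.driver.bindAddress` and `spark.driver.host` are standard Spark 2.4 properties; setting only `bindAddress` is often enough on a single machine, as the error message itself suggests.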