Posted to user@flink.apache.org by xiatao123 <ta...@udacity.com> on 2018/01/31 06:51:00 UTC

How to access JobManager and TaskManager

In the web UI, I can see this information under JobManager. How can I access
the variable job_env in my main method?

Job Manager
Configuration
env.hadoop.conf.dir	/etc/hadoop/conf
env.yarn.conf.dir	/etc/hadoop/conf
high-availability.cluster-id	application_1517362137681_0001
job_env	stage
jobmanager.rpc.address	ip-172-32-37-243.us-west-2.compute.internal
jobmanager.rpc.port	46253
jobmanager.web.port	0
taskmanager.numberOfTaskSlots	4

When the TaskManager starts, I also noticed that the same setting "job_env" is
loaded by GlobalConfiguration.

2018-01-31 01:34:54,970 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: env.yarn.conf.dir, /etc/hadoop/conf
2018-01-31 01:34:54,976 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: taskmanager.maxRegistrationDuration, 5 minutes
2018-01-31 01:34:54,979 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: high-availability.cluster-id,
application_1517362137681_0001
2018-01-31 01:34:54,979 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: env.hadoop.conf.dir, /etc/hadoop/conf
2018-01-31 01:34:54,979 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: taskmanager.numberOfTaskSlots, 4
2018-01-31 01:34:54,982 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: jobmanager.rpc.address,
ip-172-32-37-243.us-west-2.compute.internal
2018-01-31 01:34:54,982 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: job_env, stage
2018-01-31 01:34:54,982 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: jobmanager.web.port, 0
2018-01-31 01:34:54,983 INFO 
org.apache.flink.configuration.GlobalConfiguration            - Loading
configuration property: jobmanager.rpc.port, 46253

BUT, when I try to access or print out all the variables in my main method:
    import org.apache.flink.configuration.GlobalConfiguration
    import scala.collection.JavaConverters._
    val configs = GlobalConfiguration.loadConfiguration().toMap.asScala
    for ((k, v) <- configs) println(s"Configs key: $k, value: $v")
I only get these three:
Configs key: env.hadoop.conf.dir, value: /etc/hadoop/conf
Configs key: taskmanager.numberOfTaskSlots, value: 4
Configs key: env.yarn.conf.dir, value: /etc/hadoop/conf

Can anyone help?
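
For reference, a single-key lookup against the same loaded Configuration would
look roughly like the sketch below (the "unknown" fallback value is made up):

    import org.apache.flink.configuration.{Configuration, GlobalConfiguration}

    // Loads flink-conf.yaml from FLINK_CONF_DIR on the machine where this
    // main() runs; for a job submitted with "flink run" that is the client,
    // not the JobManager container.
    val conf: Configuration = GlobalConfiguration.loadConfiguration()
    // Returns the fallback when the key is missing from that file.
    val jobEnv = conf.getString("job_env", "unknown")
    println(s"job_env = $jobEnv")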




Re: How to access JobManager and TaskManager

Posted by xiatao123 <ta...@udacity.com>.
Hi Timo,
  "job_env" is a variable I passed when launching the YARN application. I just
want to access it in my Flink application's main method. There is no
documentation on how to access custom job environment variables or settings.
Thanks,
Tao
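
A minimal sketch of one possible workaround, assuming the value can also be
passed as a program argument when submitting the job (the argument name, the
default, and the use of global job parameters below are illustrative):

    import org.apache.flink.api.java.utils.ParameterTool
    import org.apache.flink.streaming.api.scala._

    def main(args: Array[String]): Unit = {
      // Assumes the job is submitted with an extra argument such as
      // "--job_env stage" appended to the "flink run" command line.
      val params = ParameterTool.fromArgs(args)
      val jobEnv = params.get("job_env", "stage") // illustrative default
      println(s"job_env = $jobEnv")

      val env = StreamExecutionEnvironment.getExecutionEnvironment
      // Optionally expose all parameters to user functions through the
      // runtime context as global job parameters.
      env.getConfig.setGlobalJobParameters(params)
      // ... build and execute the pipeline here ...
    }

This keeps the value out of flink-conf.yaml entirely and should behave the same
on YARN and standalone deployments.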




Re: How to access JobManager and TaskManager

Posted by Timo Walther <tw...@apache.org>.
I don't have this property in my locally running Flink cluster.

Which Flink version and deployment are you using? Are you sure this 
property is not set in your flink-conf.yaml?

Regards,
Timo
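
If the property is set there, it would just be a plain key/value entry in
conf/flink-conf.yaml, along the lines of the illustrative entry below:

    # conf/flink-conf.yaml -- custom entry (illustrative)
    job_env: stage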


On 1/31/18 at 7:51 AM, xiatao123 wrote:
> In the web UI, I can see this information under JobManager. How can I access
> the variable job_env in my main method?