Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/16 03:36:19 UTC
[GitHub] [spark] zhengruifeng opened a new pull request, #37530: [SPARK-40095][PYTHON] sc.uiWebUrl should not throw exception when webui is disabled
zhengruifeng opened a new pull request, #37530:
URL: https://github.com/apache/spark/pull/37530
### What changes were proposed in this pull request?
`sc.uiWebUrl` returns the URL only when the web UI is enabled; otherwise it returns `None`.
### Why are the changes needed?
spark-shell works fine with `spark.ui.enabled=False`:
```
(base) ➜ spark git:(master) bin/spark-shell --conf spark.ui.enabled=False
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/08/16 11:31:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context available as 'sc' (master = local[*], app id = local-1660620690256).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.4.0-SNAPSHOT
      /_/
Using Scala version 2.12.16 (OpenJDK 64-Bit Server VM, Java 1.8.0_342)
Type in expressions to have them evaluated.
Type :help for more information.
scala> sc.uiWebUrl
res0: Option[String] = None
```
while PySpark throws an exception:
```
(base) ➜ spark git:(master) bin/pyspark --conf spark.ui.enabled=False
Python 3.9.12 (main, Apr 5 2022, 01:52:34)
Type 'copyright', 'credits' or 'license' for more information
IPython 8.4.0 -- An enhanced Interactive Python. Type '?' for help.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/08/16 11:34:33 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 3.4.0-SNAPSHOT
      /_/
Using Python version 3.9.12 (main, Apr 5 2022 01:52:34)
[TerminalIPythonApp] WARNING | Unknown error in handling PYTHONSTARTUP file /Users/ruifeng.zheng/Dev/spark//python/pyspark/shell.py:
---------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
File ~/.dev/miniconda3/lib/python3.9/site-packages/IPython/core/shellapp.py:360, in InteractiveShellApp._exec_file(self, fname, shell_futures)
356 self.shell.safe_execfile_ipy(full_filename,
357 shell_futures=shell_futures)
358 else:
359 # default to python, even without extension
--> 360 self.shell.safe_execfile(full_filename,
361 self.shell.user_ns,
362 shell_futures=shell_futures,
363 raise_exceptions=True)
364 finally:
365 sys.argv = save_argv
File ~/.dev/miniconda3/lib/python3.9/site-packages/IPython/core/interactiveshell.py:2738, in InteractiveShell.safe_execfile(self, fname, exit_ignore, raise_exceptions, shell_futures, *where)
2736 try:
2737 glob, loc = (where + (None, ))[:2]
-> 2738 py3compat.execfile(
2739 fname, glob, loc,
2740 self.compile if shell_futures else None)
2741 except SystemExit as status:
2742 # If the call was made with 0 or None exit status (sys.exit(0)
2743 # or sys.exit() ), don't bother showing a traceback, as both of
(...)
2749 # For other exit status, we show the exception unless
2750 # explicitly silenced, but only in short form.
2751 if status.code:
File ~/.dev/miniconda3/lib/python3.9/site-packages/IPython/utils/py3compat.py:55, in execfile(fname, glob, loc, compiler)
53 with open(fname, "rb") as f:
54 compiler = compiler or compile
---> 55 exec(compiler(f.read(), fname, "exec"), glob, loc)
File ~/Dev/spark/python/pyspark/shell.py:70, in <module>
56 print(
57 r"""Welcome to
58 ____ __
(...)
64 % sc.version
65 )
66 print(
67 "Using Python version %s (%s, %s)"
68 % (platform.python_version(), platform.python_build()[0], platform.python_build()[1])
69 )
---> 70 print("Spark context Web UI available at %s" % (sc.uiWebUrl))
71 print("Spark context available as 'sc' (master = %s, app id = %s)." % (sc.master, sc.applicationId))
72 print("SparkSession available as 'spark'.")
File ~/Dev/spark/python/pyspark/context.py:583, in SparkContext.uiWebUrl(self)
572 @property
573 def uiWebUrl(self) -> str:
574 """Return the URL of the SparkUI instance started by this :class:`SparkContext`
575
576 .. versionadded:: 2.1.0
(...)
581 'http://...'
582 """
--> 583 return self._jsc.sc().uiWebUrl().get()
File ~/Dev/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py:1321, in JavaMember.__call__(self, *args)
1315 command = proto.CALL_COMMAND_NAME +\
1316 self.command_header +\
1317 args_command +\
1318 proto.END_COMMAND_PART
1320 answer = self.gateway_client.send_command(command)
-> 1321 return_value = get_return_value(
1322 answer, self.gateway_client, self.target_id, self.name)
1324 for temp_arg in temp_args:
1325 temp_arg._detach()
File ~/Dev/spark/python/pyspark/sql/utils.py:190, in capture_sql_exception.<locals>.deco(*a, **kw)
188 def deco(*a: Any, **kw: Any) -> Any:
189 try:
--> 190 return f(*a, **kw)
191 except Py4JJavaError as e:
192 converted = convert_exception(e.java_exception)
File ~/Dev/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
325 if answer[1] == REFERENCE_TYPE:
--> 326 raise Py4JJavaError(
327 "An error occurred while calling {0}{1}{2}.\n".
328 format(target_id, ".", name), value)
329 else:
330 raise Py4JError(
331 "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
332 format(target_id, ".", name, value))
Py4JJavaError: An error occurred while calling o33.get.
: java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:529)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.lang.Thread.run(Thread.java:750)
```
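The `None.get` failure above comes from calling `.get()` unconditionally on the Scala `Option` returned by `uiWebUrl()`. The guarded pattern can be sketched in plain Python; the `ScalaOption` class below is a hypothetical stand-in for the Py4J proxy object, not the real API:

```python
from typing import Optional


class ScalaOption:
    """Hypothetical stand-in for the Py4J proxy of scala.Option[String]."""

    def __init__(self, value: Optional[str] = None):
        self._value = value

    def isDefined(self) -> bool:
        return self._value is not None

    def get(self) -> str:
        if self._value is None:
            # Mirrors java.util.NoSuchElementException: None.get
            raise RuntimeError("None.get")
        return self._value


def ui_web_url(option: ScalaOption) -> Optional[str]:
    # Guard before calling .get(): return None when the web UI is disabled
    # instead of letting the empty Option raise.
    return option.get() if option.isDefined() else None


print(ui_web_url(ScalaOption()))                         # None
print(ui_web_url(ScalaOption("http://localhost:4040")))  # http://localhost:4040
```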
### Does this PR introduce _any_ user-facing change?
Yes. When the web UI is disabled, `sc.uiWebUrl` returns `None` instead of raising an exception.
### How was this patch tested?
Manually tested:
```
(base) ➜ spark git:(ui_not_exception) bin/pyspark --conf spark.ui.enabled=False
Python 3.9.12 (main, Apr 5 2022, 01:52:34)
Type 'copyright', 'credits' or 'license' for more information
IPython 8.4.0 -- An enhanced Interactive Python. Type '?' for help.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/08/16 11:35:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 3.4.0-SNAPSHOT
      /_/
Using Python version 3.9.12 (main, Apr 5 2022 01:52:34)
Spark context Web UI available at None
Spark context available as 'sc' (master = local[*], app id = local-1660620955503).
SparkSession available as 'spark'.
In [1]: sc.uiWebUrl
In [2]: sc.uiWebUrl == None
Out[2]: True
```
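With this change, caller code should treat `sc.uiWebUrl` as `Optional[str]`. A minimal sketch of that handling, using an assumed URL value rather than a live `SparkContext`:

```python
from typing import Optional


def describe_ui(ui_web_url: Optional[str]) -> str:
    # After this PR, sc.uiWebUrl is None (rather than raising) when
    # spark.ui.enabled=False, so a simple None check suffices.
    if ui_web_url is None:
        return "Web UI disabled"
    return "Web UI at %s" % ui_web_url


print(describe_ui(None))                      # Web UI disabled
print(describe_ui("http://localhost:4040"))   # Web UI at http://localhost:4040
```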
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org
[GitHub] [spark] HyukjinKwon commented on pull request #37530: [SPARK-40095][PYTHON] sc.uiWebUrl should not throw exception when webui is disabled
HyukjinKwon commented on PR #37530:
URL: https://github.com/apache/spark/pull/37530#issuecomment-1216186917
Merged to master.
[GitHub] [spark] zhengruifeng commented on pull request #37530: [SPARK-40095][PYTHON] sc.uiWebUrl should not throw exception when webui is disabled
zhengruifeng commented on PR #37530:
URL: https://github.com/apache/spark/pull/37530#issuecomment-1216224581
thank you @HyukjinKwon
[GitHub] [spark] HyukjinKwon commented on a diff in pull request #37530: [SPARK-40095][PYTHON] sc.uiWebUrl should not throw exception when webui is disabled
HyukjinKwon commented on code in PR #37530:
URL: https://github.com/apache/spark/pull/37530#discussion_r946309177
##########
python/pyspark/context.py:
##########
@@ -570,17 +570,22 @@ def applicationId(self) -> str:
return self._jsc.sc().applicationId()
@property
- def uiWebUrl(self) -> str:
+ def uiWebUrl(self) -> Optional[str]:
"""Return the URL of the SparkUI instance started by this :class:`SparkContext`
.. versionadded:: 2.1.0
+ Notes
+ -----
+ When the web ui is disabled ("spark.ui.enabled=False"), it returns None.
Review Comment:
```suggestion
When the web ui is disabled, e.g., by ``spark.ui.enabled`` set to ``False``,
it returns ``None``.
```
[GitHub] [spark] HyukjinKwon closed pull request #37530: [SPARK-40095][PYTHON] sc.uiWebUrl should not throw exception when webui is disabled
HyukjinKwon closed pull request #37530: [SPARK-40095][PYTHON] sc.uiWebUrl should not throw exception when webui is disabled
URL: https://github.com/apache/spark/pull/37530