Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/09/18 04:22:00 UTC
[jira] [Resolved] (SPARK-22043) Python profile, show_profiles() and dump_profiles(), should throw an error with a better message
[ https://issues.apache.org/jira/browse/SPARK-22043?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-22043.
----------------------------------
Resolution: Fixed
Fix Version/s: 2.1.2, 2.3.0, 2.2.1
Issue resolved by pull request 19260
[https://github.com/apache/spark/pull/19260]
> Python profile, show_profiles() and dump_profiles(), should throw an error with a better message
> ------------------------------------------------------------------------------------------------
>
> Key: SPARK-22043
> URL: https://issues.apache.org/jira/browse/SPARK-22043
> Project: Spark
> Issue Type: Improvement
> Components: PySpark
> Affects Versions: 2.3.0
> Reporter: Hyukjin Kwon
> Priority: Trivial
> Fix For: 2.2.1, 2.3.0, 2.1.2
>
>
> I mistakenly ran profiling today without {{spark.python.profile}} enabled, and met these unfriendly messages:
> {code}
> >>> sc.show_profiles()
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File ".../spark/python/pyspark/context.py", line 1000, in show_profiles
>     self.profiler_collector.show_profiles()
> AttributeError: 'NoneType' object has no attribute 'show_profiles'
> >>> sc.dump_profiles("/tmp/abc")
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File ".../spark/python/pyspark/context.py", line 1005, in dump_profiles
>     self.profiler_collector.dump_profiles(path)
> AttributeError: 'NoneType' object has no attribute 'dump_profiles'
> {code}
> It looks like we should give a better message saying that {{spark.python.profile}} should be enabled.
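A minimal sketch of the kind of guard such a fix would add: check whether the profiler collector exists before delegating, and raise an explicit error naming the missing configuration. The class and method names below mirror the traceback above but are simplified stand-ins, not the actual patch in pull request 19260.

```python
class _ProfilerCollectorStub:
    """Stand-in for the collector that exists only when profiling is on."""

    def show_profiles(self):
        print("(profile output would appear here)")

    def dump_profiles(self, path):
        print("(profiles would be dumped to %s)" % path)


class SparkContextSketch:
    """Illustrative sketch of the relevant part of pyspark.context.SparkContext."""

    _ERROR = ("'spark.python.profile' configuration must be set to 'true' "
              "to enable Python profiling.")

    def __init__(self, profile_enabled=False):
        # profiler_collector is only created when spark.python.profile=true;
        # otherwise it stays None, which previously caused the raw AttributeError.
        self.profiler_collector = (
            _ProfilerCollectorStub() if profile_enabled else None)

    def show_profiles(self):
        if self.profiler_collector is None:
            raise RuntimeError(self._ERROR)
        self.profiler_collector.show_profiles()

    def dump_profiles(self, path):
        if self.profiler_collector is None:
            raise RuntimeError(self._ERROR)
        self.profiler_collector.dump_profiles(path)
```

With profiling disabled, the user now sees a RuntimeError that names the configuration to flip, instead of an opaque `'NoneType' object has no attribute ...`.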
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org