Posted to issues@spark.apache.org by "Albertus Kelvin (JIRA)" <ji...@apache.org> on 2019/08/01 11:36:00 UTC
[jira] [Updated] (SPARK-28590) Add sort_stats Setter for Custom Profiler
[ https://issues.apache.org/jira/browse/SPARK-28590?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Albertus Kelvin updated SPARK-28590:
------------------------------------
Description:
When I want to use BasicProfiler with different sorters in sort_stats, I sometimes need to create a custom profiler and override the show() method only to replace the following line: stats.sort_stats("time", "cumulative").print_stats().
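For context, that line relies on Python's standard pstats module, whose sort_stats() accepts keys such as 'ncalls', 'tottime', 'cumulative', and 'name'. A minimal standalone sketch of the underlying API (plain cProfile/pstats, no Spark involved), using the sorters from this ticket:

```python
import cProfile
import io
import pstats

def work():
    # some CPU work to profile
    return sum(i * i for i in range(1000))

profiler = cProfile.Profile()
profiler.enable()
work()
profiler.disable()

buf = io.StringIO()
stats = pstats.Stats(profiler, stream=buf)
# Same call shape as in Profiler.show(), but with different sorters
stats.sort_stats("ncalls", "tottime", "name").print_stats()
print(buf.getvalue())
```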
I think it would be better if users could specify the sorters without creating a custom profiler.
I implemented the changes in PySpark only.
The setter and getter methods can be used as follows:
{code:python}
conf = SparkConf().set("spark.python.profile", "true")
# use BasicProfiler
sc = SparkContext('local', 'test', conf=conf)
sc.profiler_collector.profiler_cls.set_sort_stats_sorters(BasicProfiler, ['ncalls', 'tottime', 'name'])
{code}
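A rough sketch of how the proposed setter/getter pair could look on the Profiler class (a stubbed-down stand-in here, not the actual patch; the real class lives in python/pyspark/profiler.py), where the stored sorters would replace the hard-coded ("time", "cumulative") in show():

```python
# Hypothetical stand-in for pyspark's Profiler, showing only the
# proposed sorter state plus setter/getter; not the actual patch.
class Profiler(object):
    _sort_stats_sorters = ("time", "cumulative")  # current hard-coded default

    @classmethod
    def set_sort_stats_sorters(cls, sorters):
        # store the user's preferred sort_stats columns on this class
        cls._sort_stats_sorters = tuple(sorters)

    @classmethod
    def get_sort_stats_sorters(cls):
        return cls._sort_stats_sorters


class BasicProfiler(Profiler):
    pass


# usage mirroring the snippet above
BasicProfiler.set_sort_stats_sorters(["ncalls", "tottime", "name"])
print(BasicProfiler.get_sort_stats_sorters())  # ('ncalls', 'tottime', 'name')
```

Because the setter is a classmethod, setting sorters on BasicProfiler leaves the Profiler base default untouched.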
was:
When doing profiling using custom profilers, I sometimes need to re-create another custom profiler class when using a different sorter in *sort_stats*. I think it would be better if users could set their own sorting-column preference. To do this, I thought of creating a *setter* method in the *Profiler* class (python/pyspark/profiler.py).
FYI, I implemented the changes in PySpark only.
> Add sort_stats Setter for Custom Profiler
> -----------------------------------------
>
> Key: SPARK-28590
> URL: https://issues.apache.org/jira/browse/SPARK-28590
> Project: Spark
> Issue Type: Improvement
> Components: PySpark
> Affects Versions: 2.4.0
> Reporter: Albertus Kelvin
> Priority: Minor
>
> When I want to use BasicProfiler with different sorters in sort_stats, I sometimes need to create a custom profiler and override the show() method only to replace the following line: stats.sort_stats("time", "cumulative").print_stats().
> I think it would be better if users could specify the sorters without creating a custom profiler.
> I implemented the changes in PySpark only.
> The setter and getter methods can be used as follows:
> {code:python}
> conf = SparkConf().set("spark.python.profile", "true")
> # use BasicProfiler
> sc = SparkContext('local', 'test', conf=conf)
> sc.profiler_collector.profiler_cls.set_sort_stats_sorters(BasicProfiler, ['ncalls', 'tottime', 'name'])
> {code}
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org