Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/01/11 15:48:00 UTC
[jira] [Commented] (SPARK-23044) merge script has bug when assigning jiras to non-contributors
[ https://issues.apache.org/jira/browse/SPARK-23044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16322433#comment-16322433 ]
Apache Spark commented on SPARK-23044:
--------------------------------------
User 'squito' has created a pull request for this issue:
https://github.com/apache/spark/pull/20236
> merge script has bug when assigning jiras to non-contributors
> -------------------------------------------------------------
>
> Key: SPARK-23044
> URL: https://issues.apache.org/jira/browse/SPARK-23044
> Project: Spark
> Issue Type: Bug
> Components: Project Infra
> Affects Versions: 2.3.0
> Reporter: Imran Rashid
> Priority: Minor
>
> as reported by [~jerryshao]:
> bq. Hi Imran Rashid, looks like the changes will throw an exception when the assignee is not yet a contributor. Please see the stack.
> {noformat}
> Traceback (most recent call last):
>   File "./dev/merge_spark_pr.py", line 501, in <module>
>     main()
>   File "./dev/merge_spark_pr.py", line 487, in main
>     resolve_jira_issues(title, merged_refs, jira_comment)
>   File "./dev/merge_spark_pr.py", line 327, in resolve_jira_issues
>     resolve_jira_issue(merge_branches, comment, jira_id)
>   File "./dev/merge_spark_pr.py", line 245, in resolve_jira_issue
>     cur_assignee = choose_jira_assignee(issue, asf_jira)
>   File "./dev/merge_spark_pr.py", line 317, in choose_jira_assignee
>     asf_jira.assign_issue(issue.key, assignee.key)
>   File "/Library/Python/2.7/site-packages/jira/client.py", line 108, in wrapper
>     result = func(*arg_list, **kwargs)
> {noformat}
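> The traceback shows {{assign_issue}} raising inside {{choose_jira_assignee}} when the chosen assignee is not yet a contributor on the project. One possible shape for a guard, sketched here with a stub standing in for the real {{jira.JIRA}} client (the actual fix lives in the linked PR; {{safe_assign_issue}} and the stub are hypothetical names, not code from the script):
> {noformat}
class _Key(object):
    """Minimal stand-in for a JIRA issue/user object with a .key attribute."""
    def __init__(self, key):
        self.key = key

class _StubJira(object):
    """Stand-in for jira.JIRA; only models the failure mode in SPARK-23044."""
    def __init__(self, assignable_users):
        self.assignable = assignable_users

    def assign_issue(self, issue_key, assignee_key):
        # The real client raises when the assignee is not a contributor.
        if assignee_key not in self.assignable:
            raise RuntimeError("%s is not assignable on this project" % assignee_key)

def safe_assign_issue(asf_jira, issue, assignee):
    """Attempt the assignment; on failure, warn and let the merge continue."""
    try:
        asf_jira.assign_issue(issue.key, assignee.key)
        return True
    except Exception as e:
        print("Could not assign %s to %s: %s" % (issue.key, assignee.key, e))
        return False
{noformat}
> With this guard, a merge against a non-contributor assignee logs a warning and returns {{False}} instead of aborting the whole script.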
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)