Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2015/06/01 03:04:17 UTC

[jira] [Commented] (SPARK-7987) TransportContext.createServer(int port) is missing in Spark 1.4

    [ https://issues.apache.org/jira/browse/SPARK-7987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14566857#comment-14566857 ] 

Reynold Xin commented on SPARK-7987:
------------------------------------

I don't think MiMa runs on the network module, which is not a public API. Why do we care about TransportContext.createServer?
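For context on why a removal could go unnoticed: MiMa (the sbt Migration Manager plugin) only reports binary-compatibility breaks for modules that are configured with a previous release artifact to diff against; a module with nothing configured is silently skipped. A hedged sketch of what that looks like in an sbt build (`mimaPreviousArtifacts` is the plugin's standard key in current plugin versions; the project layout below is illustrative, not Spark's actual build definition):

```scala
// build.sbt sketch using sbt-mima-plugin (illustrative, not Spark's build).

// Checked module: MiMa diffs the compiled jar against the 1.3.0 release
// artifact and fails the build on removed or changed public signatures.
lazy val core = (project in file("core"))
  .settings(
    mimaPreviousArtifacts := Set("org.apache.spark" %% "spark-core" % "1.3.0")
  )

// Unchecked module: with no previous artifact to compare against, MiMa has
// nothing to diff, so a method removed here is never reported.
lazy val networkCommon = (project in file("network/common"))
  .settings(
    mimaPreviousArtifacts := Set.empty
  )
```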

> TransportContext.createServer(int port) is missing in Spark 1.4
> ---------------------------------------------------------------
>
>                 Key: SPARK-7987
>                 URL: https://issues.apache.org/jira/browse/SPARK-7987
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle
>    Affects Versions: 1.4.0
>            Reporter: Patrick Wendell
>            Priority: Blocker
>
> From what I can tell the SPARK-6229 patch removed this API:
> https://github.com/apache/spark/commit/38d4e9e446b425ca6a8fe8d8080f387b08683842#diff-d9d4b8d8e82b7d96d5e779353e4b2f4eL85
> I think adding it back should be easy enough, but I cannot figure out why this didn't trigger MiMa errors. I am wondering if MiMa was not enabled properly for some of the new modules:
> /cc [~vanzin] [~rxin] and [~adav]
> I put this as a blocker-level issue because I'm wondering if we just aren't enforcing checks for some reason in some of our APIs. So I think we need to block the 1.4 release on at least making sure no other serious APIs were broken. If it turns out only this API was affected, or I'm just missing something, we can downgrade it.
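Adding a removed overload back is usually a one-line delegating method. A minimal sketch of the pattern (the class and method names follow the issue; the bodies are simplified stand-ins, not Spark's actual Netty-backed implementation):

```java
// Illustrative stand-ins for Spark's network module classes. The real
// TransportContext/TransportServer live in org.apache.spark.network; these
// placeholders only demonstrate the compatibility-preserving overload.
class TransportServer {
    private final int port;

    TransportServer(int port) {
        this.port = port;
    }

    int getPort() {
        return port;
    }
}

class TransportContext {
    // Current entry point (0 stands in here for "pick any free port").
    public TransportServer createServer() {
        return createServer(0);
    }

    // The overload reported missing in 1.4: restoring it means keeping the
    // old signature and delegating to the current code path, so callers
    // compiled against 1.3 continue to link.
    public TransportServer createServer(int port) {
        return new TransportServer(port);
    }
}

public class Main {
    public static void main(String[] args) {
        TransportServer server = new TransportContext().createServer(7077);
        System.out.println(server.getPort());
    }
}
```

The reason this matters for binary compatibility: the JVM links calls by exact method descriptor, so removing `createServer(int)` breaks already-compiled callers even if a source-compatible replacement exists; a delegating overload keeps the old descriptor in the bytecode.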



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org