Posted to dev@predictionio.apache.org by "Donald Szeto (JIRA)" <ji...@apache.org> on 2017/04/03 20:09:41 UTC

[jira] [Updated] (PIO-36) Use Spark standalone cluster in integration tests

     [ https://issues.apache.org/jira/browse/PIO-36?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Donald Szeto updated PIO-36:
----------------------------
    Labels: gsoc2017  (was: )

> Use Spark standalone cluster in integration tests
> -------------------------------------------------
>
>                 Key: PIO-36
>                 URL: https://issues.apache.org/jira/browse/PIO-36
>             Project: PredictionIO
>          Issue Type: Bug
>            Reporter: Marcin ZiemiƄski
>            Priority: Minor
>              Labels: gsoc2017
>
> Although a Spark master and worker run inside the Docker image used for the integration tests, no tests actually make use of them; only the default local Spark implementation is used. Setting --master to a live cluster would better resemble real-world conditions and also be more reliable.
> The Python framework has to be updated to fix this issue, which should be just a matter of adding a few lines. We can make it an option in the tests to set the master.
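As a rough illustration of the "few lines" mentioned above, the Python test helper could accept an optional Spark master URL and append it as a pass-through argument. The function and parameter names here are hypothetical sketches, not the actual PredictionIO test framework API; the only assumption grounded in PredictionIO's CLI is that arguments after `--` are forwarded to spark-submit.

```python
# Hypothetical helper: names below are illustrative, not the real framework API.
def build_pio_train_command(spark_master=None):
    """Build a `pio train` invocation.

    If spark_master is given (e.g. "spark://spark-master:7077"), forward a
    --master flag to spark-submit via the `--` pass-through separator, so the
    test runs against the standalone cluster instead of local mode.
    """
    cmd = ["pio", "train"]
    if spark_master is not None:
        # Everything after `--` is passed through to spark-submit.
        cmd += ["--", "--master", spark_master]
    return cmd

# Default behavior: local Spark, unchanged from today.
print(build_pio_train_command())
# Opt-in standalone cluster (hostname/port are assumptions for illustration):
print(build_pio_train_command(spark_master="spark://spark-master:7077"))
```

Keeping the master optional preserves the current local-mode behavior as the default, so existing tests are unaffected unless the cluster option is explicitly enabled.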



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)