Posted to issues@spark.apache.org by "Samuel Marks (JIRA)" <ji...@apache.org> on 2015/08/01 18:03:05 UTC

[jira] [Commented] (SPARK-9524) Latest Spark (8765665015ef47a23e00f7d01d4d280c31bb236d) breaks (pyspark)

    [ https://issues.apache.org/jira/browse/SPARK-9524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14650413#comment-14650413 ] 

Samuel Marks commented on SPARK-9524:
-------------------------------------

You're welcome to close the issue; it was only reported because the output said: "2.4+ kernel w/o ELF notes? -- report this".

Working from the 1.4 branch, everything builds and runs fine, which means it's unlikely to be an IPython issue.

Anyways.

> Latest Spark (8765665015ef47a23e00f7d01d4d280c31bb236d) breaks (pyspark)
> ------------------------------------------------------------------------
>
>                 Key: SPARK-9524
>                 URL: https://issues.apache.org/jira/browse/SPARK-9524
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>         Environment: Ubuntu 15.04
> Linux Kudu 3.19.0-25-generic #26-Ubuntu SMP Fri Jul 24 21:17:31 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
>            Reporter: Samuel Marks
>
> I start my ipython notebook as usual, after updating to the latest Spark (`git pull`). I also tried a complete folder removal + clone + `build/mvn -DskipTests clean package` just to be sure.
> I get a bunch of these 404 errors, followed by this:
> {code:none}
> [W 00:13:49.462 NotebookApp] 404 GET /api/kernels/e7db54cb-f7bb-4bdf-8d0c-76110f26c12c/channels?session_id=64C70A32AA2940808FDCA038A3D9E5B5 (127.0.0.1) 3.72ms referer=None
> 2.4+ kernel w/o ELF notes? -- report this
> {code}
> PS: None of my Python code works within `ipython notebook` when it's launched via pyspark.
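> For context, here is a minimal sanity check one might run in a notebook cell to see whether the PySpark kernel is wired up at all. This is only a sketch; it assumes pyspark has injected the usual `sc` SparkContext into the driver session, as it does for the standard shell.
> {code:python}
> # Minimal sanity check in a notebook cell launched via pyspark.
> # `sc` is the SparkContext that pyspark normally provides to the driver session.
> rdd = sc.parallelize(range(100))
> print(rdd.count())  # expected output: 100
> {code}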


