Posted to issues@spark.apache.org by "Elchin (JIRA)" <ji...@apache.org> on 2019/01/16 08:52:00 UTC
[jira] [Closed] (SPARK-26591) Scalar Pandas UDF fails with 'illegal hardware instruction' in a certain environment
[ https://issues.apache.org/jira/browse/SPARK-26591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Elchin closed SPARK-26591.
--------------------------
> Scalar Pandas UDF fails with 'illegal hardware instruction' in a certain environment
> ------------------------------------------------------------------------------------
>
> Key: SPARK-26591
> URL: https://issues.apache.org/jira/browse/SPARK-26591
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 2.4.0
> Environment: Python 3.6.7
> Pyspark 2.4.0
> OS:
> {noformat}
> Linux 4.15.0-43-generic #46-Ubuntu SMP Thu Dec 6 14:45:28 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux{noformat}
> CPU:
>
> {noformat}
> Dual core AMD Athlon II P360 (-MCP-) cache: 1024 KB
> clock speeds: max: 2300 MHz 1: 1700 MHz 2: 1700 MHz
> {noformat}
>
>
> Reporter: Elchin
> Priority: Major
> Attachments: core
>
>
> When I try to use pandas_udf from the examples in the [documentation|https://spark.apache.org/docs/2.4.0/api/python/pyspark.sql.html#pyspark.sql.functions.pandas_udf]:
> {code:python}
> from pyspark.sql.functions import pandas_udf, PandasUDFType
> from pyspark.sql.types import IntegerType, StringType
> slen = pandas_udf(lambda s: s.str.len(), IntegerType())  # crashes here{code}
> I get the error:
> {code:java}
> [1] 17969 illegal hardware instruction (core dumped) python3{code}
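For reference, the lambda body itself is trivial. A plain-Python equivalent (no Arrow involved) runs fine, which suggests the crash happens before the UDF logic ever executes, during the Arrow initialization that `pandas_udf` triggers. A minimal sketch, where `slen` is a hypothetical stand-in for what `s.str.len()` computes element-wise:

```python
# Plain-Python equivalent of the UDF body (assumption: s.str.len()
# is element-wise string length). No pyarrow import, so it cannot
# hit the SIGILL path reported above.
def slen(strings):
    """Return the length of each string in the input sequence."""
    return [len(s) for s in strings]

print(slen(["a", "bb", "ccc"]))  # [1, 2, 3]
```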
> The environment is:
> Python 3.6.7
> PySpark 2.4.0
> PyArrow: 0.11.1
> Pandas: 0.23.4
> NumPy: 1.15.4
> OS: Linux 4.15.0-43-generic #46-Ubuntu SMP Thu Dec 6 14:45:28 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
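One plausible cause, given the older AMD Athlon II CPU: prebuilt pyarrow wheels of this era were compiled assuming SSE4.2 support, and importing them on a CPU without that instruction set raises SIGILL ("illegal hardware instruction"). A hedged diagnostic sketch (the SSE4.2 assumption and the helper name `has_cpu_flag` are mine, not from the report) that checks the CPU flags on Linux:

```python
# Diagnostic sketch: check whether the CPU advertises a given flag
# (e.g. "sse4_2") in /proc/cpuinfo. On non-Linux systems the file is
# absent and the check returns None rather than guessing.
def has_cpu_flag(flag, cpuinfo_path="/proc/cpuinfo"):
    """True if `flag` is listed in the CPU flags, False if not,
    None when /proc/cpuinfo is unavailable."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    return flag in line.split()
    except OSError:
        return None
    return None

if __name__ == "__main__":
    result = has_cpu_flag("sse4_2")
    if result is None:
        print("cannot read /proc/cpuinfo on this platform")
    elif result:
        print("CPU reports sse4_2; the crash likely has another cause")
    else:
        print("CPU lacks sse4_2; prebuilt pyarrow wheels may SIGILL on import")
```

If the flag is missing, building pyarrow from source for the local CPU (rather than installing the binary wheel) is the usual way around this class of crash.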
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)