Posted to issues@spark.apache.org by "Justin Uang (JIRA)" <ji...@apache.org> on 2015/12/05 17:54:10 UTC

[jira] [Created] (SPARK-12157) Support numpy types as return values of Python UDFs

Justin Uang created SPARK-12157:
-----------------------------------

             Summary: Support numpy types as return values of Python UDFs
                 Key: SPARK-12157
                 URL: https://issues.apache.org/jira/browse/SPARK-12157
             Project: Spark
          Issue Type: Improvement
          Components: PySpark, SQL
    Affects Versions: 1.5.2
            Reporter: Justin Uang


Currently, if I have a Python UDF that returns a NumPy type:

{code}
import pyspark.sql.types as T
import pyspark.sql.functions as F
from pyspark.sql import Row
import numpy as np

# np.argmax returns a NumPy scalar (np.int64), not a builtin Python int
argmax = F.udf(lambda x: np.argmax(x), T.IntegerType())

df = sqlContext.createDataFrame([Row(array=[1,2,3])])
df.select(argmax("array")).count()
{code}

I get an exception that is fairly opaque:

{code}
Caused by: net.razorvine.pickle.PickleException: expected zero arguments for construction of ClassDict (for numpy.dtype)
        at net.razorvine.pickle.objects.ClassDictConstructor.construct(ClassDictConstructor.java:23)
        at net.razorvine.pickle.Unpickler.load_reduce(Unpickler.java:701)
        at net.razorvine.pickle.Unpickler.dispatch(Unpickler.java:171)
        at net.razorvine.pickle.Unpickler.load(Unpickler.java:85)
        at net.razorvine.pickle.Unpickler.loads(Unpickler.java:98)
        at org.apache.spark.sql.execution.BatchPythonEvaluation$$anonfun$doExecute$1$$anonfun$apply$3.apply(python.scala:404)
        at org.apache.spark.sql.execution.BatchPythonEvaluation$$anonfun$doExecute$1$$anonfun$apply$3.apply(python.scala:403)
{code}
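
The root cause is visible in the trace: np.argmax returns a NumPy scalar rather than a builtin int, and the JVM-side unpickler (net.razorvine.pickle, i.e. Pyrolite) cannot reconstruct NumPy classes such as numpy.dtype. A quick check in a plain Python shell illustrates this (my own sketch, not from the original report):

{code}
import numpy as np

result = np.argmax([1, 2, 3])
# The result is a NumPy scalar, not a builtin Python int
print(type(result))  # numpy.int64 (on 64-bit platforms), not int
{code}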

NumPy types like np.int and np.float64 should automatically be converted to the corresponding Spark SQL types.
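
In the meantime, a workaround (my own sketch, not part of the original report) is to convert the NumPy scalar to a builtin type inside the UDF, e.g. with int() or the scalar's .item() method, so the value pickles as a plain Python integer:

{code}
import pyspark.sql.types as T
import pyspark.sql.functions as F
from pyspark.sql import Row
import numpy as np

# Wrapping the result in int() yields a builtin int, which pickles cleanly
argmax = F.udf(lambda x: int(np.argmax(x)), T.IntegerType())

df = sqlContext.createDataFrame([Row(array=[1, 2, 3])])
df.select(argmax("array")).count()  # no longer raises the PickleException
{code}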


