Posted to issues@spark.apache.org by "Lokesh Yadav (JIRA)" <ji...@apache.org> on 2017/02/23 08:29:50 UTC
[jira] [Issue Comment Deleted] (SPARK-18832) Spark SQL: Thriftserver unable to run a registered Hive UDTF
[ https://issues.apache.org/jira/browse/SPARK-18832?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Lokesh Yadav updated SPARK-18832:
---------------------------------
Comment: was deleted
(was: This is the code for the sample UDTF that I am using to test:
{code:java}
package com.fuzzylogix.experiments.udf.hiveUDF;

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableConstantStringObjectInspector;

import java.util.ArrayList;
import java.util.List;

/*
 * Sample User Defined Table Function (UDTF).
 *
 * Example:
 * Input  : "Paris"
 * Output :
 *   "***Paris***"
 *   "###Paris###"
 */
public class SampleUDTF extends GenericUDTF {
    private PrimitiveObjectInspector nameOI = null;
    static String pName = null;

    @Override
    public StructObjectInspector initialize(ObjectInspector[] args) throws UDFArgumentException {
        // input inspector: the single argument is expected to be a constant string
        nameOI = (PrimitiveObjectInspector) args[0];
        pName = ((WritableConstantStringObjectInspector) nameOI).getWritableConstantValue()
                .toString();

        // output inspectors: two string columns, "First" and "Second"
        List<String> fieldNames = new ArrayList<String>();
        List<ObjectInspector> fieldOIs = new ArrayList<ObjectInspector>();
        fieldNames.add("First");
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        fieldNames.add("Second");
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        return ObjectInspectorFactory.getStandardStructObjectInspector(fieldNames, fieldOIs);
    }

    @Override
    public void process(Object[] record) throws HiveException {
        // the constant argument was captured in initialize(), so the record itself is unused
        ArrayList<Object> outList = new ArrayList<Object>();
        outList.add("***" + pName + "***");
        outList.add("###" + pName + "###");
        forward(outList);
    }

    @Override
    public void close() throws HiveException {
        // do nothing
    }
}
{code})
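For context, a Hive UDTF such as this can be invoked either directly in the select list, as done below, or via {{LATERAL VIEW}} against a table. A sketch of the latter, where the table name {{cities}} and alias {{t}} are purely illustrative:

{code:sql}
-- hypothetical table 'cities'; the UDTF's output columns are aliased First and Second
SELECT t.First, t.Second
FROM cities
LATERAL VIEW SampleUDTF('Paris') t AS First, Second;
{code}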
> Spark SQL: Thriftserver unable to run a registered Hive UDTF
> ------------------------------------------------------------
>
> Key: SPARK-18832
> URL: https://issues.apache.org/jira/browse/SPARK-18832
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0, 2.0.1, 2.0.2
> Environment: HDP: 2.5
> Spark: 2.0.0
> Reporter: Lokesh Yadav
>
> Spark Thriftserver is unable to run a Hive UDTF.
> It throws an error saying it is unable to find the function, although the function registration succeeds and the function does show up in the output of {{show functions}}.
> I am using a Hive UDTF, registered from a jar placed on my local machine, and calling it with the following commands:
> // Registering the function; this command succeeds:
> {{CREATE FUNCTION SampleUDTF AS 'com.fuzzylogix.experiments.udf.hiveUDF.SampleUDTF' USING JAR '/root/spark_files/experiments-1.2.jar';}}
> // Thriftserver is able to look up the function on this command:
> {{DESCRIBE FUNCTION SampleUDTF;}}
> {quote}
> Output:
> +-----------------------------------------------------------+--+
> | function_desc |
> +-----------------------------------------------------------+--+
> | Function: default.SampleUDTF |
> | Class: com.fuzzylogix.experiments.udf.hiveUDF.SampleUDTF |
> | Usage: N/A. |
> +-----------------------------------------------------------+--+
> {quote}
> // Calling the function:
> {{SELECT SampleUDTF('Paris');}}
> bq. Output of the above command: Error: org.apache.spark.sql.AnalysisException: Undefined function: 'SampleUDTF'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 7 (state=,code=0)
> I have also tried using a non-local (HDFS) jar, but I get the same error.
> My environment: HDP 2.4 with Spark 2.0.0
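A commonly reported workaround for this class of function-lookup failure (not verified against this exact environment) is to register the UDTF as a temporary function rather than a permanent one, mirroring the {{CREATE FUNCTION}} statement above:

{code:sql}
-- same class and jar path as in the report; TEMPORARY scopes the function to the session
CREATE TEMPORARY FUNCTION SampleUDTF AS 'com.fuzzylogix.experiments.udf.hiveUDF.SampleUDTF'
  USING JAR '/root/spark_files/experiments-1.2.jar';
SELECT SampleUDTF('Paris');
{code}

Temporary functions are resolved from the session's function registry rather than the Hive metastore, which sidesteps the permanent-function resolution path that fails here.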
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org