Posted to dev@phoenix.apache.org by "Josh Mahonin (JIRA)" <ji...@apache.org> on 2015/11/17 16:43:10 UTC

[jira] [Commented] (PHOENIX-2426) Spark Data Source API Giving Exception

    [ https://issues.apache.org/jira/browse/PHOENIX-2426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15008857#comment-15008857 ] 

Josh Mahonin commented on PHOENIX-2426:
---------------------------------------

Patch attached. The catalyst mapper was missing the SMALLINT data type. It was also missing TINYINT, so I've included that as well.

I think the catalyst matchers are (finally) complete for non-abstract PDataTypes, but if anyone else wants to help double-check for me, the Phoenix types are here [1] and the Spark types are here [2]. The Spark types don't appear to have changed since 1.3.

[1] https://github.com/apache/phoenix/tree/master/phoenix-core/src/main/java/org/apache/phoenix/schema/types
[2] http://spark.apache.org/docs/latest/sql-programming-guide.html#data-types
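For anyone following along, the shape of the fix is a pattern match from Phoenix types to Spark Catalyst types. The sketch below is illustrative only (it matches on type-name strings rather than the real PDataType instances, and returns type names rather than org.apache.spark.sql.types objects, so it runs without a Spark dependency); the real mapper lives in PhoenixRDD.phoenixTypeToCatalystType. The SMALLINT and TINYINT cases are the ones the patch adds:

```scala
// Illustrative sketch of the Phoenix -> Catalyst type mapping, NOT the
// actual patch. The real code matches on org.apache.phoenix.schema.types
// PDataType instances and returns Spark SQL DataType objects.
object PhoenixCatalystMapping {
  def phoenixTypeToCatalystType(phoenixType: String): String = phoenixType match {
    case "VARCHAR" | "CHAR" => "StringType"
    case "BIGINT"           => "LongType"
    case "INTEGER"          => "IntegerType"
    case "SMALLINT"         => "ShortType"  // previously unmatched -> scala.MatchError
    case "TINYINT"          => "ByteType"   // also missing; included in the patch
    case "DATE"             => "DateType"
    case "TIMESTAMP"        => "TimestampType"
    case other              => throw new MatchError(other) // what the reporter hit
  }
}
```

With the two new cases, a table column declared SMALLINT maps to Spark's 2-byte ShortType (and TINYINT to the 1-byte ByteType) instead of falling through to a MatchError when the DataFrame schema is built.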

> Spark Data Source API Giving Exception
> --------------------------------------
>
>                 Key: PHOENIX-2426
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-2426
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.4.0, 4.6.0
>         Environment: CentOS 7.0
>            Reporter: Gokhan Cagrici
>            Priority: Blocker
>         Attachments: PHOENIX-2426-v2.patch, PHOENIX-2426.patch
>
>
> Table Definition:
> CREATE TABLE EVENT_FACT (
>   TENANT_ID VARCHAR NOT NULL,
>   EVENT_ID BIGINT,
>   EVENT_KEY BIGINT NOT NULL,
>   DATA_SOURCE_ID VARCHAR(64),
>   DEVICE_TYPE1_KEY BIGINT,
>   AUTHENTITYID BIGINT,
>   ALARMFOREVENTS_ID BIGINT,
>   SEVERITY_NUMBER SMALLINT,
>   SEVERITY VARCHAR(20),
>   NOTIFICATIONDATE DATE,
>   NOTIFICATIONTIMESTAMP TIMESTAMP,
>   DAY_IN_MONTH SMALLINT,
>   MONTH_NUMBER SMALLINT,
>   QUARTER_NUMBER SMALLINT,
>   YEAR SMALLINT,
>   WEEK_NUMBER SMALLINT,
>   YEAR_FOR_WEEK SMALLINT,
>   HOUR SMALLINT,
>   MINUTE SMALLINT,
>   SECOND SMALLINT,
>   TIME_KEY INTEGER,
>   CLASSNAME VARCHAR(255),
>   CATEGORY VARCHAR(255),
>   DISPLAYNAME VARCHAR(255),
>   DESCRIPTION VARCHAR(1024),
>   SOURCE VARCHAR(255),
>   EVENTTYPE VARCHAR(255),
>   NTTYADDRSS7_ADDRESS VARCHAR(100),
>   GENERATEDBY VARCHAR(255),
>   SRCOBJECTBUSINESSKEY VARCHAR(1024),
>   SRCOBJECTDISPLAYNAME VARCHAR(255),
>   OWNING_ENTITY VARCHAR(255),
>   PRDCTSRS_VALUE VARCHAR(255),
>   PRODUCTTYPE_VALUE VARCHAR(255),
>   PRDCTFMLY_VALUE VARCHAR(100),
>   SOFTWAREVERSION VARCHAR(100),
>   IPADDRESS VARCHAR(50),
>   DEVICENAME VARCHAR(255),
>   COUNT SMALLINT,
>   ELEMENTNAME VARCHAR(255),
>   SRCOBJECTID BIGINT,
>   CONSTRAINT PK PRIMARY KEY (TENANT_ID, EVENT_KEY)
> ) SALT_BUCKETS=4, COMPRESSION='GZ', VERSIONS=1 , IMMUTABLE_ROWS=true, MULTI_TENANT=true;
> Code:
> val df = sqlContext.load(
>   "org.apache.phoenix.spark",
>   Map("table" -> "EVENT_FACT", "zkUrl" -> "zookeeper:2181")
> )
> Exception:
> scala.MatchError: SMALLINT (of class org.apache.phoenix.schema.types.PSmallint)
>   at org.apache.phoenix.spark.PhoenixRDD.phoenixTypeToCatalystType(PhoenixRDD.scala:134)
>   at org.apache.phoenix.spark.PhoenixRDD$$anonfun$phoenixSchemaToCatalystSchema$1.apply(PhoenixRDD.scala:127)
>   at org.apache.phoenix.spark.PhoenixRDD$$anonfun$phoenixSchemaToCatalystSchema$1.apply(PhoenixRDD.scala:126)
>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>   at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
>   at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>   at org.apache.phoenix.spark.PhoenixRDD.phoenixSchemaToCatalystSchema(PhoenixRDD.scala:126)
>   at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:110)
>   at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:57)
>   at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:31)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:120)
>   at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
>   ... 56 elided



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)