Posted to dev@calcite.apache.org by Christian Tzolov <ct...@pivotal.io> on 2017/11/28 20:39:05 UTC

Handling Functions in custom adapter implementations?

Hey there,

I have another question related to handling spatial (or any) functions in
custom adapter implementations.

For example, the rel plan for the following query (with a spatial function)


EXPLAIN PLAN FOR SELECT
  "city",
  ST_Point(
      cast("loc"[0] AS DOUBLE),
      cast("loc"[1] AS DOUBLE)
  )
FROM "geode"."Zips"
LIMIT 10;

will produce a plan like this:


GeodeToEnumerableConverterRel
  GeodeProjectRel(city=[$1], EXPR$1=[ST_POINT(CAST(ITEM($2, 0)):DOUBLE,
CAST(ITEM($2, 1)):DOUBLE)])
    GeodeSortRel(fetch=[10])
      GeodeTableScanRel(table=[[geode, Zips]])

Given that my GeodeProjectRel implementation doesn't know about ST_POINT,
should this fail? Or somehow be ignored in my case?

What is the correct way to handle such cases?

Perhaps in my Project rule I should make sure that a projection with a
function expression is not matched, so that it falls back to the
EnumerableProjection?

Thanks,
Christian

Re: Handling Functions in custom adapter implementations?

Posted by Christian Tzolov <ct...@pivotal.io>.
Yeah, unlike Cassandra, Geode supports certain function types (such as CAST
and ITEM). To avoid dropping support for those, I've blacklisted just
the SqlTypeName.GEOMETRY RexNode for now.
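
That per-expression guard can be sketched roughly like this. This is only a
sketch, not the actual Geode adapter code: the RexNode/SqlTypeName calls are
standard Calcite API, but the helper name and its placement in the rule are
assumptions.

```java
import java.util.List;

import org.apache.calcite.rex.RexNode;
import org.apache.calcite.sql.type.SqlTypeName;

// Hypothetical helper: returns true if any projected expression produces
// a GEOMETRY value, in which case the Geode rule should decline the match
// and leave the Project to the Enumerable convention.
static boolean containsGeometry(List<RexNode> projects) {
  for (RexNode e : projects) {
    if (e.getType().getSqlTypeName() == SqlTypeName.GEOMETRY) {
      return true;
    }
  }
  return false;
}
```

Note this only inspects the top-level type of each expression, which is
enough here because ST_Point's result type is GEOMETRY.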

Going a step further, I then hit an ST_Point parameter type issue :)

"No applicable constructor/method found for actual parameters "double,
double"; candidates are: "public static
org.apache.calcite.runtime.GeoFunctions$Geom
org.apache.calcite.runtime.GeoFunctions.ST_Point(java.math.BigDecimal,
java.math.BigDecimal, java.math.BigDecimal)", "public static
org.apache.calcite.runtime.GeoFunctions$Geom
org.apache.calcite.runtime.GeoFunctions.ST_Point(java.math.BigDecimal,
java.math.BigDecimal)""

This is expected, as ST_Point takes BigDecimal (i.e. the DECIMAL SQL type)
rather than Double parameters.

While cast("loc"[0] AS DOUBLE) works because "loc"[0] is a Numeric,
cast("loc"[0] AS DECIMAL) doesn't, because it tries to cast the Numeric to
the BigDecimal Java type!

Is there another function (apart from CAST) that would allow me to convert
the Numeric into a BigDecimal?
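
At the plain Java level the conversion itself is simple: BigDecimal.valueOf
lifts the double that CAST ... AS DOUBLE produces into the BigDecimal that
GeoFunctions.ST_Point declares. A minimal illustrative sketch (the class and
helper names below are made up for the example):

```java
import java.math.BigDecimal;

public class ToBigDecimalDemo {

    // Hypothetical helper: lift a runtime double into the BigDecimal
    // parameter type that GeoFunctions.ST_Point expects.
    static BigDecimal lift(double v) {
        // valueOf goes through Double.toString, so the result keeps the
        // short decimal form ("-122.41") rather than the long binary
        // expansion that `new BigDecimal(double)` would produce.
        return BigDecimal.valueOf(v);
    }

    public static void main(String[] args) {
        System.out.println(lift(-122.41)); // -122.41
        System.out.println(lift(37.77));   // 37.77
    }
}
```

The open question above is really where such a conversion should be wired in
on the Calcite side, not how to do it in Java.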

Thanks,
Christian


On 28 November 2017 at 22:13, Michael Mior <mm...@uwaterloo.ca> wrote:

> Yes, the appropriate solution would be to check the expressions being
> projected and not to trigger your rule for projections you can't handle.
> For example, check out CassandraProjectRule#matches which validates that
> only field references are being projected.



-- 
Christian Tzolov <http://www.linkedin.com/in/tzolov> | Principal Software
Engineer | Spring <https://spring.io/>.io | Pivotal <http://pivotal.io/> |
ctzolov@pivotal.io

Re: Handling Functions in custom adapter implementations?

Posted by Michael Mior <mm...@uwaterloo.ca>.
Yes, the appropriate solution would be to check the expressions being
projected and not to trigger your rule for projections you can't handle.
For example, check out CassandraProjectRule#matches which validates that
only field references are being projected.
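
For reference, the general shape of that guard, modeled on
CassandraProjectRule#matches. This is a sketch rather than the exact Calcite
source, and signatures may differ between Calcite versions:

```java
import org.apache.calcite.plan.RelOptRuleCall;
import org.apache.calcite.rel.logical.LogicalProject;
import org.apache.calcite.rex.RexInputRef;
import org.apache.calcite.rex.RexNode;

// Inside the adapter's Project rule:
@Override public boolean matches(RelOptRuleCall call) {
  LogicalProject project = call.rel(0);
  for (RexNode e : project.getProjects()) {
    // Fire only for plain field references ($0, $1, ...); anything else
    // (ST_Point, CAST, ITEM, ...) is left to the Enumerable convention.
    if (!(e instanceof RexInputRef)) {
      return false;
    }
  }
  return true;
}
```

When matches returns false, the planner simply never converts that Project
into the adapter's convention, so the default Enumerable implementation
handles it.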

--
Michael Mior
mmior@apache.org
