Posted to user@spark.apache.org by Junaid Nasir <jn...@an10.io> on 2017/12/17 10:02:34 UTC

Lucene Index with Spark Cassandra

Hi everyone,
I am trying to use the Stratio Lucene index with Spark, but Spark SQL returns zero
results, whereas the same query run through cqlsh returns the correct rows. This is
the same issue as https://github.com/Stratio/cassandra-lucene-index/issues/79 -- I can
see in the Spark logs that Lucene is working, but, as described in that link, Spark
filters those results out afterwards. Any help preventing this would be highly
appreciated.
Versions:
    connector: datastax:spark-cassandra-connector:2.0.3-s_2.11
    spark:     2.1.1
    cassandra: 3.11.0
    lucene:    3.11.0.0
I am using the old syntax as mentioned in the Lucene index docs, i.e. I created a
dummy column.

C* schema and Lucene index:

    CREATE TABLE alldev.temp (
        devid text,
        day date,
        datetime timestamp,
        lucene text,
        value text,
        PRIMARY KEY ((devid, day), datetime)
    );

    CREATE CUSTOM INDEX idx ON alldev.temp (lucene)
        USING 'com.stratio.cassandra.lucene.Index'
        WITH OPTIONS = {
            'refresh_seconds': '1',
            'schema': '{
                fields: {
                    devid: {type: "string"},
                    day: {type: "date", pattern: "yyyy-MM-dd"},
                    datetime: {type: "date"},
                    value: {type: "integer"}
                }
            }'
        };

    cqlsh> select * from alldev.temp
           where lucene = '{filter: {type: "range", field: "value", lower: "0"}}';
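One workaround discussed around issue #79 (a sketch only -- I have not verified it
against this exact setup, and the connection host and app name below are
placeholders): bypass Spark SQL's predicate handling by using the connector's RDD
API, whose where() clause is appended verbatim to the generated CQL, so the Lucene
expression is evaluated by Cassandra and Spark does not re-filter the rows.

```scala
// Sketch, assuming a reachable Cassandra cluster; host and app name are placeholders.
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

object LuceneWhereSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("lucene-where-sketch")
      .set("spark.cassandra.connection.host", "127.0.0.1") // placeholder host
    val sc = new SparkContext(conf)

    // where() passes the clause straight through to the CQL SELECT,
    // so the Lucene index does the filtering server-side and Spark
    // receives only the matching rows.
    val rows = sc.cassandraTable("alldev", "temp")
      .where("lucene = ?", """{filter: {type: "range", field: "value", lower: "0"}}""")

    rows.collect().foreach(println)
    sc.stop()
  }
}
```

Because this is a plain RDD scan rather than a Spark SQL data source read, there is
no second filtering pass on the Spark side to discard the index results.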
                



Thanks for your time,
Junaid