Posted to commits@pinot.apache.org by ki...@apache.org on 2019/01/09 16:50:34 UTC

[incubator-pinot] branch pinot-text-search updated (abb1a74 -> 0d7d4a0)

This is an automated email from the ASF dual-hosted git repository.

kishoreg pushed a change to branch pinot-text-search
in repository https://gitbox.apache.org/repos/asf/incubator-pinot.git.


 discard abb1a74  Enhanced PQL grammar to support text match and wired the predicate to TextMatchFilterOperator. Pending - loading of lucene index and invoke search
     new 0d7d4a0  Enhanced PQL grammar to support text match and wired the predicate to TextMatchFilterOperator. Pending - loading of lucene index and invoke search

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (abb1a74)
            \
             N -- N -- N   refs/heads/pinot-text-search (0d7d4a0)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 ...AstNode.java => TextMatchPredicateAstNode.java} |  29 ++---
 ...pLikePredicate.java => TextMatchPredicate.java} |  31 ++---
 ...rOperator.java => TextMatchFilterOperator.java} |  52 ++-------
 .../TextMatchPredicateEvaluatorFactory.java        |  60 ++++++++++
 .../linkedin/pinot/tools/TextInvertedIndex.java    | 126 +++++++++++++++++++++
 5 files changed, 225 insertions(+), 73 deletions(-)
 copy pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/{RegexpLikePredicateAstNode.java => TextMatchPredicateAstNode.java} (64%)
 copy pinot-core/src/main/java/com/linkedin/pinot/core/common/predicate/{RegexpLikePredicate.java => TextMatchPredicate.java} (66%)
 copy pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/{BitmapBasedFilterOperator.java => TextMatchFilterOperator.java} (56%)
 create mode 100644 pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/predicate/TextMatchPredicateEvaluatorFactory.java
 create mode 100644 pinot-tools/src/main/java/com/linkedin/pinot/tools/TextInvertedIndex.java


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@pinot.apache.org
For additional commands, e-mail: commits-help@pinot.apache.org


[incubator-pinot] 01/01: Enhanced PQL grammar to support text match and wired the predicate to TextMatchFilterOperator. Pending - loading of lucene index and invoke search

Posted by ki...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kishoreg pushed a commit to branch pinot-text-search
in repository https://gitbox.apache.org/repos/asf/incubator-pinot.git

commit 0d7d4a02fa0d6a23dac1adb4883fffdc9d062b66
Author: kishore gopalakrishna <g....@gmail.com>
AuthorDate: Sun Jan 6 13:45:57 2019 -0800

    Enhanced PQL grammar to support text match and wired the predicate to TextMatchFilterOperator. Pending - loading of lucene index and invoke search
---
 .../antlr4/com/linkedin/pinot/pql/parsers/PQL2.g4  |   6 +-
 .../pinot/common/request/AggregationInfo.java      |   2 +-
 .../pinot/common/request/BrokerRequest.java        |  80 ++++++-------
 .../pinot/common/request/FilterOperator.java       |   5 +-
 .../linkedin/pinot/common/request/FilterQuery.java |   2 +-
 .../pinot/common/request/FilterQueryMap.java       |   2 +-
 .../com/linkedin/pinot/common/request/GroupBy.java |  66 +++++------
 .../pinot/common/request/HavingFilterQuery.java    |   2 +-
 .../pinot/common/request/HavingFilterQueryMap.java |   2 +-
 .../pinot/common/request/InstanceRequest.java      |  34 +++---
 .../linkedin/pinot/common/request/QuerySource.java |   2 +-
 .../linkedin/pinot/common/request/QueryType.java   |   2 +-
 .../linkedin/pinot/common/request/Selection.java   |  70 ++++++------
 .../pinot/common/request/SelectionSort.java        |   2 +-
 .../pinot/common/response/ProcessingException.java |   2 +-
 .../pinot/pql/parsers/Pql2AstListener.java         |  43 ++-----
 .../pql2/ast/RegexpLikePredicateAstNode.java       |   3 +-
 ...AstNode.java => TextMatchPredicateAstNode.java} |  32 +++---
 .../pinot/pql/parsers/Pql2CompilerTest.java        |  13 +++
 pinot-common/src/thrift/request.thrift             |   3 +-
 .../com/linkedin/pinot/core/common/Predicate.java  |  14 +--
 .../core/common/predicate/TextMatchPredicate.java  |  48 ++++++++
 .../core/operator/filter/FilterOperatorUtils.java  |   4 +-
 .../operator/filter/TextMatchFilterOperator.java   |  71 ++++++++++++
 .../predicate/PredicateEvaluatorProvider.java      |  12 +-
 .../TextMatchPredicateEvaluatorFactory.java        |  60 ++++++++++
 pinot-tools/pom.xml                                |  10 ++
 .../linkedin/pinot/tools/TextInvertedIndex.java    | 126 +++++++++++++++++++++
 28 files changed, 515 insertions(+), 203 deletions(-)

diff --git a/pinot-common/src/main/antlr4/com/linkedin/pinot/pql/parsers/PQL2.g4 b/pinot-common/src/main/antlr4/com/linkedin/pinot/pql/parsers/PQL2.g4
index 1716d0f..62aaf11 100644
--- a/pinot-common/src/main/antlr4/com/linkedin/pinot/pql/parsers/PQL2.g4
+++ b/pinot-common/src/main/antlr4/com/linkedin/pinot/pql/parsers/PQL2.g4
@@ -74,6 +74,7 @@ predicate:
   | betweenClause                         # BetweenPredicate
   | isClause                              # IsPredicate
   | regexpLikeClause                      # RegexpLikePredicate
+  | textMatchClause                       # TextMatchPredicate
   ;
 
 inClause:
@@ -85,13 +86,15 @@ isClause:
 comparisonClause:
   expression comparisonOperator expression;
 comparisonOperator: '<' | '>' | '<>' | '<=' | '>=' | '=' | '!=';
-
 betweenClause:
   expression BETWEEN expression AND expression;
 
 regexpLikeClause:
   REGEXP_LIKE '(' expression ',' literal ')';
 
+textMatchClause:
+  TEXT_MATCH '(' expression ',' literal ',' literal ')';
+
 booleanOperator: OR | AND;
 
 groupByClause: GROUP BY groupByList;
@@ -128,6 +131,7 @@ LIMIT: L I M I T;
 NOT : N O T;
 OR: O R;
 REGEXP_LIKE: R E G E X P '_' L I K E;
+TEXT_MATCH: T E X T '_' M A T C H;
 ORDER: O R D E R;
 SELECT: S E L E C T;
 TOP: T O P;
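
For reference, with this grammar change a PQL query using the new clause should compile end to end. A minimal sketch (the table and column names are hypothetical; it assumes the Pql2Compiler API exercised by the test added later in this commit):

    import com.linkedin.pinot.common.request.BrokerRequest;
    import com.linkedin.pinot.pql.parsers.Pql2Compiler;

    public class TextMatchQuerySketch {
      public static void main(String[] args) {
        Pql2Compiler compiler = new Pql2Compiler();
        // TEXT_MATCH takes the column, a Lucene-style query string, and an options literal.
        BrokerRequest request = compiler.compileToBrokerRequest(
            "SELECT foo FROM myTable WHERE TEXT_MATCH(textCol, 'title:\"harry\"', '')");
        // The compiled filter should carry FilterOperator.TEXT_MATCH with the two literal values.
        System.out.println(request.getFilterQuery().getOperator());
        System.out.println(request.getFilterQuery().getValue());
      }
    }
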
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/AggregationInfo.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/AggregationInfo.java
index cea1010..2f4d716 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/AggregationInfo.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/AggregationInfo.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  *  Aggregation
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-8-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class AggregationInfo implements org.apache.thrift.TBase<AggregationInfo, AggregationInfo._Fields>, java.io.Serializable, Cloneable, Comparable<AggregationInfo> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("AggregationInfo");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/BrokerRequest.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/BrokerRequest.java
index 6de1e71..b9be341 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/BrokerRequest.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/BrokerRequest.java
@@ -37,9 +37,9 @@ import org.slf4j.LoggerFactory;
 /**
  * AUTO GENERATED: DO NOT EDIT
  * Broker Query
- *
+ * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-8-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, BrokerRequest._Fields>, java.io.Serializable, Cloneable, Comparable<BrokerRequest> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("BrokerRequest");
 
@@ -193,42 +193,42 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
   public static final Map<_Fields, org.apache.thrift.meta_data.FieldMetaData> metaDataMap;
   static {
     Map<_Fields, org.apache.thrift.meta_data.FieldMetaData> tmpMap = new EnumMap<_Fields, org.apache.thrift.meta_data.FieldMetaData>(_Fields.class);
-    tmpMap.put(_Fields.QUERY_TYPE, new org.apache.thrift.meta_data.FieldMetaData("queryType", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.QUERY_TYPE, new org.apache.thrift.meta_data.FieldMetaData("queryType", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, QueryType.class)));
-    tmpMap.put(_Fields.QUERY_SOURCE, new org.apache.thrift.meta_data.FieldMetaData("querySource", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.QUERY_SOURCE, new org.apache.thrift.meta_data.FieldMetaData("querySource", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, QuerySource.class)));
-    tmpMap.put(_Fields.TIME_INTERVAL, new org.apache.thrift.meta_data.FieldMetaData("timeInterval", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.TIME_INTERVAL, new org.apache.thrift.meta_data.FieldMetaData("timeInterval", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING)));
-    tmpMap.put(_Fields.DURATION, new org.apache.thrift.meta_data.FieldMetaData("duration", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.DURATION, new org.apache.thrift.meta_data.FieldMetaData("duration", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING)));
-    tmpMap.put(_Fields.FILTER_QUERY, new org.apache.thrift.meta_data.FieldMetaData("filterQuery", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.FILTER_QUERY, new org.apache.thrift.meta_data.FieldMetaData("filterQuery", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, FilterQuery.class)));
-    tmpMap.put(_Fields.AGGREGATIONS_INFO, new org.apache.thrift.meta_data.FieldMetaData("aggregationsInfo", org.apache.thrift.TFieldRequirementType.OPTIONAL,
-        new org.apache.thrift.meta_data.ListMetaData(org.apache.thrift.protocol.TType.LIST,
+    tmpMap.put(_Fields.AGGREGATIONS_INFO, new org.apache.thrift.meta_data.FieldMetaData("aggregationsInfo", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
+        new org.apache.thrift.meta_data.ListMetaData(org.apache.thrift.protocol.TType.LIST, 
             new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, AggregationInfo.class))));
-    tmpMap.put(_Fields.GROUP_BY, new org.apache.thrift.meta_data.FieldMetaData("groupBy", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.GROUP_BY, new org.apache.thrift.meta_data.FieldMetaData("groupBy", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, GroupBy.class)));
-    tmpMap.put(_Fields.SELECTIONS, new org.apache.thrift.meta_data.FieldMetaData("selections", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.SELECTIONS, new org.apache.thrift.meta_data.FieldMetaData("selections", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, Selection.class)));
-    tmpMap.put(_Fields.FILTER_SUB_QUERY_MAP, new org.apache.thrift.meta_data.FieldMetaData("filterSubQueryMap", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.FILTER_SUB_QUERY_MAP, new org.apache.thrift.meta_data.FieldMetaData("filterSubQueryMap", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, FilterQueryMap.class)));
-    tmpMap.put(_Fields.BUCKET_HASH_KEY, new org.apache.thrift.meta_data.FieldMetaData("bucketHashKey", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.BUCKET_HASH_KEY, new org.apache.thrift.meta_data.FieldMetaData("bucketHashKey", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING)));
-    tmpMap.put(_Fields.ENABLE_TRACE, new org.apache.thrift.meta_data.FieldMetaData("enableTrace", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.ENABLE_TRACE, new org.apache.thrift.meta_data.FieldMetaData("enableTrace", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.BOOL)));
-    tmpMap.put(_Fields.RESPONSE_FORMAT, new org.apache.thrift.meta_data.FieldMetaData("responseFormat", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.RESPONSE_FORMAT, new org.apache.thrift.meta_data.FieldMetaData("responseFormat", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING)));
-    tmpMap.put(_Fields.DEBUG_OPTIONS, new org.apache.thrift.meta_data.FieldMetaData("debugOptions", org.apache.thrift.TFieldRequirementType.OPTIONAL,
-        new org.apache.thrift.meta_data.MapMetaData(org.apache.thrift.protocol.TType.MAP,
-            new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING),
+    tmpMap.put(_Fields.DEBUG_OPTIONS, new org.apache.thrift.meta_data.FieldMetaData("debugOptions", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
+        new org.apache.thrift.meta_data.MapMetaData(org.apache.thrift.protocol.TType.MAP, 
+            new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING), 
             new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING))));
-    tmpMap.put(_Fields.QUERY_OPTIONS, new org.apache.thrift.meta_data.FieldMetaData("queryOptions", org.apache.thrift.TFieldRequirementType.OPTIONAL,
-        new org.apache.thrift.meta_data.MapMetaData(org.apache.thrift.protocol.TType.MAP,
-            new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING),
+    tmpMap.put(_Fields.QUERY_OPTIONS, new org.apache.thrift.meta_data.FieldMetaData("queryOptions", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
+        new org.apache.thrift.meta_data.MapMetaData(org.apache.thrift.protocol.TType.MAP, 
+            new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING), 
             new org.apache.thrift.meta_data.FieldValueMetaData(org.apache.thrift.protocol.TType.STRING))));
-    tmpMap.put(_Fields.HAVING_FILTER_QUERY, new org.apache.thrift.meta_data.FieldMetaData("havingFilterQuery", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.HAVING_FILTER_QUERY, new org.apache.thrift.meta_data.FieldMetaData("havingFilterQuery", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, HavingFilterQuery.class)));
-    tmpMap.put(_Fields.HAVING_FILTER_SUB_QUERY_MAP, new org.apache.thrift.meta_data.FieldMetaData("havingFilterSubQueryMap", org.apache.thrift.TFieldRequirementType.OPTIONAL,
+    tmpMap.put(_Fields.HAVING_FILTER_SUB_QUERY_MAP, new org.apache.thrift.meta_data.FieldMetaData("havingFilterSubQueryMap", org.apache.thrift.TFieldRequirementType.OPTIONAL, 
         new org.apache.thrift.meta_data.StructMetaData(org.apache.thrift.protocol.TType.STRUCT, HavingFilterQueryMap.class)));
     metaDataMap = Collections.unmodifiableMap(tmpMap);
     org.apache.thrift.meta_data.FieldMetaData.addStructMetaDataMap(BrokerRequest.class, metaDataMap);
@@ -1610,7 +1610,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
       while (true)
       {
         schemeField = iprot.readFieldBegin();
-        if (schemeField.type == org.apache.thrift.protocol.TType.STOP) {
+        if (schemeField.type == org.apache.thrift.protocol.TType.STOP) { 
           break;
         }
         switch (schemeField.id) {
@@ -1619,7 +1619,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
               struct.queryType = new QueryType();
               struct.queryType.read(iprot);
               struct.setQueryTypeIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1628,7 +1628,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
               struct.querySource = new QuerySource();
               struct.querySource.read(iprot);
               struct.setQuerySourceIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1636,7 +1636,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
             if (schemeField.type == org.apache.thrift.protocol.TType.STRING) {
               struct.timeInterval = iprot.readString();
               struct.setTimeIntervalIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1644,7 +1644,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
             if (schemeField.type == org.apache.thrift.protocol.TType.STRING) {
               struct.duration = iprot.readString();
               struct.setDurationIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1653,7 +1653,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
               struct.filterQuery = new FilterQuery();
               struct.filterQuery.read(iprot);
               struct.setFilterQueryIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1672,7 +1672,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
                 iprot.readListEnd();
               }
               struct.setAggregationsInfoIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1681,7 +1681,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
               struct.groupBy = new GroupBy();
               struct.groupBy.read(iprot);
               struct.setGroupByIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1690,7 +1690,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
               struct.selections = new Selection();
               struct.selections.read(iprot);
               struct.setSelectionsIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1699,7 +1699,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
               struct.filterSubQueryMap = new FilterQueryMap();
               struct.filterSubQueryMap.read(iprot);
               struct.setFilterSubQueryMapIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1707,7 +1707,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
             if (schemeField.type == org.apache.thrift.protocol.TType.STRING) {
               struct.bucketHashKey = iprot.readString();
               struct.setBucketHashKeyIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1715,7 +1715,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
             if (schemeField.type == org.apache.thrift.protocol.TType.BOOL) {
               struct.enableTrace = iprot.readBool();
               struct.setEnableTraceIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1723,7 +1723,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
             if (schemeField.type == org.apache.thrift.protocol.TType.STRING) {
               struct.responseFormat = iprot.readString();
               struct.setResponseFormatIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1743,7 +1743,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
                 iprot.readMapEnd();
               }
               struct.setDebugOptionsIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1763,7 +1763,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
                 iprot.readMapEnd();
               }
               struct.setQueryOptionsIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1772,7 +1772,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
               struct.havingFilterQuery = new HavingFilterQuery();
               struct.havingFilterQuery.read(iprot);
               struct.setHavingFilterQueryIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
@@ -1781,7 +1781,7 @@ public class BrokerRequest implements org.apache.thrift.TBase<BrokerRequest, Bro
               struct.havingFilterSubQueryMap = new HavingFilterQueryMap();
               struct.havingFilterSubQueryMap.read(iprot);
               struct.setHavingFilterSubQueryMapIsSet(true);
-            } else {
+            } else { 
               org.apache.thrift.protocol.TProtocolUtil.skip(iprot, schemeField.type);
             }
             break;
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterOperator.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterOperator.java
index 784ab2e..18b9747 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterOperator.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterOperator.java
@@ -24,7 +24,8 @@ public enum FilterOperator implements org.apache.thrift.TEnum {
   RANGE(4),
   REGEXP_LIKE(5),
   NOT_IN(6),
-  IN(7);
+  IN(7),
+  TEXT_MATCH(8);
 
   private final int value;
 
@@ -61,6 +62,8 @@ public enum FilterOperator implements org.apache.thrift.TEnum {
         return NOT_IN;
       case 7:
         return IN;
+      case 8:
+        return TEXT_MATCH;
       default:
         return null;
     }
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterQuery.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterQuery.java
index 9a68404..f589f1c 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterQuery.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterQuery.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  * Filter query
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-5-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class FilterQuery implements org.apache.thrift.TBase<FilterQuery, FilterQuery._Fields>, java.io.Serializable, Cloneable, Comparable<FilterQuery> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("FilterQuery");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterQueryMap.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterQueryMap.java
index 55ddc61..30161c7 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterQueryMap.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/FilterQueryMap.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  * Filter Query is nested but thrift stable version does not support yet (The support is there in top of the trunk but no released jars. Two concerns : stability and onus of maintaining a stable point. Also, its pretty difficult to compile thrift in Linkedin software development environment which is not geared towards c++ dev. Hence, the )
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-5-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class FilterQueryMap implements org.apache.thrift.TBase<FilterQueryMap, FilterQueryMap._Fields>, java.io.Serializable, Cloneable, Comparable<FilterQueryMap> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("FilterQueryMap");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/GroupBy.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/GroupBy.java
index c3bb483..ca5d6e3 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/GroupBy.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/GroupBy.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  * GroupBy
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-5-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields>, java.io.Serializable, Cloneable, Comparable<GroupBy> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("GroupBy");
 
@@ -526,13 +526,13 @@ public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields
           case 1: // COLUMNS
             if (schemeField.type == org.apache.thrift.protocol.TType.LIST) {
               {
-                org.apache.thrift.protocol.TList _list36 = iprot.readListBegin();
-                struct.columns = new ArrayList<String>(_list36.size);
-                String _elem37;
-                for (int _i38 = 0; _i38 < _list36.size; ++_i38)
+                org.apache.thrift.protocol.TList _list62 = iprot.readListBegin();
+                struct.columns = new ArrayList<String>(_list62.size);
+                String _elem63;
+                for (int _i64 = 0; _i64 < _list62.size; ++_i64)
                 {
-                  _elem37 = iprot.readString();
-                  struct.columns.add(_elem37);
+                  _elem63 = iprot.readString();
+                  struct.columns.add(_elem63);
                 }
                 iprot.readListEnd();
               }
@@ -552,13 +552,13 @@ public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields
           case 3: // EXPRESSIONS
             if (schemeField.type == org.apache.thrift.protocol.TType.LIST) {
               {
-                org.apache.thrift.protocol.TList _list39 = iprot.readListBegin();
-                struct.expressions = new ArrayList<String>(_list39.size);
-                String _elem40;
-                for (int _i41 = 0; _i41 < _list39.size; ++_i41)
+                org.apache.thrift.protocol.TList _list65 = iprot.readListBegin();
+                struct.expressions = new ArrayList<String>(_list65.size);
+                String _elem66;
+                for (int _i67 = 0; _i67 < _list65.size; ++_i67)
                 {
-                  _elem40 = iprot.readString();
-                  struct.expressions.add(_elem40);
+                  _elem66 = iprot.readString();
+                  struct.expressions.add(_elem66);
                 }
                 iprot.readListEnd();
               }
@@ -585,9 +585,9 @@ public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields
           oprot.writeFieldBegin(COLUMNS_FIELD_DESC);
           {
             oprot.writeListBegin(new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, struct.columns.size()));
-            for (String _iter42 : struct.columns)
+            for (String _iter68 : struct.columns)
             {
-              oprot.writeString(_iter42);
+              oprot.writeString(_iter68);
             }
             oprot.writeListEnd();
           }
@@ -604,9 +604,9 @@ public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields
           oprot.writeFieldBegin(EXPRESSIONS_FIELD_DESC);
           {
             oprot.writeListBegin(new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, struct.expressions.size()));
-            for (String _iter43 : struct.expressions)
+            for (String _iter69 : struct.expressions)
             {
-              oprot.writeString(_iter43);
+              oprot.writeString(_iter69);
             }
             oprot.writeListEnd();
           }
@@ -644,9 +644,9 @@ public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields
       if (struct.isSetColumns()) {
         {
           oprot.writeI32(struct.columns.size());
-          for (String _iter44 : struct.columns)
+          for (String _iter70 : struct.columns)
           {
-            oprot.writeString(_iter44);
+            oprot.writeString(_iter70);
           }
         }
       }
@@ -656,9 +656,9 @@ public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields
       if (struct.isSetExpressions()) {
         {
           oprot.writeI32(struct.expressions.size());
-          for (String _iter45 : struct.expressions)
+          for (String _iter71 : struct.expressions)
           {
-            oprot.writeString(_iter45);
+            oprot.writeString(_iter71);
           }
         }
       }
@@ -670,13 +670,13 @@ public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields
       BitSet incoming = iprot.readBitSet(3);
       if (incoming.get(0)) {
         {
-          org.apache.thrift.protocol.TList _list46 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, iprot.readI32());
-          struct.columns = new ArrayList<String>(_list46.size);
-          String _elem47;
-          for (int _i48 = 0; _i48 < _list46.size; ++_i48)
+          org.apache.thrift.protocol.TList _list72 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, iprot.readI32());
+          struct.columns = new ArrayList<String>(_list72.size);
+          String _elem73;
+          for (int _i74 = 0; _i74 < _list72.size; ++_i74)
           {
-            _elem47 = iprot.readString();
-            struct.columns.add(_elem47);
+            _elem73 = iprot.readString();
+            struct.columns.add(_elem73);
           }
         }
         struct.setColumnsIsSet(true);
@@ -687,13 +687,13 @@ public class GroupBy implements org.apache.thrift.TBase<GroupBy, GroupBy._Fields
       }
       if (incoming.get(2)) {
         {
-          org.apache.thrift.protocol.TList _list49 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, iprot.readI32());
-          struct.expressions = new ArrayList<String>(_list49.size);
-          String _elem50;
-          for (int _i51 = 0; _i51 < _list49.size; ++_i51)
+          org.apache.thrift.protocol.TList _list75 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, iprot.readI32());
+          struct.expressions = new ArrayList<String>(_list75.size);
+          String _elem76;
+          for (int _i77 = 0; _i77 < _list75.size; ++_i77)
           {
-            _elem50 = iprot.readString();
-            struct.expressions.add(_elem50);
+            _elem76 = iprot.readString();
+            struct.expressions.add(_elem76);
           }
         }
         struct.setExpressionsIsSet(true);
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/HavingFilterQuery.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/HavingFilterQuery.java
index f2e68f5..0371bdb 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/HavingFilterQuery.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/HavingFilterQuery.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  * Having Filter query
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-8-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class HavingFilterQuery implements org.apache.thrift.TBase<HavingFilterQuery, HavingFilterQuery._Fields>, java.io.Serializable, Cloneable, Comparable<HavingFilterQuery> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("HavingFilterQuery");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/HavingFilterQueryMap.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/HavingFilterQueryMap.java
index dd2f5d4..cd88922 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/HavingFilterQueryMap.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/HavingFilterQueryMap.java
@@ -34,7 +34,7 @@ import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 @SuppressWarnings({"cast", "rawtypes", "serial", "unchecked"})
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-8-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class HavingFilterQueryMap implements org.apache.thrift.TBase<HavingFilterQueryMap, HavingFilterQueryMap._Fields>, java.io.Serializable, Cloneable, Comparable<HavingFilterQueryMap> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("HavingFilterQueryMap");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/InstanceRequest.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/InstanceRequest.java
index 9decb8f..1f42c00 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/InstanceRequest.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/InstanceRequest.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  * Instance Request
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-5-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class InstanceRequest implements org.apache.thrift.TBase<InstanceRequest, InstanceRequest._Fields>, java.io.Serializable, Cloneable, Comparable<InstanceRequest> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("InstanceRequest");
 
@@ -700,13 +700,13 @@ public class InstanceRequest implements org.apache.thrift.TBase<InstanceRequest,
           case 3: // SEARCH_SEGMENTS
             if (schemeField.type == org.apache.thrift.protocol.TType.LIST) {
               {
-                org.apache.thrift.protocol.TList _list96 = iprot.readListBegin();
-                struct.searchSegments = new ArrayList<String>(_list96.size);
-                String _elem97;
-                for (int _i98 = 0; _i98 < _list96.size; ++_i98)
+                org.apache.thrift.protocol.TList _list122 = iprot.readListBegin();
+                struct.searchSegments = new ArrayList<String>(_list122.size);
+                String _elem123;
+                for (int _i124 = 0; _i124 < _list122.size; ++_i124)
                 {
-                  _elem97 = iprot.readString();
-                  struct.searchSegments.add(_elem97);
+                  _elem123 = iprot.readString();
+                  struct.searchSegments.add(_elem123);
                 }
                 iprot.readListEnd();
               }
@@ -757,9 +757,9 @@ public class InstanceRequest implements org.apache.thrift.TBase<InstanceRequest,
           oprot.writeFieldBegin(SEARCH_SEGMENTS_FIELD_DESC);
           {
             oprot.writeListBegin(new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, struct.searchSegments.size()));
-            for (String _iter99 : struct.searchSegments)
+            for (String _iter125 : struct.searchSegments)
             {
-              oprot.writeString(_iter99);
+              oprot.writeString(_iter125);
             }
             oprot.writeListEnd();
           }
@@ -811,9 +811,9 @@ public class InstanceRequest implements org.apache.thrift.TBase<InstanceRequest,
       if (struct.isSetSearchSegments()) {
         {
           oprot.writeI32(struct.searchSegments.size());
-          for (String _iter100 : struct.searchSegments)
+          for (String _iter126 : struct.searchSegments)
           {
-            oprot.writeString(_iter100);
+            oprot.writeString(_iter126);
           }
         }
       }
@@ -836,13 +836,13 @@ public class InstanceRequest implements org.apache.thrift.TBase<InstanceRequest,
       BitSet incoming = iprot.readBitSet(3);
       if (incoming.get(0)) {
         {
-          org.apache.thrift.protocol.TList _list101 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, iprot.readI32());
-          struct.searchSegments = new ArrayList<String>(_list101.size);
-          String _elem102;
-          for (int _i103 = 0; _i103 < _list101.size; ++_i103)
+          org.apache.thrift.protocol.TList _list127 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, iprot.readI32());
+          struct.searchSegments = new ArrayList<String>(_list127.size);
+          String _elem128;
+          for (int _i129 = 0; _i129 < _list127.size; ++_i129)
           {
-            _elem102 = iprot.readString();
-            struct.searchSegments.add(_elem102);
+            _elem128 = iprot.readString();
+            struct.searchSegments.add(_elem128);
           }
         }
         struct.setSearchSegmentsIsSet(true);
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/QuerySource.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/QuerySource.java
index a0e8ea3..cfd827b 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/QuerySource.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/QuerySource.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  * Query source
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-5-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class QuerySource implements org.apache.thrift.TBase<QuerySource, QuerySource._Fields>, java.io.Serializable, Cloneable, Comparable<QuerySource> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("QuerySource");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/QueryType.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/QueryType.java
index d457a07..0f66468 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/QueryType.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/QueryType.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  *  Query type
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-8-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class QueryType implements org.apache.thrift.TBase<QueryType, QueryType._Fields>, java.io.Serializable, Cloneable, Comparable<QueryType> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("QueryType");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/Selection.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/Selection.java
index b5b75aa..60c3b11 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/Selection.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/Selection.java
@@ -39,7 +39,7 @@ import org.slf4j.LoggerFactory;
  * Selection
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-5-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class Selection implements org.apache.thrift.TBase<Selection, Selection._Fields>, java.io.Serializable, Cloneable, Comparable<Selection> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("Selection");
 
@@ -609,13 +609,13 @@ public class Selection implements org.apache.thrift.TBase<Selection, Selection._
           case 1: // SELECTION_COLUMNS
             if (schemeField.type == org.apache.thrift.protocol.TType.LIST) {
               {
-                org.apache.thrift.protocol.TList _list52 = iprot.readListBegin();
-                struct.selectionColumns = new ArrayList<String>(_list52.size);
-                String _elem53;
-                for (int _i54 = 0; _i54 < _list52.size; ++_i54)
+                org.apache.thrift.protocol.TList _list78 = iprot.readListBegin();
+                struct.selectionColumns = new ArrayList<String>(_list78.size);
+                String _elem79;
+                for (int _i80 = 0; _i80 < _list78.size; ++_i80)
                 {
-                  _elem53 = iprot.readString();
-                  struct.selectionColumns.add(_elem53);
+                  _elem79 = iprot.readString();
+                  struct.selectionColumns.add(_elem79);
                 }
                 iprot.readListEnd();
               }
@@ -627,14 +627,14 @@ public class Selection implements org.apache.thrift.TBase<Selection, Selection._
           case 2: // SELECTION_SORT_SEQUENCE
             if (schemeField.type == org.apache.thrift.protocol.TType.LIST) {
               {
-                org.apache.thrift.protocol.TList _list55 = iprot.readListBegin();
-                struct.selectionSortSequence = new ArrayList<SelectionSort>(_list55.size);
-                SelectionSort _elem56;
-                for (int _i57 = 0; _i57 < _list55.size; ++_i57)
+                org.apache.thrift.protocol.TList _list81 = iprot.readListBegin();
+                struct.selectionSortSequence = new ArrayList<SelectionSort>(_list81.size);
+                SelectionSort _elem82;
+                for (int _i83 = 0; _i83 < _list81.size; ++_i83)
                 {
-                  _elem56 = new SelectionSort();
-                  _elem56.read(iprot);
-                  struct.selectionSortSequence.add(_elem56);
+                  _elem82 = new SelectionSort();
+                  _elem82.read(iprot);
+                  struct.selectionSortSequence.add(_elem82);
                 }
                 iprot.readListEnd();
               }
@@ -677,9 +677,9 @@ public class Selection implements org.apache.thrift.TBase<Selection, Selection._
           oprot.writeFieldBegin(SELECTION_COLUMNS_FIELD_DESC);
           {
             oprot.writeListBegin(new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, struct.selectionColumns.size()));
-            for (String _iter58 : struct.selectionColumns)
+            for (String _iter84 : struct.selectionColumns)
             {
-              oprot.writeString(_iter58);
+              oprot.writeString(_iter84);
             }
             oprot.writeListEnd();
           }
@@ -691,9 +691,9 @@ public class Selection implements org.apache.thrift.TBase<Selection, Selection._
           oprot.writeFieldBegin(SELECTION_SORT_SEQUENCE_FIELD_DESC);
           {
             oprot.writeListBegin(new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRUCT, struct.selectionSortSequence.size()));
-            for (SelectionSort _iter59 : struct.selectionSortSequence)
+            for (SelectionSort _iter85 : struct.selectionSortSequence)
             {
-              _iter59.write(oprot);
+              _iter85.write(oprot);
             }
             oprot.writeListEnd();
           }
@@ -744,18 +744,18 @@ public class Selection implements org.apache.thrift.TBase<Selection, Selection._
       if (struct.isSetSelectionColumns()) {
         {
           oprot.writeI32(struct.selectionColumns.size());
-          for (String _iter60 : struct.selectionColumns)
+          for (String _iter86 : struct.selectionColumns)
           {
-            oprot.writeString(_iter60);
+            oprot.writeString(_iter86);
           }
         }
       }
       if (struct.isSetSelectionSortSequence()) {
         {
           oprot.writeI32(struct.selectionSortSequence.size());
-          for (SelectionSort _iter61 : struct.selectionSortSequence)
+          for (SelectionSort _iter87 : struct.selectionSortSequence)
           {
-            _iter61.write(oprot);
+            _iter87.write(oprot);
           }
         }
       }
@@ -773,27 +773,27 @@ public class Selection implements org.apache.thrift.TBase<Selection, Selection._
       BitSet incoming = iprot.readBitSet(4);
       if (incoming.get(0)) {
         {
-          org.apache.thrift.protocol.TList _list62 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, iprot.readI32());
-          struct.selectionColumns = new ArrayList<String>(_list62.size);
-          String _elem63;
-          for (int _i64 = 0; _i64 < _list62.size; ++_i64)
+          org.apache.thrift.protocol.TList _list88 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRING, iprot.readI32());
+          struct.selectionColumns = new ArrayList<String>(_list88.size);
+          String _elem89;
+          for (int _i90 = 0; _i90 < _list88.size; ++_i90)
           {
-            _elem63 = iprot.readString();
-            struct.selectionColumns.add(_elem63);
+            _elem89 = iprot.readString();
+            struct.selectionColumns.add(_elem89);
           }
         }
         struct.setSelectionColumnsIsSet(true);
       }
       if (incoming.get(1)) {
         {
-          org.apache.thrift.protocol.TList _list65 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRUCT, iprot.readI32());
-          struct.selectionSortSequence = new ArrayList<SelectionSort>(_list65.size);
-          SelectionSort _elem66;
-          for (int _i67 = 0; _i67 < _list65.size; ++_i67)
+          org.apache.thrift.protocol.TList _list91 = new org.apache.thrift.protocol.TList(org.apache.thrift.protocol.TType.STRUCT, iprot.readI32());
+          struct.selectionSortSequence = new ArrayList<SelectionSort>(_list91.size);
+          SelectionSort _elem92;
+          for (int _i93 = 0; _i93 < _list91.size; ++_i93)
           {
-            _elem66 = new SelectionSort();
-            _elem66.read(iprot);
-            struct.selectionSortSequence.add(_elem66);
+            _elem92 = new SelectionSort();
+            _elem92.read(iprot);
+            struct.selectionSortSequence.add(_elem92);
           }
         }
         struct.setSelectionSortSequenceIsSet(true);
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/request/SelectionSort.java b/pinot-common/src/main/java/com/linkedin/pinot/common/request/SelectionSort.java
index e5841dd..bc77db8 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/request/SelectionSort.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/request/SelectionSort.java
@@ -40,7 +40,7 @@ import org.slf4j.LoggerFactory;
  * The results can be sorted based on one or multiple columns
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-5-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class SelectionSort implements org.apache.thrift.TBase<SelectionSort, SelectionSort._Fields>, java.io.Serializable, Cloneable, Comparable<SelectionSort> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("SelectionSort");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/common/response/ProcessingException.java b/pinot-common/src/main/java/com/linkedin/pinot/common/response/ProcessingException.java
index e5f30c8..c3ba3d2 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/common/response/ProcessingException.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/common/response/ProcessingException.java
@@ -38,7 +38,7 @@ import org.slf4j.LoggerFactory;
  * Processing exception
  * 
  */
-@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2017-5-24")
+@Generated(value = "Autogenerated by Thrift Compiler (0.9.2)", date = "2018-12-26")
 public class ProcessingException extends TException implements org.apache.thrift.TBase<ProcessingException, ProcessingException._Fields>, java.io.Serializable, Cloneable, Comparable<ProcessingException> {
   private static final org.apache.thrift.protocol.TStruct STRUCT_DESC = new org.apache.thrift.protocol.TStruct("ProcessingException");
 
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/Pql2AstListener.java b/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/Pql2AstListener.java
index a7ac32a..3aeecb7 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/Pql2AstListener.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/Pql2AstListener.java
@@ -15,37 +15,8 @@
  */
 package com.linkedin.pinot.pql.parsers;
 
-import com.linkedin.pinot.pql.parsers.pql2.ast.AstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.BetweenPredicateAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.BinaryMathOpAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.BooleanOperatorAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.ComparisonPredicateAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.ExpressionParenthesisGroupAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.FloatingPointLiteralAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.FunctionCallAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.GroupByAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.HavingAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.IdentifierAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.InPredicateAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.IntegerLiteralAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.IsPredicateAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.LimitAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.OptionAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.OptionsAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.OrderByAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.OrderByExpressionAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.OutputColumnAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.OutputColumnListAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.PredicateListAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.PredicateParenthesisGroupAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.RegexpLikePredicateAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.SelectAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.StarColumnListAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.StarExpressionAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.StringLiteralAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.TableNameAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.TopAstNode;
-import com.linkedin.pinot.pql.parsers.pql2.ast.WhereAstNode;
+import com.linkedin.pinot.pql.parsers.pql2.ast.*;
+
 import java.util.Stack;
 import org.antlr.v4.runtime.misc.NotNull;
 
@@ -288,6 +259,16 @@ public class Pql2AstListener extends PQL2BaseListener {
   }
 
   @Override
+  public void enterTextMatchPredicate(@NotNull PQL2Parser.TextMatchPredicateContext ctx) {
+    pushNode(new TextMatchPredicateAstNode());
+  }
+
+  @Override
+  public void exitTextMatchPredicate(@NotNull PQL2Parser.TextMatchPredicateContext ctx) {
+    popNode();
+  }
+
+  @Override
   public void enterHaving(@NotNull PQL2Parser.HavingContext ctx) {
     pushNode(new HavingAstNode());
   }
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/RegexpLikePredicateAstNode.java b/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/RegexpLikePredicateAstNode.java
index 761768e..5e64bb6 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/RegexpLikePredicateAstNode.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/RegexpLikePredicateAstNode.java
@@ -41,8 +41,7 @@ public class RegexpLikePredicateAstNode extends PredicateAstNode {
       }
     } else if (childNode instanceof FunctionCallAstNode) {
       throw new Pql2CompilationException("REGEXP_LIKE operator can not be called for a function.");
-    }
-    else {
+    } else {
       super.addChild(childNode);
     }
   }
diff --git a/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/RegexpLikePredicateAstNode.java b/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/TextMatchPredicateAstNode.java
similarity index 64%
copy from pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/RegexpLikePredicateAstNode.java
copy to pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/TextMatchPredicateAstNode.java
index 761768e..f6968cc 100644
--- a/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/RegexpLikePredicateAstNode.java
+++ b/pinot-common/src/main/java/com/linkedin/pinot/pql/parsers/pql2/ast/TextMatchPredicateAstNode.java
@@ -15,18 +15,15 @@
  */
 package com.linkedin.pinot.pql.parsers.pql2.ast;
 
-import java.util.Collections;
-import java.util.HashSet;
-import java.util.List;
-import java.util.Set;
-
 import com.linkedin.pinot.common.request.FilterOperator;
 import com.linkedin.pinot.common.utils.StringUtil;
 import com.linkedin.pinot.common.utils.request.FilterQueryTree;
 import com.linkedin.pinot.common.utils.request.HavingQueryTree;
 import com.linkedin.pinot.pql.parsers.Pql2CompilationException;
 
-public class RegexpLikePredicateAstNode extends PredicateAstNode {
+import java.util.*;
+
+public class TextMatchPredicateAstNode extends PredicateAstNode {
   private static final String SEPERATOR = "\t\t";
   private String _identifier;
 
@@ -37,12 +34,11 @@ public class RegexpLikePredicateAstNode extends PredicateAstNode {
         IdentifierAstNode node = (IdentifierAstNode) childNode;
         _identifier = node.getName();
       } else {
-        throw new Pql2CompilationException("REGEXP_LIKE predicate has more than one identifier.");
+        throw new Pql2CompilationException("TEXT_MATCH predicate has more than one identifier.");
       }
     } else if (childNode instanceof FunctionCallAstNode) {
-      throw new Pql2CompilationException("REGEXP_LIKE operator can not be called for a function.");
-    }
-    else {
+      throw new Pql2CompilationException("TEXT_MATCH operator can not be called for a function.");
+    } else {
       super.addChild(childNode);
     }
   }
@@ -50,10 +46,10 @@ public class RegexpLikePredicateAstNode extends PredicateAstNode {
   @Override
   public FilterQueryTree buildFilterQueryTree() {
     if (_identifier == null) {
-      throw new Pql2CompilationException("REGEXP_LIKE predicate has no identifier");
+      throw new Pql2CompilationException("TEXT_MATCH predicate has no identifier");
     }
 
-    Set<String> values = new HashSet<>();
+    List<String> values = new ArrayList<>();
 
     for (AstNode astNode : getChildren()) {
       if (astNode instanceof LiteralAstNode) {
@@ -62,18 +58,16 @@ public class RegexpLikePredicateAstNode extends PredicateAstNode {
         values.add(expr);
       }
     }
-    if(values.size() > 1) {
-      throw new Pql2CompilationException("Matching more than one regex is NOT supported currently");
+    if (values.size() != 2) {
+      throw new Pql2CompilationException("TEXT_MATCH expects columnName, 'queryString', 'queryOption'");
     }
 
-    String[] valueArray = values.toArray(new String[values.size()]);
-    FilterOperator filterOperator = FilterOperator.REGEXP_LIKE;
-    List<String> value = Collections.singletonList(StringUtil.join(SEPERATOR, valueArray));
-    return new FilterQueryTree(_identifier, value, filterOperator, null);
+    FilterOperator filterOperator = FilterOperator.TEXT_MATCH;
+    return new FilterQueryTree(_identifier, values, filterOperator, null);
   }
 
   @Override
   public HavingQueryTree buildHavingQueryTree() {
-    throw new Pql2CompilationException("REGEXP_LIKE predicate is not supported in HAVING clause.");
+    throw new Pql2CompilationException("TEXT_MATCH predicate is not supported in HAVING clause.");
   }
 }
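
To make the mapping above concrete, here is a small sketch of the FilterQueryTree that buildFilterQueryTree()
produces for the predicate used in the compiler test below. The constructor call mirrors the one in this class;
the column name and literals are the test's own, not part of the committed code:

    import java.util.Arrays;
    import java.util.List;
    import com.linkedin.pinot.common.request.FilterOperator;
    import com.linkedin.pinot.common.utils.request.FilterQueryTree;

    // TEXT_MATCH(col, 'title:"harry"', '') -> identifier "col" plus a two-element value list:
    // index 0 is the Lucene query string, index 1 is the (here empty) query-options string.
    List<String> values = Arrays.asList("title:\"harry\"", "");
    FilterQueryTree tree = new FilterQueryTree("col", values, FilterOperator.TEXT_MATCH, null);
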
diff --git a/pinot-common/src/test/java/com/linkedin/pinot/pql/parsers/Pql2CompilerTest.java b/pinot-common/src/test/java/com/linkedin/pinot/pql/parsers/Pql2CompilerTest.java
index fe49564..3a10e37 100644
--- a/pinot-common/src/test/java/com/linkedin/pinot/pql/parsers/Pql2CompilerTest.java
+++ b/pinot-common/src/test/java/com/linkedin/pinot/pql/parsers/Pql2CompilerTest.java
@@ -235,4 +235,17 @@ public class Pql2CompilerTest {
     Assert.assertEquals(expressions.size(), 1);
     Assert.assertEquals(expressions.get(0), "sub('foo',bar)");
   }
+  @Test
+  public void testTextMatch() {
+
+    // Compile a query with a TEXT_MATCH predicate and inspect the resulting filter query
+    BrokerRequest brokerRequest =
+            COMPILER.compileToBrokerRequest("SELECT foo FROM table where text_match(col, 'title:\"harry\"', '')");
+    System.out.println(brokerRequest.getFilterQuery());
+    System.out.println(brokerRequest.getFilterQuery().getColumn());
+    System.out.println(brokerRequest.getFilterQuery().getValue().size());
+    System.out.println(brokerRequest.getFilterQuery().getValue());
+
+
+  }
 }
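
The new test only prints the compiled filter query. A hedged sketch of the assertions it could make instead,
relying only on the getColumn()/getValue() accessors already used above and assuming the two literal values
come back in the order the AST node adds them (query string first, then options):

    Assert.assertEquals(brokerRequest.getFilterQuery().getColumn(), "col");
    Assert.assertEquals(brokerRequest.getFilterQuery().getValue().size(), 2);
    Assert.assertEquals(brokerRequest.getFilterQuery().getValue().get(0), "title:\"harry\"");
    Assert.assertEquals(brokerRequest.getFilterQuery().getValue().get(1), "");
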
diff --git a/pinot-common/src/thrift/request.thrift b/pinot-common/src/thrift/request.thrift
index 8e72e37..2e400f3 100644
--- a/pinot-common/src/thrift/request.thrift
+++ b/pinot-common/src/thrift/request.thrift
@@ -27,7 +27,8 @@ enum FilterOperator {
   RANGE,
   REGEXP_LIKE,
   NOT_IN,
-  IN
+  IN,
+  TEXT_MATCH
 }
 
 /**
diff --git a/pinot-core/src/main/java/com/linkedin/pinot/core/common/Predicate.java b/pinot-core/src/main/java/com/linkedin/pinot/core/common/Predicate.java
index 425b45e..a6173db 100644
--- a/pinot-core/src/main/java/com/linkedin/pinot/core/common/Predicate.java
+++ b/pinot-core/src/main/java/com/linkedin/pinot/core/common/Predicate.java
@@ -17,12 +17,8 @@ package com.linkedin.pinot.core.common;
 
 import com.linkedin.pinot.common.request.FilterOperator;
 import com.linkedin.pinot.common.utils.request.FilterQueryTree;
-import com.linkedin.pinot.core.common.predicate.EqPredicate;
-import com.linkedin.pinot.core.common.predicate.InPredicate;
-import com.linkedin.pinot.core.common.predicate.NEqPredicate;
-import com.linkedin.pinot.core.common.predicate.NotInPredicate;
-import com.linkedin.pinot.core.common.predicate.RangePredicate;
-import com.linkedin.pinot.core.common.predicate.RegexpLikePredicate;
+import com.linkedin.pinot.core.common.predicate.*;
+
 import java.util.Arrays;
 import java.util.List;
 
@@ -35,7 +31,8 @@ public abstract class Predicate {
     REGEXP_LIKE,
     RANGE,
     IN,
-    NOT_IN;
+    NOT_IN,
+    TEXT_MATCH;
 
     public boolean isExclusive() {
       return this == NEQ || this == NOT_IN;
@@ -96,6 +93,9 @@ public abstract class Predicate {
     case IN:
       predicate = new InPredicate(column, value);
       break;
+    case TEXT_MATCH:
+      predicate = new TextMatchPredicate(column, value);
+      break;
     default:
       throw new UnsupportedOperationException("Unsupported filterType:" + filterType);
     }
diff --git a/pinot-core/src/main/java/com/linkedin/pinot/core/common/predicate/TextMatchPredicate.java b/pinot-core/src/main/java/com/linkedin/pinot/core/common/predicate/TextMatchPredicate.java
new file mode 100644
index 0000000..79ebaef
--- /dev/null
+++ b/pinot-core/src/main/java/com/linkedin/pinot/core/common/predicate/TextMatchPredicate.java
@@ -0,0 +1,48 @@
+/**
+ * Copyright (C) 2014-2018 LinkedIn Corp. (pinot-core@linkedin.com)
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *         http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package com.linkedin.pinot.core.common.predicate;
+
+import com.google.common.base.Preconditions;
+import com.linkedin.pinot.core.common.Predicate;
+
+import java.util.Arrays;
+import java.util.List;
+
+
+public class TextMatchPredicate extends Predicate {
+  String _query;
+  String _options;
+  public TextMatchPredicate(String lhs, List<String> rhs) {
+    super(lhs, Type.TEXT_MATCH, rhs);
+    Preconditions.checkArgument(rhs.size() == 2);
+    _query = rhs.get(0);
+    _options = rhs.get(1);
+  }
+
+  @Override
+  public String toString() {
+    return "Predicate: type: " + getType() + ", left : " + getLhs() + ", right : " + Arrays.toString(new String[]{_query, _options}) + "\n";
+  }
+  
+  public String getQuery(){
+   return _query;
+  }
+
+  public String getQueryOptions(){
+    return _options;
+  }
+
+}
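
A quick usage sketch of the predicate class above, constructed the same way Predicate.newPredicate() does it
(column name plus the two-element value list; the concrete values are illustrative only):

    import java.util.Arrays;
    import com.linkedin.pinot.core.common.predicate.TextMatchPredicate;

    TextMatchPredicate predicate = new TextMatchPredicate("col", Arrays.asList("title:\"harry\"", ""));
    String query = predicate.getQuery();           // "title:\"harry\""
    String options = predicate.getQueryOptions();  // "" in this example
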
diff --git a/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/FilterOperatorUtils.java b/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/FilterOperatorUtils.java
index 8f5b5a1..b262d0e 100644
--- a/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/FilterOperatorUtils.java
+++ b/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/FilterOperatorUtils.java
@@ -53,7 +53,9 @@ public class FilterOperatorUtils {
     // Use inverted index if the predicate type is not RANGE or REGEXP_LIKE for efficiency
     DataSourceMetadata dataSourceMetadata = dataSource.getDataSourceMetadata();
     Predicate.Type predicateType = predicateEvaluator.getPredicateType();
-    if (dataSourceMetadata.hasInvertedIndex() && (predicateType != Predicate.Type.RANGE) && (predicateType
+    if (predicateType == Predicate.Type.TEXT_MATCH) {
+      return new TextMatchFilterOperator(predicateEvaluator, dataSource, startDocId, endDocId);
+    } else if (dataSourceMetadata.hasInvertedIndex() && (predicateType != Predicate.Type.RANGE) && (predicateType
         != Predicate.Type.REGEXP_LIKE)) {
       if (dataSourceMetadata.isSorted()) {
         return new SortedInvertedIndexBasedFilterOperator(predicateEvaluator, dataSource, startDocId, endDocId);
diff --git a/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/TextMatchFilterOperator.java b/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/TextMatchFilterOperator.java
new file mode 100644
index 0000000..70fce75
--- /dev/null
+++ b/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/TextMatchFilterOperator.java
@@ -0,0 +1,71 @@
+/**
+ * Copyright (C) 2014-2018 LinkedIn Corp. (pinot-core@linkedin.com)
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *         http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package com.linkedin.pinot.core.operator.filter;
+
+import com.google.common.base.Preconditions;
+import com.linkedin.pinot.core.common.DataSource;
+import com.linkedin.pinot.core.operator.blocks.FilterBlock;
+import com.linkedin.pinot.core.operator.docidsets.BitmapDocIdSet;
+import com.linkedin.pinot.core.operator.filter.predicate.PredicateEvaluator;
+import com.linkedin.pinot.core.segment.index.readers.InvertedIndexReader;
+import org.roaringbitmap.buffer.ImmutableRoaringBitmap;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.util.ArrayList;
+import java.util.List;
+
+
+public class TextMatchFilterOperator extends BaseFilterOperator {
+  private static final Logger LOGGER = LoggerFactory.getLogger(TextMatchFilterOperator.class);
+  private static final String OPERATOR_NAME = "TextMatchFilterOperator";
+
+  private final PredicateEvaluator _predicateEvaluator;
+  private final DataSource _dataSource;
+  private final ImmutableRoaringBitmap[] _bitmaps;
+  private final int _startDocId;
+  // TODO: change it to exclusive
+  // Inclusive
+  private final int _endDocId;
+  private final boolean _exclusive;
+
+  TextMatchFilterOperator(PredicateEvaluator predicateEvaluator, DataSource dataSource, int startDocId,
+                          int endDocId) {
+    // NOTE:
+    // A predicate that always evaluates to true or false should not be passed into the TextMatchFilterOperator,
+    // for performance reasons.
+    // If predicate is always evaluated as true, use MatchAllFilterOperator; if predicate is always evaluated as false,
+    // use EmptyFilterOperator.
+    Preconditions.checkArgument(!predicateEvaluator.isAlwaysTrue() && !predicateEvaluator.isAlwaysFalse());
+
+    _predicateEvaluator = predicateEvaluator;
+    _dataSource = dataSource;
+    _bitmaps = null;
+    _startDocId = startDocId;
+    _endDocId = endDocId;
+    _exclusive = predicateEvaluator.isExclusive();
+  }
+
+  @Override
+  protected FilterBlock getNextBlock() {
+    throw new UnsupportedOperationException("WIP");
+  }
+
+  @Override
+  public String getOperatorName() {
+    return OPERATOR_NAME;
+  }
+}
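
A note on the still-unimplemented getNextBlock(): the commit message calls out loading the Lucene index and
invoking the search as pending work. The following is not that pending implementation, only a hedged sketch of
how matching Lucene doc ids could be collected into a roaring bitmap (the kind of structure BitmapDocIdSet
consumes); the helper name, the on-disk index path, and the assumption that Lucene doc ids line up with Pinot
doc ids are all hypothetical:

    import java.nio.file.Paths;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.queryparser.classic.QueryParser;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.Query;
    import org.apache.lucene.search.ScoreDoc;
    import org.apache.lucene.search.TopDocs;
    import org.apache.lucene.store.MMapDirectory;
    import org.roaringbitmap.buffer.MutableRoaringBitmap;

    // Hypothetical helper: run a TEXT_MATCH query string against an on-disk Lucene index and
    // return the matching Lucene doc ids as a bitmap.
    static MutableRoaringBitmap searchToBitmap(String indexDir, String column, String queryString) throws Exception {
      try (DirectoryReader reader = DirectoryReader.open(new MMapDirectory(Paths.get(indexDir)))) {
        IndexSearcher searcher = new IndexSearcher(reader);
        Query query = new QueryParser(column, new StandardAnalyzer()).parse(queryString);
        // A custom Collector would avoid materializing TopDocs; capped at maxDoc for this sketch.
        TopDocs topDocs = searcher.search(query, Math.max(1, reader.maxDoc()));
        MutableRoaringBitmap matchingDocs = new MutableRoaringBitmap();
        for (ScoreDoc scoreDoc : topDocs.scoreDocs) {
          matchingDocs.add(scoreDoc.doc);  // assumes Lucene doc id == Pinot docId, which still needs a mapping
        }
        return matchingDocs;
      }
    }
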
diff --git a/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/predicate/PredicateEvaluatorProvider.java b/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/predicate/PredicateEvaluatorProvider.java
index 518c539..ac57911 100644
--- a/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/predicate/PredicateEvaluatorProvider.java
+++ b/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/predicate/PredicateEvaluatorProvider.java
@@ -18,12 +18,7 @@ package com.linkedin.pinot.core.operator.filter.predicate;
 import com.linkedin.pinot.common.data.FieldSpec.DataType;
 import com.linkedin.pinot.core.common.DataSource;
 import com.linkedin.pinot.core.common.Predicate;
-import com.linkedin.pinot.core.common.predicate.EqPredicate;
-import com.linkedin.pinot.core.common.predicate.InPredicate;
-import com.linkedin.pinot.core.common.predicate.NEqPredicate;
-import com.linkedin.pinot.core.common.predicate.NotInPredicate;
-import com.linkedin.pinot.core.common.predicate.RangePredicate;
-import com.linkedin.pinot.core.common.predicate.RegexpLikePredicate;
+import com.linkedin.pinot.core.common.predicate.*;
 import com.linkedin.pinot.core.query.exception.BadQueryRequestException;
 import com.linkedin.pinot.core.segment.index.readers.Dictionary;
 
@@ -50,6 +45,8 @@ public class PredicateEvaluatorProvider {
           case REGEXP_LIKE:
             return RegexpLikePredicateEvaluatorFactory.newDictionaryBasedEvaluator((RegexpLikePredicate) predicate,
                 dictionary);
+          case TEXT_MATCH:
+            throw new UnsupportedOperationException("Text Match predicate is not supported on dictionary encoded columns");
           default:
             throw new UnsupportedOperationException("Unsupported predicate type: " + predicate.getType());
         }
@@ -69,6 +66,9 @@ public class PredicateEvaluatorProvider {
           case REGEXP_LIKE:
             return RegexpLikePredicateEvaluatorFactory.newRawValueBasedEvaluator((RegexpLikePredicate) predicate,
                 dataType);
+          case TEXT_MATCH:
+            return TextMatchPredicateEvaluatorFactory.newRawValueBasedEvaluator((TextMatchPredicate) predicate,
+                    dataType);
           default:
             throw new UnsupportedOperationException("Unsupported predicate type: " + predicate.getType());
         }
diff --git a/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/predicate/TextMatchPredicateEvaluatorFactory.java b/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/predicate/TextMatchPredicateEvaluatorFactory.java
new file mode 100644
index 0000000..556ffbc
--- /dev/null
+++ b/pinot-core/src/main/java/com/linkedin/pinot/core/operator/filter/predicate/TextMatchPredicateEvaluatorFactory.java
@@ -0,0 +1,60 @@
+/**
+ * Copyright (C) 2014-2018 LinkedIn Corp. (pinot-core@linkedin.com)
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *         http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package com.linkedin.pinot.core.operator.filter.predicate;
+
+import com.linkedin.pinot.common.data.FieldSpec;
+import com.linkedin.pinot.core.common.Predicate;
+import com.linkedin.pinot.core.common.predicate.TextMatchPredicate;
+
+
+/**
+ * Factory for TEXT_MATCH predicate evaluators.
+ */
+public class TextMatchPredicateEvaluatorFactory {
+  private TextMatchPredicateEvaluatorFactory() {
+  }
+
+  /**
+   * Create a new instance of raw value based TEXT_MATCH predicate evaluator.
+   *
+   * @param textMatchPredicate TEXT_MATCH predicate to evaluate
+   * @param dataType Data type for the column
+   * @return Raw value based TEXT_MATCH predicate evaluator
+   */
+  public static BaseRawValueBasedPredicateEvaluator newRawValueBasedEvaluator(TextMatchPredicate textMatchPredicate,
+      FieldSpec.DataType dataType) {
+    return new RawValueBasedTextMatchPredicateEvaluator(textMatchPredicate);
+  }
+
+  private static final class RawValueBasedTextMatchPredicateEvaluator extends BaseRawValueBasedPredicateEvaluator {
+    String _query;
+    String _options;
+    public RawValueBasedTextMatchPredicateEvaluator(TextMatchPredicate textMatchPredicate) {
+      _query = textMatchPredicate.getQuery();
+      _options = textMatchPredicate.getQueryOptions();
+    }
+
+    @Override
+    public Predicate.Type getPredicateType() {
+      return Predicate.Type.TEXT_MATCH;
+    }
+
+    @Override
+    public boolean applySV(String value) {
+      throw new UnsupportedOperationException("Text Match is not supported via scanning; it is supported only via the inverted index");
+    }
+  }
+}
diff --git a/pinot-tools/pom.xml b/pinot-tools/pom.xml
index 28bf76a..c664811 100644
--- a/pinot-tools/pom.xml
+++ b/pinot-tools/pom.xml
@@ -78,6 +78,16 @@
       <groupId>org.yaml</groupId>
       <artifactId>snakeyaml</artifactId>
     </dependency>
+    <dependency>
+      <groupId>org.apache.lucene</groupId>
+      <artifactId>lucene-core</artifactId>
+      <version>7.6.0</version>
+    </dependency>
+    <dependency>
+      <groupId>org.apache.lucene</groupId>
+      <artifactId>lucene-queryparser</artifactId>
+      <version>7.6.0</version>
+    </dependency>
   </dependencies>
   <build>
     <plugins>
diff --git a/pinot-tools/src/main/java/com/linkedin/pinot/tools/TextInvertedIndex.java b/pinot-tools/src/main/java/com/linkedin/pinot/tools/TextInvertedIndex.java
new file mode 100644
index 0000000..f4fc23b
--- /dev/null
+++ b/pinot-tools/src/main/java/com/linkedin/pinot/tools/TextInvertedIndex.java
@@ -0,0 +1,126 @@
+package com.linkedin.pinot.tools;
+
+import com.google.common.base.Splitter;
+import com.linkedin.pinot.common.data.DimensionFieldSpec;
+import com.linkedin.pinot.common.data.FieldSpec;
+import com.linkedin.pinot.core.segment.creator.impl.SegmentDictionaryCreator;
+import com.linkedin.pinot.core.segment.creator.impl.inv.OffHeapBitmapInvertedIndexCreator;
+import org.apache.commons.io.IOUtils;
+import org.apache.lucene.analysis.standard.StandardAnalyzer;
+import org.apache.lucene.document.Document;
+import org.apache.lucene.document.Field;
+import org.apache.lucene.document.TextField;
+import org.apache.lucene.index.*;
+import org.apache.lucene.queryparser.classic.QueryParser;
+import org.apache.lucene.search.IndexSearcher;
+import org.apache.lucene.search.Query;
+import org.apache.lucene.search.TermQuery;
+import org.apache.lucene.search.TopDocs;
+import org.apache.lucene.store.Directory;
+import org.apache.lucene.store.FSDirectory;
+import org.apache.lucene.store.MMapDirectory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.IOException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.*;
+
+public class TextInvertedIndex {
+
+    public static void main(String[] args) throws Exception {
+        String rootDir = "/tmp/pinot-index/access-log/";
+//        createIndices(rootDir);
+        loadLuceneIndex(rootDir);
+        return;
+
+    }
+
+    private static void loadLuceneIndex(String rootDir) throws Exception {
+        MMapDirectory directory;
+        Path path = Paths.get(rootDir, "lucene");
+        directory = new MMapDirectory(path);
+        IndexReader reader = DirectoryReader.open(directory);
+        IndexSearcher searcher = new IndexSearcher(reader);
+
+        Query query = new TermQuery(new Term("content","mozilla"));
+        TopDocs topDocs = searcher.search(query, 10);
+
+        System.out.println("topDocs.totalHits = " + topDocs.totalHits);
+
+    }
+
+    private static void createIndices(String rootDir) throws IOException {
+        Splitter splitter = Splitter.on(" ").omitEmptyStrings();
+        List<String> lines = IOUtils.readLines(new FileInputStream(new File("/Users/kishoreg/Downloads/access.log")));
+
+        HashMap<String, Integer> dictionary = new HashMap<>();
+        Map<Integer, Set<Integer>> map = new HashMap<>();
+        int numValues = 0;
+        for (int i = 0; i < lines.size(); i++) {
+            String line = lines.get(i);
+            Iterable<String> tokens = splitter.split(line);
+            TreeSet<Integer> dictIdSet = new TreeSet<>();
+            for (String token : tokens) {
+                token = token.trim();
+                if (!dictionary.containsKey(token)) {
+                    dictionary.put(token, dictionary.size());
+                }
+                dictIdSet.add(dictionary.get(token));
+            }
+            map.put(i, dictIdSet);
+            numValues += dictIdSet.size();
+        }
+        File indexDir;
+        indexDir = new File(rootDir, "roaringBitmap");
+        indexDir.delete();
+        indexDir.mkdirs();
+        FieldSpec fieldSpec = new DimensionFieldSpec();
+        fieldSpec.setDataType(FieldSpec.DataType.STRING);
+        fieldSpec.setSingleValueField(false);
+        fieldSpec.setName("textField");
+        int cardinality = dictionary.size();
+        int numDocs = map.size();
+
+        createRoaringBitmap(dictionary, map, numValues, indexDir, fieldSpec, cardinality, numDocs);
+        createLuceneIndex(lines, rootDir);
+    }
+
+    private static void createLuceneIndex(List<String> lines, String rootDir) throws IOException {
+        Directory index = FSDirectory.open(Paths.get(rootDir, "lucene"));
+        StandardAnalyzer analyzer = new StandardAnalyzer();
+        IndexWriterConfig conf = new IndexWriterConfig(analyzer);
+        conf.setRAMBufferSizeMB(500);
+        IndexWriter writer = new IndexWriter(index, conf);
+        for (int i = 0; i < lines.size(); i++) {
+            Document doc = new Document();
+            doc.add(new TextField("content", lines.get(i), Field.Store.NO));
+            writer.addDocument(doc);
+        }
+        writer.close();
+    }
+
+    private static void createRoaringBitmap(HashMap<String, Integer> dictionary, Map<Integer, Set<Integer>> map, int numValues, File indexDir, FieldSpec fieldSpec, int cardinality, int numDocs) throws IOException {
+        try (OffHeapBitmapInvertedIndexCreator creator = new OffHeapBitmapInvertedIndexCreator(indexDir, fieldSpec, cardinality, numDocs, numValues)) {
+            for (int i = 0; i < map.size(); i++) {
+                Set<Integer> dictIdSet = map.get(i);
+                int[] dictIds = new int[dictIdSet.size()];
+                int k = 0;
+                for (int dictId : dictIdSet) {
+                    dictIds[k++] = dictId;
+                }
+//                System.out.println(dictIds);
+                creator.add(dictIds, dictIds.length);
+            }
+            creator.seal();
+        }
+        String[] sortedValues = new String[dictionary.size()];
+        dictionary.keySet().toArray(sortedValues);
+        Arrays.sort(sortedValues);
+        SegmentDictionaryCreator creator = new SegmentDictionaryCreator(sortedValues, fieldSpec, indexDir);
+        creator.close();
+    }
+
+
+}
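
QueryParser is imported in the tool above but the search is issued with a hand-built TermQuery. A hedged
variant of loadLuceneIndex() that runs the same lookup through QueryParser instead (field name, query text,
and directory layout all taken from the code above):

    private static void loadLuceneIndex(String rootDir) throws Exception {
        try (IndexReader reader = DirectoryReader.open(new MMapDirectory(Paths.get(rootDir, "lucene")))) {
            IndexSearcher searcher = new IndexSearcher(reader);
            // Let the query parser analyze "mozilla" against the "content" field.
            Query query = new QueryParser("content", new StandardAnalyzer()).parse("mozilla");
            TopDocs topDocs = searcher.search(query, 10);
            System.out.println("topDocs.totalHits = " + topDocs.totalHits);
        }
    }
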


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@pinot.apache.org
For additional commands, e-mail: commits-help@pinot.apache.org