Posted to issues@carbondata.apache.org by sounakr <gi...@git.apache.org> on 2017/06/23 05:32:27 UTC

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

GitHub user sounakr opened a pull request:

    https://github.com/apache/carbondata/pull/1079

    [WIP]Measure Filter implementation

    Measure Filter Implementation

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sounakr/incubator-carbondata measure_filter

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1079.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1079
    
----
commit b3fa1780ae0e26fa379d812f9aec1c1c6274b8c6
Author: sounakr <so...@gmail.com>
Date:   2017-06-20T17:22:36Z

    Measure Filter implementation

----



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125198925
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java ---
    @@ -18,56 +18,152 @@
     
     import java.io.IOException;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class ExcludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
       /**
        * is dimension column data is natural sorted
        */
    -  private boolean isNaturalSorted;
    +  private boolean isNaturalSorted = false;
    +
       public ExcludeFilterExecuterImpl(DimColumnResolvedFilterInfo dimColEvaluatorInfo,
    -      SegmentProperties segmentProperties) {
    -    this.dimColEvaluatorInfo = dimColEvaluatorInfo;
    -    dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
    +      MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo, SegmentProperties segmentProperties,
    +      boolean isMeasure) {
         this.segmentProperties = segmentProperties;
    -    FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), segmentProperties,
    -        dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo);
    -    isNaturalSorted = dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && dimColEvaluatorInfo
    -        .getDimension().isSortColumn();
    +    if (isMeasure == false) {
    --- End diff --
    
    Done



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153736
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
    @@ -152,12 +261,31 @@ private BitSet setFilterdIndexToBitSet(DimensionColumnDataChunk dimensionColumnD
     
       public BitSet isScanRequired(byte[][] blkMaxVal, byte[][] blkMinVal) {
         BitSet bitSet = new BitSet(1);
    -    byte[][] filterValues = dimColumnExecuterInfo.getFilterKeys();
    -    int columnIndex = dimColumnEvaluatorInfo.getColumnIndex();
    -    int blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex);
    +    byte[][] filterValues = null;
    +    int columnIndex = 0;
    +    int blockIndex = 0;
    +    boolean isScanRequired = false;
    +
    +    if (isDimensionPresentInCurrentBlock == true) {
    +      filterValues = dimColumnExecuterInfo.getFilterKeys();
    +      columnIndex = dimColumnEvaluatorInfo.getColumnIndex();
    +      blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex);
    +      isScanRequired =
    +          isScanRequired(blkMaxVal[blockIndex], blkMinVal[blockIndex], filterValues);
    +
    +    } else if (isMeasurePresentInCurrentBlock) {
    +      filterValues = msrColumnExecutorInfo.getFilterKeys();
    +      columnIndex = msrColumnEvaluatorInfo.getColumnIndex();
    +      // blockIndex =
    +      // segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex) + segmentProperties
    +      //         .getLastDimensionColOrdinal();
    --- End diff --
    
    remove commented code



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125199276
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
    @@ -17,65 +17,174 @@
     package org.apache.carbondata.core.scan.filter.executer;
     
     import java.io.IOException;
    +import java.math.BigDecimal;
    +import java.nio.ByteBuffer;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.ByteUtil;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class IncludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColumnEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
       /**
        * is dimension column data is natural sorted
        */
    -  private boolean isNaturalSorted;
    +  private boolean isNaturalSorted = false;
    --- End diff --
    
    Done.



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125190753
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/statistics/ColumnPageStatsVO.java ---
    @@ -56,9 +65,7 @@ public ColumnPageStatsVO(DataType dataType) {
             nonExistValue = Double.MIN_VALUE;
             break;
           case DECIMAL:
    -        max = new BigDecimal(Double.MIN_VALUE);
    -        min = new BigDecimal(Double.MAX_VALUE);
    -        nonExistValue = new BigDecimal(Double.MIN_VALUE);
    +        this.zeroDecimal = new BigDecimal(0);
    --- End diff --
    
    Done



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191588
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/MeasureColumnExecuterFilterInfo.java ---
    @@ -0,0 +1,30 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.core.scan.filter.executer;
    +
    +public class MeasureColumnExecuterFilterInfo {
    +
    +  byte[][] filterKeys;
    --- End diff --
    
    In the current implementation, filterKeys is kept as a byte array to keep it simple and in sync with the dimension arrays. During the actual comparison the filter keys are converted back to objects and compared. A follow-up optimization will change the filter keys to hold objects.
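
    For illustration, a minimal, self-contained sketch of the decode-then-compare approach described above. All names here are hypothetical, not the PR's code; in the PR the decoding is done by DataTypeUtil.getMeasureObjectFromDataType (quoted later in this thread), and the decimal decoding below is simplified.

    import java.math.BigDecimal;
    import java.math.BigInteger;
    import java.nio.ByteBuffer;

    public class MeasureFilterKeySketch {

      enum MsrType { SHORT, INT, LONG, DECIMAL, DOUBLE }

      // Decode a serialized filter key back into a comparable Object. Integral
      // measures are assumed to be stored as 8-byte longs; decimals as the
      // unscaled value's bytes (simplified; the PR uses byteToBigDecimal).
      static Object decode(byte[] data, MsrType type) {
        switch (type) {
          case SHORT:
          case INT:
          case LONG:
            return ByteBuffer.wrap(data).getLong();
          case DECIMAL:
            return new BigDecimal(new BigInteger(data));
          default:
            return ByteBuffer.wrap(data).getDouble();
        }
      }

      // Compare a page value against every filter key after decoding both sides.
      static boolean matchesAny(byte[][] filterKeys, byte[] pageValue, MsrType type) {
        Object value = decode(pageValue, type);
        for (byte[] key : filterKeys) {
          if (decode(key, type).equals(value)) {
            return true;
          }
        }
        return false;
      }
    }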



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191734
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtThanFiterExecuterImpl.java ---
    @@ -74,80 +87,205 @@ private void ifDefaultValueMatchesFilter() {
               }
             }
           }
    +    } else if (!msrColEvalutorInfoList.isEmpty() && !isMeasurePresentInCurrentBlock[0]) {
    +      CarbonMeasure measure = this.msrColEvalutorInfoList.get(0).getMeasure();
    +      byte[] defaultValue = measure.getDefaultValue();
    +      if (null != defaultValue) {
    +        for (int k = 0; k < filterRangeValues.length; k++) {
    +          int maxCompare =
    +              ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
    +          if (maxCompare < 0) {
    +            isDefaultValuePresentInFilter = true;
    +            break;
    +          }
    +        }
    +      }
         }
       }
     
       @Override public BitSet isScanRequired(byte[][] blockMaxValue, byte[][] blockMinValue) {
         BitSet bitSet = new BitSet(1);
    -    boolean isScanRequired =
    -        isScanRequired(blockMaxValue[dimensionBlocksIndex[0]], filterRangeValues);
    +    boolean isScanRequired = false;
    +    byte[] maxValue = null;
    +    if (isMeasurePresentInCurrentBlock[0] || isDimensionPresentInCurrentBlock[0]) {
    +      if (isMeasurePresentInCurrentBlock[0]) {
    +        maxValue = blockMaxValue[measureBlocksIndex[0] + lastDimensionColOrdinal];
    +        isScanRequired =
    +            isScanRequired(maxValue, filterRangeValues, msrColEvalutorInfoList.get(0).getType());
    +      } else {
    +        maxValue = blockMaxValue[dimensionBlocksIndex[0]];
    +        isScanRequired = isScanRequired(maxValue, filterRangeValues);
    +      }
    +    } else {
    +      isScanRequired = isDefaultValuePresentInFilter;
    +    }
    +
         if (isScanRequired) {
           bitSet.set(0);
         }
         return bitSet;
       }
     
    +
       private boolean isScanRequired(byte[] blockMaxValue, byte[][] filterValues) {
         boolean isScanRequired = false;
    -    if (isDimensionPresentInCurrentBlock[0]) {
    -      for (int k = 0; k < filterValues.length; k++) {
    -        // filter value should be in range of max and min value i.e
    -        // max>filtervalue>min
    -        // so filter-max should be negative
    -        int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    -        // if any filter value is in range than this block needs to be
    -        // scanned means always less than block max range.
    -        if (maxCompare < 0) {
    -          isScanRequired = true;
    -          break;
    -        }
    +    for (int k = 0; k < filterValues.length; k++) {
    +      // filter value should be in range of max and min value i.e
    +      // max>filtervalue>min
    +      // so filter-max should be negative
    +      int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    +      // if any filter value is in range than this block needs to be
    +      // scanned less than equal to max range.
    +      if (maxCompare < 0) {
    +        isScanRequired = true;
    +        break;
           }
    -    } else {
    -      isScanRequired = isDefaultValuePresentInFilter;
         }
         return isScanRequired;
       }
     
    +  private boolean isScanRequired(byte[] maxValue, byte[][] filterValue,
    +      DataType dataType) {
    +    for (int i = 0; i < filterValue.length; i++) {
    +      if (filterValue[i].length == 0 || maxValue.length == 0) {
    +        return isScanRequired(maxValue, filterValue);
    +      }
    +      switch (dataType) {
    --- End diff --
    
    Line 150 is the special null-value case; for the rest of the cases the comparator is used.
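
    For illustration, a minimal sketch of the comparator-based pruning this reply describes, restricted to long-typed measures to keep it short. The names are hypothetical, not the PR's code; zero-length byte arrays mark the special null-value case, for which the PR falls back to the byte-level isScanRequired overload.

    import java.nio.ByteBuffer;
    import java.util.Comparator;

    public class MeasureScanPruneSketch {

      // Type-aware comparator; long-typed measures only in this sketch.
      static final Comparator<Long> LONG_CMP = Long::compare;

      // "Greater than" pruning: the block must be scanned only if some filter
      // value is strictly below the block max (filter < max means rows may match).
      static boolean isScanRequired(byte[] maxValue, byte[][] filterValues) {
        for (byte[] filter : filterValues) {
          if (filter.length == 0 || maxValue.length == 0) {
            // special null-value case: no decoded form exists, so the PR
            // falls back to the raw byte comparison instead
            continue;
          }
          long filterVal = ByteBuffer.wrap(filter).getLong();
          long maxVal = ByteBuffer.wrap(maxValue).getLong();
          if (LONG_CMP.compare(filterVal, maxVal) < 0) {
            return true;
          }
        }
        return false;
      }
    }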



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152252
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/statistics/ColumnPageStatsVO.java ---
    @@ -37,9 +41,14 @@
        */
       private Object nonExistValue;
     
    -  /** decimal count of the measures */
    +  /**
    +   * decimal count of the measures
    +   */
       private int decimal;
     
    +  private boolean isFirst = true;
    +  private BigDecimal zeroDecimal;
    --- End diff --
    
    use `BigDecimal.ZERO`
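
    For illustration, the difference the suggestion points at (hypothetical class name):

    import java.math.BigDecimal;

    public class ZeroDecimalExample {
      BigDecimal allocated = new BigDecimal(0); // allocates a fresh object per instance
      BigDecimal shared = BigDecimal.ZERO;      // reuses the JDK's shared constant
    }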



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125199254
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java ---
    @@ -18,56 +18,152 @@
     
     import java.io.IOException;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class ExcludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
       /**
        * is dimension column data is natural sorted
        */
    -  private boolean isNaturalSorted;
    +  private boolean isNaturalSorted = false;
    +
       public ExcludeFilterExecuterImpl(DimColumnResolvedFilterInfo dimColEvaluatorInfo,
    -      SegmentProperties segmentProperties) {
    -    this.dimColEvaluatorInfo = dimColEvaluatorInfo;
    -    dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
    +      MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo, SegmentProperties segmentProperties,
    +      boolean isMeasure) {
         this.segmentProperties = segmentProperties;
    -    FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), segmentProperties,
    -        dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo);
    -    isNaturalSorted = dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && dimColEvaluatorInfo
    -        .getDimension().isSortColumn();
    +    if (isMeasure == false) {
    +      this.dimColEvaluatorInfo = dimColEvaluatorInfo;
    +      dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
    +
    +      FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), segmentProperties,
    +          dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo, null, null);
    +      isDimensionPresentInCurrentBlock = true;
    +      isNaturalSorted =
    +          dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && dimColEvaluatorInfo
    +              .getDimension().isSortColumn();
    +    } else {
    +      this.msrColumnEvaluatorInfo = msrColumnEvaluatorInfo;
    +      msrColumnExecutorInfo = new MeasureColumnExecuterFilterInfo();
    +      FilterUtil
    +          .prepareKeysFromSurrogates(msrColumnEvaluatorInfo.getFilterValues(), segmentProperties,
    +              null, null, msrColumnEvaluatorInfo.getMeasure(), msrColumnExecutorInfo);
    +      isMeasurePresentInCurrentBlock = true;
    +    }
    +
       }
     
       @Override public BitSetGroup applyFilter(BlocksChunkHolder blockChunkHolder) throws IOException {
    -    int blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping()
    -        .get(dimColEvaluatorInfo.getColumnIndex());
    -    if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
    -      blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    -          .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
    +    if (isDimensionPresentInCurrentBlock == true) {
    +      int blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping()
    +          .get(dimColEvaluatorInfo.getColumnIndex());
    +      if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
    +        blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    +            .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
    +      }
    +      DimensionRawColumnChunk dimensionRawColumnChunk =
    +          blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
    +      DimensionColumnDataChunk[] dimensionColumnDataChunks =
    +          dimensionRawColumnChunk.convertToDimColDataChunks();
    +      BitSetGroup bitSetGroup = new BitSetGroup(dimensionRawColumnChunk.getPagesCount());
    +      for (int i = 0; i < dimensionColumnDataChunks.length; i++) {
    +        BitSet bitSet = getFilteredIndexes(dimensionColumnDataChunks[i],
    +            dimensionRawColumnChunk.getRowCount()[i]);
    +        bitSetGroup.setBitSet(bitSet, i);
    +      }
    +
    +      return bitSetGroup;
    +    } else if (isMeasurePresentInCurrentBlock == true) {
    +      int blockIndex = segmentProperties.getMeasuresOrdinalToBlockMapping()
    +          .get(msrColumnEvaluatorInfo.getColumnIndex());
    +      if (null == blockChunkHolder.getMeasureRawDataChunk()[blockIndex]) {
    +        blockChunkHolder.getMeasureRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    +            .getMeasureChunk(blockChunkHolder.getFileReader(), blockIndex);
    +      }
    +      MeasureRawColumnChunk measureRawColumnChunk =
    +          blockChunkHolder.getMeasureRawDataChunk()[blockIndex];
    +      MeasureColumnDataChunk[] measureColumnDataChunks =
    +          measureRawColumnChunk.convertToMeasureColDataChunks();
    +      BitSetGroup bitSetGroup = new BitSetGroup(measureRawColumnChunk.getPagesCount());
    +      DataType msrType = getMeasureDataType(msrColumnEvaluatorInfo);
    +      for (int i = 0; i < measureColumnDataChunks.length; i++) {
    +        BitSet bitSet =
    +            getFilteredIndexes(measureColumnDataChunks[i], measureRawColumnChunk.getRowCount()[i],
    +                msrType);
    +        bitSetGroup.setBitSet(bitSet, i);
    +      }
    +      return bitSetGroup;
         }
    -    DimensionRawColumnChunk dimensionRawColumnChunk =
    -        blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
    -    DimensionColumnDataChunk[] dimensionColumnDataChunks =
    -        dimensionRawColumnChunk.convertToDimColDataChunks();
    -    BitSetGroup bitSetGroup =
    -        new BitSetGroup(dimensionRawColumnChunk.getPagesCount());
    -    for (int i = 0; i < dimensionColumnDataChunks.length; i++) {
    -      BitSet bitSet = getFilteredIndexes(dimensionColumnDataChunks[i],
    -          dimensionRawColumnChunk.getRowCount()[i]);
    -      bitSetGroup.setBitSet(bitSet, i);
    +    return null;
    +  }
    +
    +  private DataType getMeasureDataType(MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo) {
    +    switch (msrColumnEvaluatorInfo.getType()) {
    +      case SHORT:
    +        return DataType.SHORT;
    +      case INT:
    +        return DataType.INT;
    +      case LONG:
    +        return DataType.LONG;
    +      case DECIMAL:
    +        return DataType.DECIMAL;
    +      default:
    +        return DataType.DOUBLE;
         }
    +  }
     
    -    return bitSetGroup;
    +  protected BitSet getFilteredIndexes(MeasureColumnDataChunk measureColumnDataChunk,
    +      int numerOfRows, DataType msrType) {
    +    // Here the algorithm is
    +    // Get the measure values from the chunk. compare sequentially with the
    +    // the filter values. The one that matches sets it Bitset.
    +    BitSet bitSet = new BitSet(numerOfRows);
    +    bitSet.flip(0, numerOfRows);
    +    byte[][] filterValues = msrColumnExecutorInfo.getFilterKeys();
    --- End diff --
    
    As of now filterKeys uses a byte array; a later optimization will convert it to store objects.
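
    For illustration, a minimal, self-contained sketch of the sequential page scan the quoted method begins, with exclude semantics and long-typed measures only. The names are hypothetical, not the PR's code.

    import java.nio.ByteBuffer;
    import java.util.BitSet;

    public class MeasurePageExcludeSketch {

      // Exclude semantics, mirroring the quoted method: start with every row
      // set, then clear the bit for each row whose value matches a filter key.
      // The byte-array keys are decoded once, outside the row loop.
      static BitSet getFilteredIndexes(long[] pageValues, byte[][] filterKeys) {
        BitSet bitSet = new BitSet(pageValues.length);
        bitSet.flip(0, pageValues.length);
        long[] keys = new long[filterKeys.length];
        for (int i = 0; i < filterKeys.length; i++) {
          keys[i] = ByteBuffer.wrap(filterKeys[i]).getLong();
        }
        for (int row = 0; row < pageValues.length; row++) {
          for (long key : keys) {
            if (pageValues[row] == key) {
              bitSet.clear(row);
              break;
            }
          }
        }
        return bitSet;
      }
    }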



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191770
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
    @@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
         }
       }
     
    +  public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
    +    ByteBuffer bb = ByteBuffer.wrap(data);
    +    switch (dataType) {
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        return bb.getLong();
    +      case DECIMAL:
    +        return byteToBigDecimal(data);
    +      default:
    +        return bb.getDouble();
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  //  public static byte[] getMeasureByteArrayBasedOnDataType(String msrValue, DataType dataType,
    +  //      CarbonMeasure carbonMeasure) {
    +  //    switch (dataType) {
    +  //      case DECIMAL:
    +  //        BigDecimal bigDecimal =
    +  //            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +  //       return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +  //      case SHORT:
    +  //        return ByteUtil.toBytes((Short.parseShort(msrValue)));
    +  //      case INT:
    +  //        return ByteUtil.toBytes(Integer.parseInt(msrValue));
    +  //      case LONG:
    +  //        return ByteUtil.toBytes(Long.valueOf(msrValue));
    +  //      default:
    +  //        Double parsedValue = Double.valueOf(msrValue);
    +  //        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +  //          return null;
    +  //        }
    +  //        return ByteUtil.toBytes(parsedValue);
    +  //    }
    +  //  }
    --- End diff --
    
    Done



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152345
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/statistics/ColumnPageStatsVO.java ---
    @@ -102,9 +109,15 @@ public void update(Object value) {
             break;
           case DECIMAL:
             BigDecimal decimalValue = DataTypeUtil.byteToBigDecimal((byte[]) value);
    -        decimal = decimalValue.scale();
    -        BigDecimal val = (BigDecimal) min;
    -        nonExistValue = (val.subtract(new BigDecimal(1.0)));
    +        if (isFirst) {
    --- End diff --
    
    I don't think this `isFirst` is required; just check null for max or min here.
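
    For illustration, a minimal sketch of the suggested pattern (hypothetical class, not the PR's code): a null min/max stands in for "no value seen yet", so no separate first-value flag is needed.

    import java.math.BigDecimal;

    public class DecimalStatsSketch {
      private BigDecimal min;
      private BigDecimal max;

      // No isFirst flag: null min/max means no value has been seen yet.
      void update(BigDecimal value) {
        if (min == null || value.compareTo(min) < 0) {
          min = value;
        }
        if (max == null || value.compareTo(max) > 0) {
          max = value;
        }
      }
    }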



[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    
    Refer to this link for build results (access rights to CI server needed): 
    https://builds.apache.org/job/carbondata-pr-spark-1.6/683/

    Failed Tests: 78

    org.apache.carbondata:carbondata-spark: 7 failures, in
    SparkDatasourceSuite (read and write using CarbonContext; read and write
    using CarbonContext with compression; test overwrite; read and write using
    CarbonContext, multiple load; query using SQLContext; query using
    SQLContext without providing schema) and DataCompactionTest (check if
    compaction with Updates).

    org.apache.carbondata:carbondata-spark-common-test: 71 failures, including
    TestBigInt, TestNullAndEmptyFields, TestNullAndEmptyFieldsUnsafe,
    TestGlobalSortDataLoad, TestLoadDataFrame, TestLoadDataWithDiffTimestampFormat,
    TestLoadDataWithHiveSyntax (DefaultFormat, Unsafe, V1Format, V2Format),
    ExpressionWithNullTestCase, RangeFilterMyTests, FilterProcessorTestCase,
    NullMeasureValueTestCaseFilter, DeleteCarbonTableTestCase,
    HorizontalCompactionTestCase, UpdateCarbonTableTestCase,
    TestNullValueSerialization, TestAllDataTypeForPartitionTable,
    TestDataLoadingForPartitionTable, TestQueryForPartitionTable,
    TestSortColumns, and TestSortColumnsWithUnsafe (report truncated in the
    archive).
 olumnsWithUnsafe/unsorted_table_creation__query_data_loading_with_heap_and_safe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query data loading with heap and safe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/683/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_data_loading_with_heap_and_unsafe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and data loading with heap and unsafe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/683/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_loading_w
 ith_heap_and_inmemory_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and loading with heap and inmemory sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/683/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_data_loading_with_offheap_and_safe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and data loading with offheap and safe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/683/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_data_loading_with_offheap_and_unsafe_sort_config/'><strong>org
 .apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and data loading with offheap and unsafe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/683/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_data_loading_with_offheap_and_inmemory_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and data loading with offheap and inmemory sort config</strong></a></li></ul>



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153056
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java ---
    @@ -18,56 +18,152 @@
     
     import java.io.IOException;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class ExcludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
    --- End diff --
    
    I don't think all these flags are required; a `null` check on `dimColumnExecuterInfo` and `msrColumnExecutorInfo` is enough.
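
    A minimal sketch of what that could look like, assuming the fields shown in the diff above (illustrative, not the final patch):

        // Hypothetical: infer "present in current block" from null checks
        // instead of carrying separate boolean flags.
        if (null != dimColumnExecuterInfo) {
          // dimension exclude-filter path
        } else if (null != msrColumnExecutorInfo) {
          // measure exclude-filter path
        }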


---

[GitHub] carbondata issue #1079: [CARBONDATA-1257] Measure Filter implementation

Posted by zzcclp <gi...@git.apache.org>.
Github user zzcclp commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    @sounakr @ravipesala any progress on this PR? It was already merged onto branch-1.1.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154226
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtrThanEquaToFilterExecuterImpl.java ---
    @@ -91,67 +131,167 @@ private void ifDefaultValueMatchesFilter() {
     
       private boolean isScanRequired(byte[] blockMaxValue, byte[][] filterValues) {
         boolean isScanRequired = false;
    -    if (isDimensionPresentInCurrentBlock[0]) {
    -      for (int k = 0; k < filterValues.length; k++) {
    -        // filter value should be in range of max and min value i.e
    -        // max>filtervalue>min
    -        // so filter-max should be negative
    -        int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    -        // if any filter value is in range than this block needs to be
    -        // scanned less than equal to max range.
    -        if (maxCompare <= 0) {
    -          isScanRequired = true;
    -          break;
    -        }
    +    for (int k = 0; k < filterValues.length; k++) {
    +      // filter value should be in range of max and min value i.e
    +      // max>filtervalue>min
    +      // so filter-max should be negative
    +      int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    +      // if any filter value is in range than this block needs to be
    +      // scanned less than equal to max range.
    +      if (maxCompare <= 0) {
    +        isScanRequired = true;
    +        break;
           }
    -    } else {
    -      isScanRequired = isDefaultValuePresentInFilter;
         }
         return isScanRequired;
       }
     
    +  private boolean isScanRequired(byte[] maxValue, byte[][] filterValue,
    +      DataType dataType) {
    +    for (int i = 0; i < filterValue.length; i++) {
    +      if (filterValue[i].length == 0 || maxValue.length == 0) {
    +        return isScanRequired(maxValue, filterValue);
    +      }
    +      switch (dataType) {
    +        case DOUBLE:
    +          double maxValueDouble = ByteBuffer.wrap(maxValue).getDouble();
    +          double filterValueDouble = ByteBuffer.wrap(filterValue[i]).getDouble();
    +          if (filterValueDouble <= maxValueDouble) {
    +            return true;
    +          }
    +          break;
    +        case INT:
    +        case SHORT:
    +        case LONG:
    +          long maxValueLong = ByteBuffer.wrap(maxValue).getLong();
    +          long filterValueLong = ByteBuffer.wrap(filterValue[i]).getLong();
    +          if (filterValueLong <= maxValueLong) {
    +            return true;
    +          }
    +          break;
    +        case DECIMAL:
    +          BigDecimal maxDecimal = DataTypeUtil.byteToBigDecimal(maxValue);
    +          BigDecimal filterDecimal = DataTypeUtil.byteToBigDecimal(filterValue[i]);
    +          if (filterDecimal.compareTo(maxDecimal) <= 0) {
    +            return true;
    +          }
    +      }
    +    }
    +    return false;
    +  }
    +
       @Override public BitSetGroup applyFilter(BlocksChunkHolder blockChunkHolder)
           throws FilterUnsupportedException, IOException {
         // select all rows if dimension does not exists in the current block
    -    if (!isDimensionPresentInCurrentBlock[0]) {
    +    if (!isDimensionPresentInCurrentBlock[0] && !isMeasurePresentInCurrentBlock[0]) {
           int numberOfRows = blockChunkHolder.getDataBlock().nodeSize();
           return FilterUtil
               .createBitSetGroupWithDefaultValue(blockChunkHolder.getDataBlock().numberOfPages(),
                   numberOfRows, true);
         }
    -    int blockIndex =
    -        segmentProperties.getDimensionOrdinalToBlockMapping().get(dimensionBlocksIndex[0]);
    -    if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
    -      blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    -          .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
    -    }
    -    DimensionRawColumnChunk rawColumnChunk =
    -        blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
    -    BitSetGroup bitSetGroup = new BitSetGroup(rawColumnChunk.getPagesCount());
    -    for (int i = 0; i < rawColumnChunk.getPagesCount(); i++) {
    -      if (rawColumnChunk.getMaxValues() != null) {
    -        if (isScanRequired(rawColumnChunk.getMaxValues()[i], this.filterRangeValues)) {
    -          int compare = ByteUtil.UnsafeComparer.INSTANCE
    -              .compareTo(filterRangeValues[0], rawColumnChunk.getMinValues()[i]);
    -          if (compare <= 0) {
    -            BitSet bitSet = new BitSet(rawColumnChunk.getRowCount()[i]);
    -            bitSet.flip(0, rawColumnChunk.getRowCount()[i]);
    -            bitSetGroup.setBitSet(bitSet, i);
    -          } else {
    -            BitSet bitSet = getFilteredIndexes(rawColumnChunk.convertToDimColDataChunk(i),
    -                rawColumnChunk.getRowCount()[i]);
    -            bitSetGroup.setBitSet(bitSet, i);
    +
    +    if (isDimensionPresentInCurrentBlock[0]) {
    +      int blockIndex =
    +          segmentProperties.getDimensionOrdinalToBlockMapping().get(dimensionBlocksIndex[0]);
    +      if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
    +        blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    +            .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
    +      }
    +      DimensionRawColumnChunk rawColumnChunk =
    +          blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
    +      BitSetGroup bitSetGroup = new BitSetGroup(rawColumnChunk.getPagesCount());
    +      for (int i = 0; i < rawColumnChunk.getPagesCount(); i++) {
    +        if (rawColumnChunk.getMaxValues() != null) {
    +          if (isScanRequired(rawColumnChunk.getMaxValues()[i], this.filterRangeValues)) {
    +            int compare = ByteUtil.UnsafeComparer.INSTANCE
    +                .compareTo(filterRangeValues[0], rawColumnChunk.getMinValues()[i]);
    +            if (compare <= 0) {
    +              BitSet bitSet = new BitSet(rawColumnChunk.getRowCount()[i]);
    +              bitSet.flip(0, rawColumnChunk.getRowCount()[i]);
    +              bitSetGroup.setBitSet(bitSet, i);
    +            } else {
    +              BitSet bitSet = getFilteredIndexes(rawColumnChunk.convertToDimColDataChunk(i),
    +                  rawColumnChunk.getRowCount()[i]);
    +              bitSetGroup.setBitSet(bitSet, i);
    +            }
               }
    +        } else {
    +          BitSet bitSet = getFilteredIndexes(rawColumnChunk.convertToDimColDataChunk(i),
    +              rawColumnChunk.getRowCount()[i]);
    +          bitSetGroup.setBitSet(bitSet, i);
    +        }
    +      }
    +      return bitSetGroup;
    +    } else if (isMeasurePresentInCurrentBlock[0]) {
    +      int blockIndex =
    +          segmentProperties.getMeasuresOrdinalToBlockMapping().get(measureBlocksIndex[0]);
    +      if (null == blockChunkHolder.getMeasureRawDataChunk()[blockIndex]) {
    +        blockChunkHolder.getMeasureRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    +            .getMeasureChunk(blockChunkHolder.getFileReader(), blockIndex);
    +      }
    +      MeasureRawColumnChunk rawColumnChunk = blockChunkHolder.getMeasureRawDataChunk()[blockIndex];
    +      BitSetGroup bitSetGroup = new BitSetGroup(rawColumnChunk.getPagesCount());
    +      for (int i = 0; i < rawColumnChunk.getPagesCount(); i++) {
    +        if (rawColumnChunk.getMaxValues() != null) {
    +          if (isScanRequired(rawColumnChunk.getMaxValues()[i], this.filterRangeValues,
    +              msrColEvalutorInfoList.get(0).getType())) {
    +            int compare = ByteUtil.UnsafeComparer.INSTANCE
    --- End diff --
    
    Binary comparison cannot be used here 
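
    For context, the order of the raw serialized bytes does not match numeric order once signed values are involved, so a lexicographic byte compare (as `ByteUtil.UnsafeComparer` effectively performs) can invert results. A self-contained illustration, not taken from the patch:

        import java.nio.ByteBuffer;

        // -1L serializes (big-endian, two's complement) to 0xFF...FF, which an
        // unsigned lexicographic compare ranks *above* the bytes of 1L.
        public class ByteCompareDemo {
          static byte[] toBytes(long v) {
            return ByteBuffer.allocate(8).putLong(v).array();
          }

          public static void main(String[] args) {
            byte[] minusOne = toBytes(-1L);
            byte[] one = toBytes(1L);
            int cmp = 0;
            for (int i = 0; i < 8 && cmp == 0; i++) {
              cmp = Integer.compare(minusOne[i] & 0xFF, one[i] & 0xFF);
            }
            System.out.println(cmp);                   // 1: bytes say -1 > 1
            System.out.println(Long.compare(-1L, 1L)); // -1: numerically -1 < 1
          }
        }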


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build failed with Spark 2.1.0. Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2833/



---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build failed with Spark 2.1.0. Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2785/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125190884
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -180,9 +185,27 @@ private static FilterExecuter createFilterExecuterTree(
        * @return
        */
       private static FilterExecuter getIncludeFilterExecuter(
    -      DimColumnResolvedFilterInfo dimColResolvedFilterInfo, SegmentProperties segmentProperties) {
    -
    -    if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
    +      DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
    +      MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
    +      SegmentProperties segmentProperties) {
    +    if (null != msrColResolvedFilterInfo && msrColResolvedFilterInfo.getMeasure().isColumnar()) {
    +      CarbonMeasure measuresFromCurrentBlock = segmentProperties
    +          .getMeasureFromCurrentBlock(msrColResolvedFilterInfo.getMeasure().getColumnId());
    +      if (null != measuresFromCurrentBlock) {
    +        // update dimension and column index according to the dimension position in current block
    +        MeasureColumnResolvedFilterInfo msrColResolvedFilterInfoCopyObject =
    +            msrColResolvedFilterInfo.getCopyObject();
    +        msrColResolvedFilterInfoCopyObject.setMeasure(measuresFromCurrentBlock);
    +        msrColResolvedFilterInfoCopyObject.setColumnIndex(measuresFromCurrentBlock.getOrdinal());
    +        msrColResolvedFilterInfoCopyObject.setType(measuresFromCurrentBlock.getDataType());
    +        return new IncludeFilterExecuterImpl(null, msrColResolvedFilterInfoCopyObject,
    +            segmentProperties, true);
    +      } else {
    +        return new RestructureIncludeFilterExecutorImpl(dimColResolvedFilterInfo,
    +            msrColResolvedFilterInfo, segmentProperties, true);
    +      }
    +    }
    --- End diff --
    
    No, this whole block belongs to measures. Line 220 points to the restructure path for dimensions.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154292
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/partition/PartitionFilterUtil.java ---
    @@ -76,24 +99,25 @@ public static Comparator getComparatorByDataType(DataType dataType) {
     
       static class DoubleComparator implements Comparator<Object> {
         @Override public int compare(Object key1, Object key2) {
    -      double result = (double) key1 - (double) key2;
    -      if (result < 0) {
    +      double key1Double1 = (double)key1;
    --- End diff --
    
    Why does this logic need to change? The old logic seems fine, right?
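
    One corner case where the subtraction form and an explicit comparison disagree, as a self-contained illustration (not from the patch): subtraction yields NaN for NaN inputs, which reports neither less-than nor greater-than, while Double.compare gives a total order.

        // Illustrative only: why an explicit comparison is safer than
        // subtraction for doubles.
        public class DoubleCompareDemo {
          public static void main(String[] args) {
            double a = Double.NaN, b = 1.0;
            double result = a - b;                    // NaN
            System.out.println(result < 0);           // false
            System.out.println(result > 0);           // false -> treated as "equal"
            System.out.println(Double.compare(a, b)); // 1: NaN sorts after 1.0
          }
        }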


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191929
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
    @@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
         }
       }
     
    +  public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
    +    ByteBuffer bb = ByteBuffer.wrap(data);
    --- End diff --
    
    Done


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125190897
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -209,9 +233,29 @@ private static FilterExecuter getIncludeFilterExecuter(
        * @return
        */
       private static FilterExecuter getExcludeFilterExecuter(
    -      DimColumnResolvedFilterInfo dimColResolvedFilterInfo, SegmentProperties segmentProperties) {
    +      DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
    +      MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
    +      SegmentProperties segmentProperties) {
     
    -    if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
    +    if (null != msrColResolvedFilterInfo && msrColResolvedFilterInfo.getMeasure().isColumnar()) {
    --- End diff --
    
    Done. Removed.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153179
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
    @@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
         }
       }
     
    +  public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
    +    ByteBuffer bb = ByteBuffer.wrap(data);
    --- End diff --
    
    This object is unnecessary for `decimal`, so create it inside the respective case statements.
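
    A rough sketch of the suggested shape, reusing the method from this diff (hypothetical, not the final patch):

        // Hypothetical: allocate the ByteBuffer only on the branches that
        // actually read from it; the DECIMAL path works on the bytes directly.
        public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
          switch (dataType) {
            case SHORT:
            case INT:
            case LONG:
              return ByteBuffer.wrap(data).getLong();
            case DECIMAL:
              return byteToBigDecimal(data);
            default:
              return ByteBuffer.wrap(data).getDouble();
          }
        }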


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191345
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -395,6 +440,58 @@ public static DimColumnFilterInfo getNoDictionaryValKeyMemberForFilter(
       }
     
       /**
    +   * This method will get the no dictionary data based on filters and same
    +   * will be in ColumnFilterInfo
    +   *
    +   * @param evaluateResultListFinal
    +   * @param isIncludeFilter
    +   * @return ColumnFilterInfo
    +   */
    +  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
    +      List<String> evaluateResultListFinal, boolean isIncludeFilter, DataType dataType,
    +      CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
    +    List<byte[]> filterValuesList = new ArrayList<byte[]>(20);
    +    String result = null;
    +    try {
    +      int length = evaluateResultListFinal.size();
    +      for (int i = 0; i < length; i++) {
    +        result = evaluateResultListFinal.get(i);
    +        if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
    +          filterValuesList.add(new byte[0]);
    +          continue;
    +        }
    +        // TODO have to understand what method to be used for measures.
    +        // filterValuesList
    +        //  .add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType));
    +
    +        filterValuesList
    +            .add(DataTypeUtil.getMeasureByteArrayBasedOnDataTypes(result, dataType, carbonMeasure));
    +
    +      }
    +    } catch (Throwable ex) {
    +      throw new FilterUnsupportedException("Unsupported Filter condition: " + result, ex);
    +    }
    +
    +    Comparator<byte[]> filterMeasureComaparator = new Comparator<byte[]>() {
    +
    +      @Override public int compare(byte[] filterMember1, byte[] filterMember2) {
    +        // TODO Auto-generated method stub
    +        return ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterMember1, filterMember2);
    --- End diff --
    
    We are converting the String values into byte arrays and then saving them into filterValuesList. populateFilterResolvedInfo itself converts every data type to a String before passing it along; we may have to rectify that and pass the actual data type on from populateFilterResolvedInfo.
    
    However, all filter-value comparisons for measures are currently sequential, so there is no chance of getting a wrong result, i.e. in Include, RowLevelLessThan, RowLevelLessThanEqual, RowLevelGrtThanEqual and RowLevelGrtThan. Also, Range is not implemented yet for measures, which is where two filter values would have to be kept in ascending order. As a next-stage optimization we can hold the filter values in their actual data type and have a comparator for each; that way the comparison will be proper, and we won't have to convert each value to bytes and back to an object at comparison time, since we can carry the object all along.
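
    A small self-contained sketch of that next-stage idea: keep the parsed filter values as objects and pick a comparator per data type (names here are illustrative, not CarbonData APIs):

        import java.math.BigDecimal;
        import java.util.Comparator;

        // Illustrative only: compare measure filter values in their native
        // type instead of round-tripping through byte arrays.
        final class MeasureComparators {
          enum Kind { SHORT, INT, LONG, DOUBLE, DECIMAL }

          static Comparator<Object> forType(Kind kind) {
            switch (kind) {
              case SHORT:
              case INT:
              case LONG:
                return (a, b) -> Long.compare(((Number) a).longValue(), ((Number) b).longValue());
              case DECIMAL:
                return (a, b) -> ((BigDecimal) a).compareTo((BigDecimal) b);
              default: // DOUBLE
                return (a, b) -> Double.compare(((Number) a).doubleValue(), ((Number) b).doubleValue());
            }
          }
        }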


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152956
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -1042,12 +1144,17 @@ public static FilterExecuter getFilterExecuterTree(
        * @param dimension
        * @param dimColumnExecuterInfo
        */
    -  public static void prepareKeysFromSurrogates(DimColumnFilterInfo filterValues,
    +  public static void prepareKeysFromSurrogates(ColumnFilterInfo filterValues,
           SegmentProperties segmentProperties, CarbonDimension dimension,
    -      DimColumnExecuterFilterInfo dimColumnExecuterInfo) {
    -    byte[][] keysBasedOnFilter = getKeyArray(filterValues, dimension, segmentProperties);
    -    dimColumnExecuterInfo.setFilterKeys(keysBasedOnFilter);
    -
    +      DimColumnExecuterFilterInfo dimColumnExecuterInfo, CarbonMeasure measures,
    +      MeasureColumnExecuterFilterInfo msrColumnExecuterInfo) {
    +    if (null != measures) {
    --- End diff --
    
    I don't think this `if` check is required; just pass both the dimension and the measure to the method `getKeyArray`.
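
    In other words, something along these lines (hypothetical sketch, assuming getKeyArray is extended to take both column kinds, one of them null):

        // Hypothetical: no branching here; getKeyArray decides internally
        // whether the dimension or the measure drives the key conversion.
        byte[][] keysBasedOnFilter =
            getKeyArray(filterValues, dimension, measures, segmentProperties);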


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build failed with Spark 1.6. Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/252/



---

[GitHub] carbondata issue #1079: [CARBONDATA-1257] Measure Filter implementation

Posted by zzcclp <gi...@git.apache.org>.
Github user zzcclp commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Why is this PR not merged here yet, but was merged into branch-1.1 first?


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153811
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/MeasureColumnExecuterFilterInfo.java ---
    @@ -0,0 +1,30 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.core.scan.filter.executer;
    +
    +public class MeasureColumnExecuterFilterInfo {
    +
    +  byte[][] filterKeys;
    --- End diff --
    
    Use Object[]
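
    A hypothetical sketch of that change, with the filter keys decoded up front into their natural types (Long, Double, BigDecimal):

        // Illustrative only: store decoded measure filter values as objects
        // rather than raw byte arrays.
        public class MeasureColumnExecuterFilterInfo {

          private Object[] filterKeys;

          public void setFilterKeys(Object[] filterKeys) {
            this.filterKeys = filterKeys;
          }

          public Object[] getFilterKeys() {
            return filterKeys;
          }
        }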


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build failed with Spark 1.6. Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/175/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154321
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/partition/PartitionFilterUtil.java ---
    @@ -107,6 +131,12 @@ public static Comparator getComparatorByDataType(DataType dataType) {
         }
       }
     
    +  static class DecimalComparator implements Comparator<Object> {
    --- End diff --
    
    What is the use of this comparator? Please remove it if it is not used.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152265
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/statistics/ColumnPageStatsVO.java ---
    @@ -56,9 +65,7 @@ public ColumnPageStatsVO(DataType dataType) {
             nonExistValue = Double.MIN_VALUE;
             break;
           case DECIMAL:
    -        max = new BigDecimal(Double.MIN_VALUE);
    -        min = new BigDecimal(Double.MAX_VALUE);
    -        nonExistValue = new BigDecimal(Double.MIN_VALUE);
    +        this.zeroDecimal = new BigDecimal(0);
    --- End diff --
    
    Use `BigDecimal.ZERO`.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154337
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/ConditionalFilterResolverImpl.java ---
    @@ -44,16 +44,22 @@
       protected boolean isExpressionResolve;
       protected boolean isIncludeFilter;
       private DimColumnResolvedFilterInfo dimColResolvedFilterInfo;
    +  private MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo;
       private AbsoluteTableIdentifier tableIdentifier;
    +  private boolean isMeasure;
    --- End diff --
    
    It is not used anywhere; please remove it.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154256
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanFiterExecuterImpl.java ---
    @@ -77,72 +89,188 @@ private void ifDefaultValueMatchesFilter() {
               }
             }
           }
    +    } else if (!msrColEvalutorInfoList.isEmpty() && !isMeasurePresentInCurrentBlock[0]) {
    +      CarbonMeasure measure = this.msrColEvalutorInfoList.get(0).getMeasure();
    +      byte[] defaultValue = measure.getDefaultValue();
    +      if (null != defaultValue) {
    +        for (int k = 0; k < filterRangeValues.length; k++) {
    +          int maxCompare =
    +              ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
    --- End diff --
    
    Cannot use binary comparison here
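
    A hedged sketch of a type-aware alternative, assuming the getMeasureObjectFromDataType helper from this PR and a per-type comparator (illustrative, not the final fix):

        // Hypothetical: decode both sides to their numeric type first, then
        // compare with a comparator chosen for measure.getDataType().
        Object filterVal = DataTypeUtil.getMeasureObjectFromDataType(
            filterRangeValues[k], measure.getDataType());
        Object defaultVal = DataTypeUtil.getMeasureObjectFromDataType(
            defaultValue, measure.getDataType());
        int maxCompare = comparator.compare(filterVal, defaultVal);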


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191774
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
    @@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
         }
       }
     
    +  public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
    +    ByteBuffer bb = ByteBuffer.wrap(data);
    +    switch (dataType) {
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        return bb.getLong();
    +      case DECIMAL:
    +        return byteToBigDecimal(data);
    +      default:
    +        return bb.getDouble();
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  //  public static byte[] getMeasureByteArrayBasedOnDataType(String msrValue, DataType dataType,
    +  //      CarbonMeasure carbonMeasure) {
    +  //    switch (dataType) {
    +  //      case DECIMAL:
    +  //        BigDecimal bigDecimal =
    +  //            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +  //       return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +  //      case SHORT:
    +  //        return ByteUtil.toBytes((Short.parseShort(msrValue)));
    +  //      case INT:
    +  //        return ByteUtil.toBytes(Integer.parseInt(msrValue));
    +  //      case LONG:
    +  //        return ByteUtil.toBytes(Long.valueOf(msrValue));
    +  //      default:
    +  //        Double parsedValue = Double.valueOf(msrValue);
    +  //        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +  //          return null;
    +  //        }
    +  //        return ByteUtil.toBytes(parsedValue);
    +  //    }
    +  //  }
    +  public static byte[] getMeasureByteArrayBasedOnDataTypes(String msrValue, DataType dataType,
    +      CarbonMeasure carbonMeasure) {
    +    ByteBuffer b;
    +    switch (dataType) {
    +      case BYTE:
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        b = ByteBuffer.allocate(8);
    +        b.putLong(Long.valueOf(msrValue));
    +        b.flip();
    +        return b.array();
    +      case DOUBLE:
    +        b = ByteBuffer.allocate(8);
    +        b.putDouble(Double.valueOf(msrValue));
    +        b.flip();
    +        return b.array();
    +      case DECIMAL:
    +        BigDecimal bigDecimal =
    +            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +        return DataTypeUtil
    +            .bigDecimalToByte(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +      default:
    +        throw new IllegalArgumentException("Invalid data type: " + dataType);
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  public static byte[] getMeasureByteArrayBasedOnDataType(ColumnPage measurePage, int index,
    +      DataType dataType, CarbonMeasure carbonMeasure) {
    +    switch (dataType) {
    +      case DECIMAL:
    +        BigDecimal bigDecimal = new BigDecimal(measurePage.getDouble(index))
    +            .setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +        return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +      case SHORT:
    +        return ByteUtil.toBytes(measurePage.getShort(index));
    +      case INT:
    +        return ByteUtil.toBytes(measurePage.getInt(index));
    +      case LONG:
    +        return ByteUtil.toBytes(measurePage.getLong(index));
    +      default:
    +        Double parsedValue = Double.valueOf(measurePage.getDouble(index));
    +        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +          return null;
    +        }
    +        return ByteUtil.toBytes(parsedValue);
    +    }
    +  }
    +
    +  public static Object getMeasureObjectBasedOnDataType(ColumnPage measurePage, int index,
    +      DataType dataType, CarbonMeasure carbonMeasure) {
    +    //    switch (dataType) {
    +    //      case DECIMAL:
    +    //        BigDecimal bigDecimal = new BigDecimal(measurePage.getDouble(index))
    +    //            .setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +    //        return normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision());
    +    //      case SHORT:
    +    //      case INT:
    +    //      case LONG:
    +    //        return measurePage.getLong(index);
    +    //      default:
    +    //        Double parsedValue = Double.valueOf(measurePage.getDouble(index));
    +    //        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +    //          return null;
    +    //        }
    +    //        return parsedValue;
    +    //    }
    --- End diff --
    
    Done.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191629
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtThanFiterExecuterImpl.java ---
    @@ -74,80 +87,205 @@ private void ifDefaultValueMatchesFilter() {
               }
             }
           }
    +    } else if (!msrColEvalutorInfoList.isEmpty() && !isMeasurePresentInCurrentBlock[0]) {
    +      CarbonMeasure measure = this.msrColEvalutorInfoList.get(0).getMeasure();
    +      byte[] defaultValue = measure.getDefaultValue();
    +      if (null != defaultValue) {
    +        for (int k = 0; k < filterRangeValues.length; k++) {
    +          int maxCompare =
    +              ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
    --- End diff --
    
    Currently the filter keys are byte arrays, and the default values used for restructuring are in the same format. Is this change still required?


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125190787
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/metadata/schema/table/CarbonTable.java ---
    @@ -137,6 +137,8 @@
        */
       private int numberOfNoDictSortColumns;
     
    +  private int lastDimensionColumnOrdinal;
    --- End diff --
    
    Removed. 


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    
    Refer to this link for build results (access rights to CI server needed): 
    https://builds.apache.org/job/carbondata-pr-spark-1.6/715/
    Failed tests: 25 (carbondata-spark-common-test):
    - org.apache.carbondata.spark.testsuite.bigdecimal.TestBigInt.test big int data type storage for boundary values
    - org.apache.carbondata.spark.testsuite.bigdecimal.TestNullAndEmptyFields.test filter query on column is null
    - org.apache.carbondata.spark.testsuite.bigdecimal.TestNullAndEmptyFieldsUnsafe.test filter query on column is null
    - org.apache.carbondata.spark.testsuite.detailquery.ExpressionWithNullTestCase.test to check in expression with null values
    - org.apache.carbondata.spark.testsuite.detailquery.ExpressionWithNullTestCase.test to check not in expression with null values
    - org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than equal to Filter
    - org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than equal to Filter with limit
    - org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than equal to Filter with aggregation limit
    - org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than equal to Filter with decimal
    - org.apache.carbondata.spark.testsuite.filterexpr.NullMeasureValueTestCaseFilter.select ID from t3 where salary is null
    - org.apache.carbondata.spark.testsuite.iud.DeleteCarbonTableTestCase.delete data from  carbon table[where numeric condition  ]
    - org.apache.carbondata.spark.testsuite.iud.UpdateCarbonTableTestCase.update carbon [sub query, between and existing in outer condition.(Customer query ) ]
    - org.apache.carbondata.spark.testsuite.nullvalueserialization.TestNullValueSerialization.test filter query on column is null
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_hash_bigint
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_hash_float
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_hash_double
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_list_float
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_list_double
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_int
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_bigint
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_float
    - org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_double
    - org.apache.carbondata.spark.testsuite.partition.TestDataLoadingForPartitionTable.badrecords on partition column
    - org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumns.unsorted table creation, query and data loading with offheap and inmemory sort config
    - org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and data loading with offheap and inmemory sort config



---

[GitHub] carbondata issue #1079: [CARBONDATA-1257] Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Success with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/698/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125199168
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java ---
    @@ -18,56 +18,152 @@
     
     import java.io.IOException;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class ExcludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
    --- End diff --
    
    Done, removed.


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Failed  with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2832/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152873
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -395,6 +440,58 @@ public static DimColumnFilterInfo getNoDictionaryValKeyMemberForFilter(
       }
     
       /**
    +   * This method will get the no dictionary data based on filters and same
    +   * will be in ColumnFilterInfo
    +   *
    +   * @param evaluateResultListFinal
    +   * @param isIncludeFilter
    +   * @return ColumnFilterInfo
    +   */
    +  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
    +      List<String> evaluateResultListFinal, boolean isIncludeFilter, DataType dataType,
    +      CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
    +    List<byte[]> filterValuesList = new ArrayList<byte[]>(20);
    +    String result = null;
    +    try {
    +      int length = evaluateResultListFinal.size();
    +      for (int i = 0; i < length; i++) {
    +        result = evaluateResultListFinal.get(i);
    +        if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
    +          filterValuesList.add(new byte[0]);
    +          continue;
    +        }
    +        // TODO have to understand what method to be used for measures.
    +        // filterValuesList
    +        //  .add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType));
    +
    +        filterValuesList
    +            .add(DataTypeUtil.getMeasureByteArrayBasedOnDataTypes(result, dataType, carbonMeasure));
    +
    +      }
    +    } catch (Throwable ex) {
    +      throw new FilterUnsupportedException("Unsupported Filter condition: " + result, ex);
    +    }
    +
    +    Comparator<byte[]> filterMeasureComaparator = new Comparator<byte[]>() {
    +
    +      @Override public int compare(byte[] filterMember1, byte[] filterMember2) {
    +        // TODO Auto-generated method stub
    +        return ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterMember1, filterMember2);
    --- End diff --
    
    This is wrong; we cannot compare `double`, `float`, or `decimal` values through their bytes. 
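    For illustration, a minimal standalone sketch (not part of the patch) of why unsigned byte-wise
    comparison misorders signed and floating-point values; the class and method names here are made
    up for the demo:
    
        import java.nio.ByteBuffer;
        
        public class ByteOrderDemo {
          // Big-endian IEEE-754 encoding of a double, as ByteBuffer produces it.
          static byte[] toBytes(double d) {
            return ByteBuffer.allocate(8).putDouble(d).array();
          }
        
          // Unsigned lexicographic comparison, i.e. the ordering an
          // UnsafeComparer-style byte comparator yields.
          static int compareUnsigned(byte[] a, byte[] b) {
            for (int i = 0; i < a.length && i < b.length; i++) {
              int cmp = (a[i] & 0xFF) - (b[i] & 0xFF);
              if (cmp != 0) {
                return cmp;
              }
            }
            return a.length - b.length;
          }
        
          public static void main(String[] args) {
            // Numerically -1.0 < 2.0, but the sign bit makes the first byte of
            // -1.0 larger, so the byte comparator ranks it after 2.0.
            System.out.println(compareUnsigned(toBytes(-1.0), toBytes(2.0)) > 0); // true
            System.out.println(Double.compare(-1.0, 2.0) < 0);                    // true
          }
        }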


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154262
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanFiterExecuterImpl.java ---
    @@ -77,72 +89,188 @@ private void ifDefaultValueMatchesFilter() {
               }
             }
           }
    +    } else if (!msrColEvalutorInfoList.isEmpty() && !isMeasurePresentInCurrentBlock[0]) {
    +      CarbonMeasure measure = this.msrColEvalutorInfoList.get(0).getMeasure();
    +      byte[] defaultValue = measure.getDefaultValue();
    +      if (null != defaultValue) {
    +        for (int k = 0; k < filterRangeValues.length; k++) {
    +          int maxCompare =
    +              ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
    +          if (maxCompare > 0) {
    +            isDefaultValuePresentInFilter = true;
    +            break;
    +          }
    +        }
    +      }
         }
       }
     
       @Override public BitSet isScanRequired(byte[][] blockMaxValue, byte[][] blockMinValue) {
         BitSet bitSet = new BitSet(1);
    -    boolean isScanRequired =
    -        isScanRequired(blockMinValue[dimensionBlocksIndex[0]], filterRangeValues);
    +    byte[] minValue = null;
    +    boolean isScanRequired = false;
    +    if (isMeasurePresentInCurrentBlock[0] || isDimensionPresentInCurrentBlock[0]) {
    +      if (isMeasurePresentInCurrentBlock[0]) {
    +        minValue = blockMinValue[measureBlocksIndex[0] + lastDimensionColOrdinal];
    +        isScanRequired =
    +            isScanRequired(minValue, filterRangeValues, msrColEvalutorInfoList.get(0).getType());
    +      } else {
    +        minValue = blockMinValue[dimensionBlocksIndex[0]];
    +        isScanRequired = isScanRequired(minValue, filterRangeValues);
    +      }
    +    } else {
    +      isScanRequired = isDefaultValuePresentInFilter;
    +    }
         if (isScanRequired) {
           bitSet.set(0);
         }
         return bitSet;
       }
     
    +
       private boolean isScanRequired(byte[] blockMinValue, byte[][] filterValues) {
         boolean isScanRequired = false;
    -    if (isDimensionPresentInCurrentBlock[0]) {
    -      for (int k = 0; k < filterValues.length; k++) {
    -        // and filter-min should be positive
    -        int minCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMinValue);
    +    for (int k = 0; k < filterValues.length; k++) {
    +      // and filter-min should be positive
    +      int minCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMinValue);
     
    -        // if any filter applied is not in range of min and max of block
    -        // then since its a less than fiter validate whether the block
    -        // min range is less  than applied filter member
    -        if (minCompare > 0) {
    -          isScanRequired = true;
    -          break;
    -        }
    +      // if any filter applied is not in range of min and max of block
    +      // then since its a less than equal to fiter validate whether the block
    +      // min range is less than equal to applied filter member
    +      if (minCompare > 0) {
    +        isScanRequired = true;
    +        break;
           }
    -    } else {
    -      isScanRequired = isDefaultValuePresentInFilter;
         }
         return isScanRequired;
       }
     
    +  private boolean isScanRequired(byte[] minValue, byte[][] filterValue,
    +      DataType dataType) {
    +    for (int i = 0; i < filterValue.length; i++) {
    +      if (filterValue[i].length == 0 || minValue.length == 0) {
    +        return isScanRequired(minValue, filterValue);
    +      }
    +      switch (dataType) {
    --- End diff --
    
    Use the existing `DataTypeUtil` methods and a comparator here


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154119
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtThanFiterExecuterImpl.java ---
    @@ -74,80 +87,205 @@ private void ifDefaultValueMatchesFilter() {
               }
             }
           }
    +    } else if (!msrColEvalutorInfoList.isEmpty() && !isMeasurePresentInCurrentBlock[0]) {
    +      CarbonMeasure measure = this.msrColEvalutorInfoList.get(0).getMeasure();
    +      byte[] defaultValue = measure.getDefaultValue();
    +      if (null != defaultValue) {
    +        for (int k = 0; k < filterRangeValues.length; k++) {
    +          int maxCompare =
    +              ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
    +          if (maxCompare < 0) {
    +            isDefaultValuePresentInFilter = true;
    +            break;
    +          }
    +        }
    +      }
         }
       }
     
       @Override public BitSet isScanRequired(byte[][] blockMaxValue, byte[][] blockMinValue) {
         BitSet bitSet = new BitSet(1);
    -    boolean isScanRequired =
    -        isScanRequired(blockMaxValue[dimensionBlocksIndex[0]], filterRangeValues);
    +    boolean isScanRequired = false;
    +    byte[] maxValue = null;
    +    if (isMeasurePresentInCurrentBlock[0] || isDimensionPresentInCurrentBlock[0]) {
    +      if (isMeasurePresentInCurrentBlock[0]) {
    +        maxValue = blockMaxValue[measureBlocksIndex[0] + lastDimensionColOrdinal];
    +        isScanRequired =
    +            isScanRequired(maxValue, filterRangeValues, msrColEvalutorInfoList.get(0).getType());
    +      } else {
    +        maxValue = blockMaxValue[dimensionBlocksIndex[0]];
    +        isScanRequired = isScanRequired(maxValue, filterRangeValues);
    +      }
    +    } else {
    +      isScanRequired = isDefaultValuePresentInFilter;
    +    }
    +
         if (isScanRequired) {
           bitSet.set(0);
         }
         return bitSet;
       }
     
    +
       private boolean isScanRequired(byte[] blockMaxValue, byte[][] filterValues) {
         boolean isScanRequired = false;
    -    if (isDimensionPresentInCurrentBlock[0]) {
    -      for (int k = 0; k < filterValues.length; k++) {
    -        // filter value should be in range of max and min value i.e
    -        // max>filtervalue>min
    -        // so filter-max should be negative
    -        int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    -        // if any filter value is in range than this block needs to be
    -        // scanned means always less than block max range.
    -        if (maxCompare < 0) {
    -          isScanRequired = true;
    -          break;
    -        }
    +    for (int k = 0; k < filterValues.length; k++) {
    +      // filter value should be in range of max and min value i.e
    +      // max>filtervalue>min
    +      // so filter-max should be negative
    +      int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    +      // if any filter value is in range than this block needs to be
    +      // scanned less than equal to max range.
    +      if (maxCompare < 0) {
    +        isScanRequired = true;
    +        break;
           }
    -    } else {
    -      isScanRequired = isDefaultValuePresentInFilter;
         }
         return isScanRequired;
       }
     
    +  private boolean isScanRequired(byte[] maxValue, byte[][] filterValue,
    +      DataType dataType) {
    +    for (int i = 0; i < filterValue.length; i++) {
    +      if (filterValue[i].length == 0 || maxValue.length == 0) {
    +        return isScanRequired(maxValue, filterValue);
    +      }
    +      switch (dataType) {
    --- End diff --
    
    Use the existing `DataTypeUtil` methods and a comparator here to compare
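    A hedged sketch of what that could look like once the values are decoded first:
    `getMeasureObjectFromDataType` is the helper added elsewhere in this patch, while
    `compareDecoded` stands in for whatever shared comparator the refactor settles on
    (it is not an existing API):
    
        // Sketch of max-value pruning for a greater-than filter on decoded values:
        // the block must be scanned iff some filter value lies below the block max.
        private boolean isScanRequired(byte[] maxValue, byte[][] filterValues,
            DataType dataType) {
          Object max = DataTypeUtil.getMeasureObjectFromDataType(maxValue, dataType);
          for (byte[] filterValue : filterValues) {
            Object filter =
                DataTypeUtil.getMeasureObjectFromDataType(filterValue, dataType);
            if (compareDecoded(filter, max, dataType) < 0) {
              return true;
            }
          }
          return false;
        }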


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152663
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -180,9 +185,27 @@ private static FilterExecuter createFilterExecuterTree(
        * @return
        */
       private static FilterExecuter getIncludeFilterExecuter(
    -      DimColumnResolvedFilterInfo dimColResolvedFilterInfo, SegmentProperties segmentProperties) {
    -
    -    if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
    +      DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
    +      MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
    +      SegmentProperties segmentProperties) {
    +    if (null != msrColResolvedFilterInfo && msrColResolvedFilterInfo.getMeasure().isColumnar()) {
    +      CarbonMeasure measuresFromCurrentBlock = segmentProperties
    +          .getMeasureFromCurrentBlock(msrColResolvedFilterInfo.getMeasure().getColumnId());
    +      if (null != measuresFromCurrentBlock) {
    +        // update dimension and column index according to the dimension position in current block
    +        MeasureColumnResolvedFilterInfo msrColResolvedFilterInfoCopyObject =
    +            msrColResolvedFilterInfo.getCopyObject();
    +        msrColResolvedFilterInfoCopyObject.setMeasure(measuresFromCurrentBlock);
    +        msrColResolvedFilterInfoCopyObject.setColumnIndex(measuresFromCurrentBlock.getOrdinal());
    +        msrColResolvedFilterInfoCopyObject.setType(measuresFromCurrentBlock.getDataType());
    +        return new IncludeFilterExecuterImpl(null, msrColResolvedFilterInfoCopyObject,
    +            segmentProperties, true);
    +      } else {
    +        return new RestructureIncludeFilterExecutorImpl(dimColResolvedFilterInfo,
    +            msrColResolvedFilterInfo, segmentProperties, true);
    +      }
    +    }
    --- End diff --
    
    shouldn't there be an `else` here for the `dimColResolvedFilterInfo` check?
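    Something of this shape, presumably (a skeleton of the suggestion; the two build
    helpers are hypothetical names, not methods from the patch):
    
        // Skeleton of the suggested flow; buildMeasureIncludeExecuter and
        // buildDimensionIncludeExecuter are hypothetical stand-ins for the
        // bodies of the two branches in the diff above.
        private static FilterExecuter getIncludeFilterExecuter(
            DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
            MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
            SegmentProperties segmentProperties) {
          if (null != msrColResolvedFilterInfo) {
            // measure path (copy object, set ordinal/type, build executer)
            return buildMeasureIncludeExecuter(msrColResolvedFilterInfo, segmentProperties);
          } else if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
            // dimension path runs only when no measure filter info is present
            return buildDimensionIncludeExecuter(dimColResolvedFilterInfo, segmentProperties);
          }
          // remaining (non-columnar) handling unchanged
          return null;
        }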


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153317
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
    @@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
         }
       }
     
    +  public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
    +    ByteBuffer bb = ByteBuffer.wrap(data);
    +    switch (dataType) {
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        return bb.getLong();
    +      case DECIMAL:
    +        return byteToBigDecimal(data);
    +      default:
    +        return bb.getDouble();
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  //  public static byte[] getMeasureByteArrayBasedOnDataType(String msrValue, DataType dataType,
    +  //      CarbonMeasure carbonMeasure) {
    +  //    switch (dataType) {
    +  //      case DECIMAL:
    +  //        BigDecimal bigDecimal =
    +  //            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +  //       return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +  //      case SHORT:
    +  //        return ByteUtil.toBytes((Short.parseShort(msrValue)));
    +  //      case INT:
    +  //        return ByteUtil.toBytes(Integer.parseInt(msrValue));
    +  //      case LONG:
    +  //        return ByteUtil.toBytes(Long.valueOf(msrValue));
    +  //      default:
    +  //        Double parsedValue = Double.valueOf(msrValue);
    +  //        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +  //          return null;
    +  //        }
    +  //        return ByteUtil.toBytes(parsedValue);
    +  //    }
    +  //  }
    +  public static byte[] getMeasureByteArrayBasedOnDataTypes(String msrValue, DataType dataType,
    +      CarbonMeasure carbonMeasure) {
    +    ByteBuffer b;
    +    switch (dataType) {
    +      case BYTE:
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        b = ByteBuffer.allocate(8);
    +        b.putLong(Long.valueOf(msrValue));
    +        b.flip();
    +        return b.array();
    +      case DOUBLE:
    +        b = ByteBuffer.allocate(8);
    +        b.putDouble(Double.valueOf(msrValue));
    +        b.flip();
    +        return b.array();
    +      case DECIMAL:
    +        BigDecimal bigDecimal =
    +            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +        return DataTypeUtil
    +            .bigDecimalToByte(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +      default:
    +        throw new IllegalArgumentException("Invalid data type: " + dataType);
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  public static byte[] getMeasureByteArrayBasedOnDataType(ColumnPage measurePage, int index,
    +      DataType dataType, CarbonMeasure carbonMeasure) {
    +    switch (dataType) {
    +      case DECIMAL:
    +        BigDecimal bigDecimal = new BigDecimal(measurePage.getDouble(index))
    +            .setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +        return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +      case SHORT:
    +        return ByteUtil.toBytes(measurePage.getShort(index));
    +      case INT:
    +        return ByteUtil.toBytes(measurePage.getInt(index));
    +      case LONG:
    +        return ByteUtil.toBytes(measurePage.getLong(index));
    +      default:
    +        Double parsedValue = Double.valueOf(measurePage.getDouble(index));
    +        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +          return null;
    +        }
    +        return ByteUtil.toBytes(parsedValue);
    +    }
    +  }
    +
    +  public static Object getMeasureObjectBasedOnDataType(ColumnPage measurePage, int index,
    +      DataType dataType, CarbonMeasure carbonMeasure) {
    +    //    switch (dataType) {
    +    //      case DECIMAL:
    +    //        BigDecimal bigDecimal = new BigDecimal(measurePage.getDouble(index))
    +    //            .setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +    //        return normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision());
    +    //      case SHORT:
    +    //      case INT:
    +    //      case LONG:
    +    //        return measurePage.getLong(index);
    +    //      default:
    +    //        Double parsedValue = Double.valueOf(measurePage.getDouble(index));
    +    //        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +    //          return null;
    +    //        }
    +    //        return parsedValue;
    +    //    }
    --- End diff --
    
    Please remove commented code


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152881
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -395,6 +440,58 @@ public static DimColumnFilterInfo getNoDictionaryValKeyMemberForFilter(
       }
     
       /**
    +   * This method will get the no dictionary data based on filters and same
    +   * will be in ColumnFilterInfo
    +   *
    +   * @param evaluateResultListFinal
    +   * @param isIncludeFilter
    +   * @return ColumnFilterInfo
    +   */
    +  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
    +      List<String> evaluateResultListFinal, boolean isIncludeFilter, DataType dataType,
    +      CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
    +    List<byte[]> filterValuesList = new ArrayList<byte[]>(20);
    +    String result = null;
    +    try {
    +      int length = evaluateResultListFinal.size();
    +      for (int i = 0; i < length; i++) {
    +        result = evaluateResultListFinal.get(i);
    +        if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
    +          filterValuesList.add(new byte[0]);
    +          continue;
    +        }
    +        // TODO have to understand what method to be used for measures.
    +        // filterValuesList
    +        //  .add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType));
    +
    +        filterValuesList
    +            .add(DataTypeUtil.getMeasureByteArrayBasedOnDataTypes(result, dataType, carbonMeasure));
    +
    +      }
    +    } catch (Throwable ex) {
    +      throw new FilterUnsupportedException("Unsupported Filter condition: " + result, ex);
    +    }
    +
    +    Comparator<byte[]> filterMeasureComaparator = new Comparator<byte[]>() {
    +
    +      @Override public int compare(byte[] filterMember1, byte[] filterMember2) {
    +        // TODO Auto-generated method stub
    +        return ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterMember1, filterMember2);
    --- End diff --
    
    Please compare the actual values instead of comparing them after conversion to binary
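    As a sketch of that suggestion, a comparator that decodes each key to its real value
    before comparing; it reuses `getMeasureObjectFromDataType` from this patch, the class
    itself is illustrative, and the empty arrays stored for null members would still need
    a guard:
    
        import java.math.BigDecimal;
        import java.util.Comparator;
        
        import org.apache.carbondata.core.metadata.datatype.DataType;
        import org.apache.carbondata.core.util.DataTypeUtil;
        
        // Illustrative: numeric comparison on decoded values instead of a
        // lexicographic comparison on the encoded bytes.
        final class DecodedMeasureComparator implements Comparator<byte[]> {
          private final DataType dataType;
        
          DecodedMeasureComparator(DataType dataType) {
            this.dataType = dataType;
          }
        
          @Override public int compare(byte[] left, byte[] right) {
            Object l = DataTypeUtil.getMeasureObjectFromDataType(left, dataType);
            Object r = DataTypeUtil.getMeasureObjectFromDataType(right, dataType);
            switch (dataType) {
              case SHORT:
              case INT:
              case LONG:
                return Long.compare((Long) l, (Long) r);
              case DECIMAL:
                return ((BigDecimal) l).compareTo((BigDecimal) r);
              default: // DOUBLE (the helper decodes everything else as double)
                return Double.compare((Double) l, (Double) r);
            }
          }
        }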


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152368
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/metadata/schema/table/CarbonTable.java ---
    @@ -137,6 +137,8 @@
        */
       private int numberOfNoDictSortColumns;
     
    +  private int lastDimensionColumnOrdinal;
    --- End diff --
    
    Not used anywhere, please remove


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152630
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -180,9 +185,27 @@ private static FilterExecuter createFilterExecuterTree(
        * @return
        */
       private static FilterExecuter getIncludeFilterExecuter(
    -      DimColumnResolvedFilterInfo dimColResolvedFilterInfo, SegmentProperties segmentProperties) {
    -
    -    if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
    +      DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
    +      MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
    +      SegmentProperties segmentProperties) {
    +    if (null != msrColResolvedFilterInfo && msrColResolvedFilterInfo.getMeasure().isColumnar()) {
    --- End diff --
    
    I don't think `msrColResolvedFilterInfo.getMeasure().isColumnar()` is really required; it applies only to dimensions. Please remove it.


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Failed  with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2663/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191383
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/ConditionalFilterResolverImpl.java ---
    @@ -44,16 +44,22 @@
       protected boolean isExpressionResolve;
       protected boolean isIncludeFilter;
       private DimColumnResolvedFilterInfo dimColResolvedFilterInfo;
    +  private MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo;
       private AbsoluteTableIdentifier tableIdentifier;
    +  private boolean isMeasure;
    --- End diff --
    
    Done. Removed


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2834/



---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Failed with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/253/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152433
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/expression/ColumnExpression.java ---
    @@ -31,12 +32,16 @@
     
       private boolean isDimension;
     
    +  private boolean isMeasure;
    --- End diff --
    
    The same information is available inside `measure`, so please remove this field and get it from `measure` instead


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125199504
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
    @@ -152,12 +261,31 @@ private BitSet setFilterdIndexToBitSet(DimensionColumnDataChunk dimensionColumnD
     
       public BitSet isScanRequired(byte[][] blkMaxVal, byte[][] blkMinVal) {
         BitSet bitSet = new BitSet(1);
    -    byte[][] filterValues = dimColumnExecuterInfo.getFilterKeys();
    -    int columnIndex = dimColumnEvaluatorInfo.getColumnIndex();
    -    int blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex);
    +    byte[][] filterValues = null;
    +    int columnIndex = 0;
    +    int blockIndex = 0;
    +    boolean isScanRequired = false;
    +
    +    if (isDimensionPresentInCurrentBlock == true) {
    +      filterValues = dimColumnExecuterInfo.getFilterKeys();
    +      columnIndex = dimColumnEvaluatorInfo.getColumnIndex();
    +      blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex);
    +      isScanRequired =
    +          isScanRequired(blkMaxVal[blockIndex], blkMinVal[blockIndex], filterValues);
    +
    +    } else if (isMeasurePresentInCurrentBlock) {
    +      filterValues = msrColumnExecutorInfo.getFilterKeys();
    +      columnIndex = msrColumnEvaluatorInfo.getColumnIndex();
    +      // blockIndex =
    +      // segmentProperties.getDimensionOrdinalToBlockMapping().get(columnIndex) + segmentProperties
    +      //         .getLastDimensionColOrdinal();
    --- End diff --
    
    Done


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191810
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -395,6 +440,58 @@ public static DimColumnFilterInfo getNoDictionaryValKeyMemberForFilter(
       }
     
       /**
    +   * This method will get the no dictionary data based on filters and same
    +   * will be in ColumnFilterInfo
    +   *
    +   * @param evaluateResultListFinal
    +   * @param isIncludeFilter
    +   * @return ColumnFilterInfo
    +   */
    +  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
    +      List<String> evaluateResultListFinal, boolean isIncludeFilter, DataType dataType,
    +      CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
    +    List<byte[]> filterValuesList = new ArrayList<byte[]>(20);
    +    String result = null;
    +    try {
    +      int length = evaluateResultListFinal.size();
    +      for (int i = 0; i < length; i++) {
    +        result = evaluateResultListFinal.get(i);
    +        if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
    +          filterValuesList.add(new byte[0]);
    +          continue;
    +        }
    +        // TODO have to understand what method to be used for measures.
    +        // filterValuesList
    +        //  .add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType));
    +
    +        filterValuesList
    +            .add(DataTypeUtil.getMeasureByteArrayBasedOnDataTypes(result, dataType, carbonMeasure));
    --- End diff --
    
    Currently we store the measure filter keys in byte-array format. In the next optimization phase we will change this to an Object array of the respective type to avoid the conversion. 
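    For context, the byte-array format means every comparison pays an encode/decode round
    trip; a minimal pure-JDK sketch of that cost (class and values invented for the demo):
    
        import java.nio.ByteBuffer;
        
        public class FilterKeyRoundTrip {
          public static void main(String[] args) {
            // Encoded once at filter-resolution time...
            byte[] key = ByteBuffer.allocate(8).putDouble(123.45).array();
        
            // ...but decoded again for every row/page comparison.
            double decoded = ByteBuffer.wrap(key).getDouble();
            System.out.println(decoded == 123.45); // true
        
            // Keeping the keys as an Object[] of the measure's real type
            // (the planned optimization) would drop the per-comparison decode.
          }
        }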


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125190853
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -180,9 +185,27 @@ private static FilterExecuter createFilterExecuterTree(
        * @return
        */
       private static FilterExecuter getIncludeFilterExecuter(
    -      DimColumnResolvedFilterInfo dimColResolvedFilterInfo, SegmentProperties segmentProperties) {
    -
    -    if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
    +      DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
    +      MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
    +      SegmentProperties segmentProperties) {
    +    if (null != msrColResolvedFilterInfo && msrColResolvedFilterInfo.getMeasure().isColumnar()) {
    --- End diff --
    
    Done. Removed.


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Can one of the admins verify this patch?


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153193
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
    @@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
         }
       }
     
    +  public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
    +    ByteBuffer bb = ByteBuffer.wrap(data);
    +    switch (dataType) {
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        return bb.getLong();
    +      case DECIMAL:
    +        return byteToBigDecimal(data);
    +      default:
    +        return bb.getDouble();
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  //  public static byte[] getMeasureByteArrayBasedOnDataType(String msrValue, DataType dataType,
    +  //      CarbonMeasure carbonMeasure) {
    +  //    switch (dataType) {
    +  //      case DECIMAL:
    +  //        BigDecimal bigDecimal =
    +  //            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +  //       return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +  //      case SHORT:
    +  //        return ByteUtil.toBytes((Short.parseShort(msrValue)));
    +  //      case INT:
    +  //        return ByteUtil.toBytes(Integer.parseInt(msrValue));
    +  //      case LONG:
    +  //        return ByteUtil.toBytes(Long.valueOf(msrValue));
    +  //      default:
    +  //        Double parsedValue = Double.valueOf(msrValue);
    +  //        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +  //          return null;
    +  //        }
    +  //        return ByteUtil.toBytes(parsedValue);
    +  //    }
    +  //  }
    --- End diff --
    
    remove the commented code


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154233
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java ---
    @@ -91,57 +129,147 @@ private void ifDefaultValueMatchesFilter() {
     
       private boolean isScanRequired(byte[] blockMinValue, byte[][] filterValues) {
         boolean isScanRequired = false;
    -    if (isDimensionPresentInCurrentBlock[0]) {
    -      for (int k = 0; k < filterValues.length; k++) {
    -        // and filter-min should be positive
    -        int minCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMinValue);
    +    for (int k = 0; k < filterValues.length; k++) {
    +      // and filter-min should be positive
    +      int minCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMinValue);
     
    -        // if any filter applied is not in range of min and max of block
    -        // then since its a less than equal to fiter validate whether the block
    -        // min range is less than equal to applied filter member
    -        if (minCompare >= 0) {
    -          isScanRequired = true;
    -          break;
    -        }
    +      // if any filter applied is not in range of min and max of block
    +      // then since its a less than equal to fiter validate whether the block
    +      // min range is less than equal to applied filter member
    +      if (minCompare >= 0) {
    +        isScanRequired = true;
    +        break;
           }
    -    } else {
    -      isScanRequired = isDefaultValuePresentInFilter;
         }
         return isScanRequired;
       }
     
    +  private boolean isScanRequired(byte[] minValue, byte[][] filterValue,
    +      DataType dataType) {
    +    for (int i = 0; i < filterValue.length; i++) {
    +      if (filterValue[i].length == 0 || minValue.length == 0) {
    +        return isScanRequired(minValue, filterValue);
    +      }
    +      switch (dataType) {
    --- End diff --
    
    Use the existing `DataTypeUtil` methods and a comparator here


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154090
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtThanFiterExecuterImpl.java ---
    @@ -74,80 +87,205 @@ private void ifDefaultValueMatchesFilter() {
               }
             }
           }
    +    } else if (!msrColEvalutorInfoList.isEmpty() && !isMeasurePresentInCurrentBlock[0]) {
    +      CarbonMeasure measure = this.msrColEvalutorInfoList.get(0).getMeasure();
    +      byte[] defaultValue = measure.getDefaultValue();
    +      if (null != defaultValue) {
    +        for (int k = 0; k < filterRangeValues.length; k++) {
    +          int maxCompare =
    +              ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
    --- End diff --
    
    This comparison is wrong for measures; always compare the actual values. 


---

[GitHub] carbondata issue #1079: [CARBONDATA-1257] Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    LGTM


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154170
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtThanFiterExecuterImpl.java ---
    @@ -74,80 +87,205 @@ private void ifDefaultValueMatchesFilter() {
               }
             }
           }
    +    } else if (!msrColEvalutorInfoList.isEmpty() && !isMeasurePresentInCurrentBlock[0]) {
    +      CarbonMeasure measure = this.msrColEvalutorInfoList.get(0).getMeasure();
    +      byte[] defaultValue = measure.getDefaultValue();
    +      if (null != defaultValue) {
    +        for (int k = 0; k < filterRangeValues.length; k++) {
    +          int maxCompare =
    +              ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterRangeValues[k], defaultValue);
    +          if (maxCompare < 0) {
    +            isDefaultValuePresentInFilter = true;
    +            break;
    +          }
    +        }
    +      }
         }
       }
     
       @Override public BitSet isScanRequired(byte[][] blockMaxValue, byte[][] blockMinValue) {
         BitSet bitSet = new BitSet(1);
    -    boolean isScanRequired =
    -        isScanRequired(blockMaxValue[dimensionBlocksIndex[0]], filterRangeValues);
    +    boolean isScanRequired = false;
    +    byte[] maxValue = null;
    +    if (isMeasurePresentInCurrentBlock[0] || isDimensionPresentInCurrentBlock[0]) {
    +      if (isMeasurePresentInCurrentBlock[0]) {
    +        maxValue = blockMaxValue[measureBlocksIndex[0] + lastDimensionColOrdinal];
    +        isScanRequired =
    +            isScanRequired(maxValue, filterRangeValues, msrColEvalutorInfoList.get(0).getType());
    +      } else {
    +        maxValue = blockMaxValue[dimensionBlocksIndex[0]];
    +        isScanRequired = isScanRequired(maxValue, filterRangeValues);
    +      }
    +    } else {
    +      isScanRequired = isDefaultValuePresentInFilter;
    +    }
    +
         if (isScanRequired) {
           bitSet.set(0);
         }
         return bitSet;
       }
     
    +
       private boolean isScanRequired(byte[] blockMaxValue, byte[][] filterValues) {
         boolean isScanRequired = false;
    -    if (isDimensionPresentInCurrentBlock[0]) {
    -      for (int k = 0; k < filterValues.length; k++) {
    -        // filter value should be in range of max and min value i.e
    -        // max>filtervalue>min
    -        // so filter-max should be negative
    -        int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    -        // if any filter value is in range than this block needs to be
    -        // scanned means always less than block max range.
    -        if (maxCompare < 0) {
    -          isScanRequired = true;
    -          break;
    -        }
    +    for (int k = 0; k < filterValues.length; k++) {
    +      // filter value should be in range of max and min value i.e
    +      // max>filtervalue>min
    +      // so filter-max should be negative
    +      int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    +      // if any filter value is in range than this block needs to be
    +      // scanned less than equal to max range.
    +      if (maxCompare < 0) {
    +        isScanRequired = true;
    +        break;
           }
    -    } else {
    -      isScanRequired = isDefaultValuePresentInFilter;
         }
         return isScanRequired;
       }
     
    +  private boolean isScanRequired(byte[] maxValue, byte[][] filterValue,
    +      DataType dataType) {
    +    for (int i = 0; i < filterValue.length; i++) {
    +      if (filterValue[i].length == 0 || maxValue.length == 0) {
    +        return isScanRequired(maxValue, filterValue);
    +      }
    +      switch (dataType) {
    +        case DOUBLE:
    +          double maxValueDouble = ByteBuffer.wrap(maxValue).getDouble();
    +          double filterValueDouble = ByteBuffer.wrap(filterValue[i]).getDouble();
    +          if (filterValueDouble < maxValueDouble) {
    +            return true;
    +          }
    +          break;
    +        case INT:
    +        case SHORT:
    +        case LONG:
    +          long maxValueLong = ByteBuffer.wrap(maxValue).getLong();
    +          long filterValueLong = ByteBuffer.wrap(filterValue[i]).getLong();
    +          if (filterValueLong < maxValueLong) {
    +            return true;
    +          }
    +          break;
    +        case DECIMAL:
    +          BigDecimal maxDecimal = DataTypeUtil.byteToBigDecimal(maxValue);
    +          BigDecimal filterDecimal = DataTypeUtil.byteToBigDecimal(filterValue[i]);
    +          if (filterDecimal.compareTo(maxDecimal) < 0) {
    +            return true;
    +          }
    +      }
    +    }
    +    return false;
    +  }
    +
       @Override public BitSetGroup applyFilter(BlocksChunkHolder blockChunkHolder)
           throws FilterUnsupportedException, IOException {
         // select all rows if dimension does not exists in the current block
    -    if (!isDimensionPresentInCurrentBlock[0]) {
    +    if (!isDimensionPresentInCurrentBlock[0] && !isMeasurePresentInCurrentBlock[0]) {
           int numberOfRows = blockChunkHolder.getDataBlock().nodeSize();
           return FilterUtil
               .createBitSetGroupWithDefaultValue(blockChunkHolder.getDataBlock().numberOfPages(),
                   numberOfRows, true);
         }
    -    int blockIndex =
    -        segmentProperties.getDimensionOrdinalToBlockMapping().get(dimensionBlocksIndex[0]);
    -    if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
    -      blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    -          .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
    -    }
    -    DimensionRawColumnChunk rawColumnChunk =
    -        blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
    -    BitSetGroup bitSetGroup = new BitSetGroup(rawColumnChunk.getPagesCount());
    -    for (int i = 0; i < rawColumnChunk.getPagesCount(); i++) {
    -      if (rawColumnChunk.getMaxValues() != null) {
    -        if (isScanRequired(rawColumnChunk.getMaxValues()[i], this.filterRangeValues)) {
    -          int compare = ByteUtil.UnsafeComparer.INSTANCE
    -              .compareTo(filterRangeValues[0], rawColumnChunk.getMinValues()[i]);
    -          if (compare < 0) {
    -            BitSet bitSet = new BitSet(rawColumnChunk.getRowCount()[i]);
    -            bitSet.flip(0, rawColumnChunk.getRowCount()[i]);
    -            bitSetGroup.setBitSet(bitSet, i);
    -          } else {
    -            BitSet bitSet = getFilteredIndexes(rawColumnChunk.convertToDimColDataChunk(i),
    -                rawColumnChunk.getRowCount()[i]);
    -            bitSetGroup.setBitSet(bitSet, i);
    +    if (isDimensionPresentInCurrentBlock[0]) {
    +      int blockIndex =
    +          segmentProperties.getDimensionOrdinalToBlockMapping().get(dimensionBlocksIndex[0]);
    +      if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
    +        blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    +            .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
    +      }
    +      DimensionRawColumnChunk rawColumnChunk =
    +          blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
    +      BitSetGroup bitSetGroup = new BitSetGroup(rawColumnChunk.getPagesCount());
    +      for (int i = 0; i < rawColumnChunk.getPagesCount(); i++) {
    +        if (rawColumnChunk.getMaxValues() != null) {
    +          if (isScanRequired(rawColumnChunk.getMaxValues()[i], this.filterRangeValues)) {
    +            int compare = ByteUtil.UnsafeComparer.INSTANCE
    +                .compareTo(filterRangeValues[0], rawColumnChunk.getMinValues()[i]);
    +            if (compare < 0) {
    +              BitSet bitSet = new BitSet(rawColumnChunk.getRowCount()[i]);
    +              bitSet.flip(0, rawColumnChunk.getRowCount()[i]);
    +              bitSetGroup.setBitSet(bitSet, i);
    +            } else {
    +              BitSet bitSet = getFilteredIndexes(rawColumnChunk.convertToDimColDataChunk(i),
    +                  rawColumnChunk.getRowCount()[i]);
    +              bitSetGroup.setBitSet(bitSet, i);
    +            }
               }
    +        } else {
    +          BitSet bitSet = getFilteredIndexes(rawColumnChunk.convertToDimColDataChunk(i),
    +              rawColumnChunk.getRowCount()[i]);
    +          bitSetGroup.setBitSet(bitSet, i);
    +        }
    +      }
    +      return bitSetGroup;
    +    } else if (isMeasurePresentInCurrentBlock[0]) {
    +      int blockIndex =
    +          segmentProperties.getMeasuresOrdinalToBlockMapping().get(measureBlocksIndex[0]);
    +      if (null == blockChunkHolder.getMeasureRawDataChunk()[blockIndex]) {
    +        blockChunkHolder.getMeasureRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    +            .getMeasureChunk(blockChunkHolder.getFileReader(), blockIndex);
    +      }
    +      MeasureRawColumnChunk rawColumnChunk = blockChunkHolder.getMeasureRawDataChunk()[blockIndex];
    +      BitSetGroup bitSetGroup = new BitSetGroup(rawColumnChunk.getPagesCount());
    +      for (int i = 0; i < rawColumnChunk.getPagesCount(); i++) {
    +        if (rawColumnChunk.getMaxValues() != null) {
    +          if (isScanRequired(rawColumnChunk.getMaxValues()[i], this.filterRangeValues,
    +              msrColEvalutorInfoList.get(0).getType())) {
    +            int compare = ByteUtil.UnsafeComparer.INSTANCE
    --- End diff --
    
    This binary comparison cannot be used for measures; compare the actual decoded values instead.
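
    For illustration, a minimal sketch of a value-based page check, assuming a
    hypothetical decode helper in place of the real DataTypeUtil conversion
    (long-backed measures only, for brevity):

    import java.nio.ByteBuffer;

    final class MeasureMinMaxSketch {

      // Assumed decoder: long-backed measure pages store 8-byte longs.
      static long decode(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getLong();
      }

      // Value-based check: byte-wise comparison misorders negative values
      // because of the sign bit, so decode first and compare the numbers.
      static boolean filterGreaterThanMin(byte[] filterBytes, byte[] pageMinBytes) {
        return decode(filterBytes) > decode(pageMinBytes);
      }

      public static void main(String[] args) {
        byte[] min = ByteBuffer.allocate(8).putLong(-5L).array();
        byte[] filter = ByteBuffer.allocate(8).putLong(3L).array();
        // ByteUtil-style byte order would put -5 after 3; value order does not.
        System.out.println(filterGreaterThanMin(filter, min)); // true
      }
    }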


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Failed with Spark 2.1.0. Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2754/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153644
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java ---
    @@ -18,56 +18,152 @@
     
     import java.io.IOException;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class ExcludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
       /**
        * is dimension column data is natural sorted
        */
    -  private boolean isNaturalSorted;
    +  private boolean isNaturalSorted = false;
    +
       public ExcludeFilterExecuterImpl(DimColumnResolvedFilterInfo dimColEvaluatorInfo,
    -      SegmentProperties segmentProperties) {
    -    this.dimColEvaluatorInfo = dimColEvaluatorInfo;
    -    dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
    +      MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo, SegmentProperties segmentProperties,
    +      boolean isMeasure) {
         this.segmentProperties = segmentProperties;
    -    FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), segmentProperties,
    -        dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo);
    -    isNaturalSorted = dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && dimColEvaluatorInfo
    -        .getDimension().isSortColumn();
    +    if (isMeasure == false) {
    +      this.dimColEvaluatorInfo = dimColEvaluatorInfo;
    +      dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
    +
    +      FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), segmentProperties,
    +          dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo, null, null);
    +      isDimensionPresentInCurrentBlock = true;
    +      isNaturalSorted =
    +          dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && dimColEvaluatorInfo
    +              .getDimension().isSortColumn();
    +    } else {
    +      this.msrColumnEvaluatorInfo = msrColumnEvaluatorInfo;
    +      msrColumnExecutorInfo = new MeasureColumnExecuterFilterInfo();
    +      FilterUtil
    +          .prepareKeysFromSurrogates(msrColumnEvaluatorInfo.getFilterValues(), segmentProperties,
    +              null, null, msrColumnEvaluatorInfo.getMeasure(), msrColumnExecutorInfo);
    +      isMeasurePresentInCurrentBlock = true;
    +    }
    +
       }
     
       @Override public BitSetGroup applyFilter(BlocksChunkHolder blockChunkHolder) throws IOException {
    -    int blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping()
    -        .get(dimColEvaluatorInfo.getColumnIndex());
    -    if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
    -      blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    -          .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
    +    if (isDimensionPresentInCurrentBlock == true) {
    +      int blockIndex = segmentProperties.getDimensionOrdinalToBlockMapping()
    +          .get(dimColEvaluatorInfo.getColumnIndex());
    +      if (null == blockChunkHolder.getDimensionRawDataChunk()[blockIndex]) {
    +        blockChunkHolder.getDimensionRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    +            .getDimensionChunk(blockChunkHolder.getFileReader(), blockIndex);
    +      }
    +      DimensionRawColumnChunk dimensionRawColumnChunk =
    +          blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
    +      DimensionColumnDataChunk[] dimensionColumnDataChunks =
    +          dimensionRawColumnChunk.convertToDimColDataChunks();
    +      BitSetGroup bitSetGroup = new BitSetGroup(dimensionRawColumnChunk.getPagesCount());
    +      for (int i = 0; i < dimensionColumnDataChunks.length; i++) {
    +        BitSet bitSet = getFilteredIndexes(dimensionColumnDataChunks[i],
    +            dimensionRawColumnChunk.getRowCount()[i]);
    +        bitSetGroup.setBitSet(bitSet, i);
    +      }
    +
    +      return bitSetGroup;
    +    } else if (isMeasurePresentInCurrentBlock == true) {
    +      int blockIndex = segmentProperties.getMeasuresOrdinalToBlockMapping()
    +          .get(msrColumnEvaluatorInfo.getColumnIndex());
    +      if (null == blockChunkHolder.getMeasureRawDataChunk()[blockIndex]) {
    +        blockChunkHolder.getMeasureRawDataChunk()[blockIndex] = blockChunkHolder.getDataBlock()
    +            .getMeasureChunk(blockChunkHolder.getFileReader(), blockIndex);
    +      }
    +      MeasureRawColumnChunk measureRawColumnChunk =
    +          blockChunkHolder.getMeasureRawDataChunk()[blockIndex];
    +      MeasureColumnDataChunk[] measureColumnDataChunks =
    +          measureRawColumnChunk.convertToMeasureColDataChunks();
    +      BitSetGroup bitSetGroup = new BitSetGroup(measureRawColumnChunk.getPagesCount());
    +      DataType msrType = getMeasureDataType(msrColumnEvaluatorInfo);
    +      for (int i = 0; i < measureColumnDataChunks.length; i++) {
    +        BitSet bitSet =
    +            getFilteredIndexes(measureColumnDataChunks[i], measureRawColumnChunk.getRowCount()[i],
    +                msrType);
    +        bitSetGroup.setBitSet(bitSet, i);
    +      }
    +      return bitSetGroup;
         }
    -    DimensionRawColumnChunk dimensionRawColumnChunk =
    -        blockChunkHolder.getDimensionRawDataChunk()[blockIndex];
    -    DimensionColumnDataChunk[] dimensionColumnDataChunks =
    -        dimensionRawColumnChunk.convertToDimColDataChunks();
    -    BitSetGroup bitSetGroup =
    -        new BitSetGroup(dimensionRawColumnChunk.getPagesCount());
    -    for (int i = 0; i < dimensionColumnDataChunks.length; i++) {
    -      BitSet bitSet = getFilteredIndexes(dimensionColumnDataChunks[i],
    -          dimensionRawColumnChunk.getRowCount()[i]);
    -      bitSetGroup.setBitSet(bitSet, i);
    +    return null;
    +  }
    +
    +  private DataType getMeasureDataType(MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo) {
    +    switch (msrColumnEvaluatorInfo.getType()) {
    +      case SHORT:
    +        return DataType.SHORT;
    +      case INT:
    +        return DataType.INT;
    +      case LONG:
    +        return DataType.LONG;
    +      case DECIMAL:
    +        return DataType.DECIMAL;
    +      default:
    +        return DataType.DOUBLE;
         }
    +  }
     
    -    return bitSetGroup;
    +  protected BitSet getFilteredIndexes(MeasureColumnDataChunk measureColumnDataChunk,
    +      int numerOfRows, DataType msrType) {
    +    // Here the algorithm is
    +    // Get the measure values from the chunk. compare sequentially with the
    +    // the filter values. The one that matches sets it Bitset.
    +    BitSet bitSet = new BitSet(numerOfRows);
    +    bitSet.flip(0, numerOfRows);
    +    byte[][] filterValues = msrColumnExecutorInfo.getFilterKeys();
    --- End diff --
    
    Better to set the decoded objects on `msrColumnExecutorInfo` and use those objects directly, without converting per row.
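
    As a rough sketch of that suggestion (names are illustrative, not the
    actual API): decode the filter values once when the executer is built,
    keep them as objects, and let the per-row loop compare directly:

    import java.util.BitSet;
    import java.util.Comparator;

    final class MeasureExcludeSketch {

      // Hypothetical holder: filter values stored as decoded objects.
      static final class MsrFilterInfo {
        Object[] filterKeys;
      }

      // Exclude semantics: start with all rows set, clear the matches.
      static BitSet getFilteredIndexes(Object[] pageValues, MsrFilterInfo info,
          Comparator<Object> comparator) {
        BitSet bitSet = new BitSet(pageValues.length);
        bitSet.flip(0, pageValues.length);
        for (Object filterValue : info.filterKeys) {
          for (int row = 0; row < pageValues.length; row++) {
            // No byte[] round-trip per row: values were converted up front.
            if (comparator.compare(pageValues[row], filterValue) == 0) {
              bitSet.clear(row);
            }
          }
        }
        return bitSet;
      }

      public static void main(String[] args) {
        MsrFilterInfo info = new MsrFilterInfo();
        info.filterKeys = new Object[] { 10L };
        Object[] page = { 5L, 10L, 15L };
        Comparator<Object> cmp = (a, b) -> Long.compare((Long) a, (Long) b);
        System.out.println(getFilteredIndexes(page, info, cmp)); // {0, 2}
      }
    }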


---

[GitHub] carbondata pull request #1079: [CARBONDATA-1257] Measure Filter implementati...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/carbondata/pull/1079


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191772
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
    @@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
         }
       }
     
    +  public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
    +    ByteBuffer bb = ByteBuffer.wrap(data);
    +    switch (dataType) {
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        return bb.getLong();
    +      case DECIMAL:
    +        return byteToBigDecimal(data);
    +      default:
    +        return bb.getDouble();
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  //  public static byte[] getMeasureByteArrayBasedOnDataType(String msrValue, DataType dataType,
    +  //      CarbonMeasure carbonMeasure) {
    +  //    switch (dataType) {
    +  //      case DECIMAL:
    +  //        BigDecimal bigDecimal =
    +  //            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +  //       return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +  //      case SHORT:
    +  //        return ByteUtil.toBytes((Short.parseShort(msrValue)));
    +  //      case INT:
    +  //        return ByteUtil.toBytes(Integer.parseInt(msrValue));
    +  //      case LONG:
    +  //        return ByteUtil.toBytes(Long.valueOf(msrValue));
    +  //      default:
    +  //        Double parsedValue = Double.valueOf(msrValue);
    +  //        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +  //          return null;
    +  //        }
    +  //        return ByteUtil.toBytes(parsedValue);
    +  //    }
    +  //  }
    +  public static byte[] getMeasureByteArrayBasedOnDataTypes(String msrValue, DataType dataType,
    +      CarbonMeasure carbonMeasure) {
    +    ByteBuffer b;
    +    switch (dataType) {
    +      case BYTE:
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        b = ByteBuffer.allocate(8);
    +        b.putLong(Long.valueOf(msrValue));
    +        b.flip();
    +        return b.array();
    +      case DOUBLE:
    +        b = ByteBuffer.allocate(8);
    +        b.putDouble(Double.valueOf(msrValue));
    +        b.flip();
    +        return b.array();
    +      case DECIMAL:
    +        BigDecimal bigDecimal =
    +            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +        return DataTypeUtil
    +            .bigDecimalToByte(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +      default:
    +        throw new IllegalArgumentException("Invalid data type: " + dataType);
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  public static byte[] getMeasureByteArrayBasedOnDataType(ColumnPage measurePage, int index,
    --- End diff --
    
    Done


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191402
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/partition/PartitionFilterUtil.java ---
    @@ -107,6 +131,12 @@ public static Comparator getComparatorByDataType(DataType dataType) {
         }
       }
     
    +  static class DecimalComparator implements Comparator<Object> {
    --- End diff --
    
    Done. Removed
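
    For reference, BigDecimal already implements Comparable with numeric
    (scale-insensitive) ordering, so a dedicated comparator class adds
    nothing; a method reference covers the Comparator use case:

    import java.math.BigDecimal;
    import java.util.Comparator;

    final class DecimalCompareNote {
      public static void main(String[] args) {
        Comparator<BigDecimal> cmp = BigDecimal::compareTo;
        // 1.50 and 1.5 differ in scale but compare as equal values.
        System.out.println(cmp.compare(new BigDecimal("1.50"), new BigDecimal("1.5"))); // 0
      }
    }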


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Failed with Spark 1.6. Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/91/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153305
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
    @@ -113,6 +115,143 @@ public static Object getMeasureValueBasedOnDataType(String msrValue, DataType da
         }
       }
     
    +  public static Object getMeasureObjectFromDataType(byte[] data, DataType dataType) {
    +    ByteBuffer bb = ByteBuffer.wrap(data);
    +    switch (dataType) {
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        return bb.getLong();
    +      case DECIMAL:
    +        return byteToBigDecimal(data);
    +      default:
    +        return bb.getDouble();
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  //  public static byte[] getMeasureByteArrayBasedOnDataType(String msrValue, DataType dataType,
    +  //      CarbonMeasure carbonMeasure) {
    +  //    switch (dataType) {
    +  //      case DECIMAL:
    +  //        BigDecimal bigDecimal =
    +  //            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +  //       return ByteUtil.toBytes(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +  //      case SHORT:
    +  //        return ByteUtil.toBytes((Short.parseShort(msrValue)));
    +  //      case INT:
    +  //        return ByteUtil.toBytes(Integer.parseInt(msrValue));
    +  //      case LONG:
    +  //        return ByteUtil.toBytes(Long.valueOf(msrValue));
    +  //      default:
    +  //        Double parsedValue = Double.valueOf(msrValue);
    +  //        if (Double.isInfinite(parsedValue) || Double.isNaN(parsedValue)) {
    +  //          return null;
    +  //        }
    +  //        return ByteUtil.toBytes(parsedValue);
    +  //    }
    +  //  }
    +  public static byte[] getMeasureByteArrayBasedOnDataTypes(String msrValue, DataType dataType,
    +      CarbonMeasure carbonMeasure) {
    +    ByteBuffer b;
    +    switch (dataType) {
    +      case BYTE:
    +      case SHORT:
    +      case INT:
    +      case LONG:
    +        b = ByteBuffer.allocate(8);
    +        b.putLong(Long.valueOf(msrValue));
    +        b.flip();
    +        return b.array();
    +      case DOUBLE:
    +        b = ByteBuffer.allocate(8);
    +        b.putDouble(Double.valueOf(msrValue));
    +        b.flip();
    +        return b.array();
    +      case DECIMAL:
    +        BigDecimal bigDecimal =
    +            new BigDecimal(msrValue).setScale(carbonMeasure.getScale(), RoundingMode.HALF_UP);
    +        return DataTypeUtil
    +            .bigDecimalToByte(normalizeDecimalValue(bigDecimal, carbonMeasure.getPrecision()));
    +      default:
    +        throw new IllegalArgumentException("Invalid data type: " + dataType);
    +    }
    +  }
    +
    +  /**
    +   * This method will convert a given ByteArray to its specific type
    +   *
    +   * @param msrValue
    +   * @param dataType
    +   * @param carbonMeasure
    +   * @return
    +   */
    +  public static byte[] getMeasureByteArrayBasedOnDataType(ColumnPage measurePage, int index,
    --- End diff --
    
    This method is not used; please remove it.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154345
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/ConditionalFilterResolverImpl.java ---
    @@ -198,21 +237,31 @@ public AbsoluteTableIdentifier getTableIdentifier() {
        */
       public void getStartKey(SegmentProperties segmentProperties, long[] startKey,
           SortedMap<Integer, byte[]> setOfStartKeyByteArray, List<long[]> startKeyList) {
    -    FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
    -        segmentProperties, startKey, startKeyList);
    -    FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo,
    -        segmentProperties, setOfStartKeyByteArray);
    +    if (null != dimColResolvedFilterInfo) {
    +      FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
    +          segmentProperties, startKey, startKeyList);
    +      FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo, segmentProperties,
    +          setOfStartKeyByteArray);
    +    }
    +// else {
    +//      FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
    +//          segmentProperties, startKey, startKeyList);
    +//      FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo, segmentProperties,
    +//          setOfStartKeyByteArray);
    +//    }
    --- End diff --
    
    remove commented code


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125199288
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
    @@ -17,65 +17,174 @@
     package org.apache.carbondata.core.scan.filter.executer;
     
     import java.io.IOException;
    +import java.math.BigDecimal;
    +import java.nio.ByteBuffer;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.ByteUtil;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class IncludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColumnEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
    --- End diff --
    
    Done.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153772
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
    @@ -186,12 +314,60 @@ private boolean isScanRequired(byte[] blkMaxVal, byte[] blkMinVal, byte[][] filt
         return isScanRequired;
       }
     
    +  private boolean isScanRequired(byte[] maxValue, byte[] minValue, byte[][] filterValue,
    +      DataType dataType) {
    +    for (int i = 0; i < filterValue.length; i++) {
    +      if (filterValue[i].length == 0 || maxValue.length == 0 || minValue.length == 0) {
    +        return isScanRequired(maxValue, minValue, filterValue);
    +      } else {
    +        switch (dataType) {
    --- End diff --
    
    Use the existing methods of DataTypeUtil for the conversions, and use the comparator that is already used in the applyFilter method here.
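
    In sketch form, with toObject standing in for the existing DataTypeUtil
    conversion and the comparator shared with applyFilter (both stand-ins,
    for illustration only):

    import java.nio.ByteBuffer;
    import java.util.Comparator;

    final class IncludePruneSketch {

      // Stand-in for the DataTypeUtil byte[] to Object conversion.
      static Object toObject(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getLong();
      }

      // Include filter: scan the page only if some filter value lies in [min, max].
      static boolean isScanRequired(byte[] maxBytes, byte[] minBytes,
          byte[][] filterValues, Comparator<Object> comparator) {
        Object max = toObject(maxBytes);
        Object min = toObject(minBytes);
        for (byte[] filterBytes : filterValues) {
          Object filter = toObject(filterBytes);
          if (comparator.compare(filter, max) <= 0
              && comparator.compare(filter, min) >= 0) {
            return true;
          }
        }
        return false;
      }

      public static void main(String[] args) {
        Comparator<Object> cmp = (a, b) -> Long.compare((Long) a, (Long) b);
        byte[] min = ByteBuffer.allocate(8).putLong(0L).array();
        byte[] max = ByteBuffer.allocate(8).putLong(100L).array();
        byte[][] filters = { ByteBuffer.allocate(8).putLong(42L).array() };
        System.out.println(isScanRequired(max, min, filters, cmp)); // true
      }
    }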


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125154219
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeGrtrThanEquaToFilterExecuterImpl.java ---
    @@ -91,67 +131,167 @@ private void ifDefaultValueMatchesFilter() {
     
       private boolean isScanRequired(byte[] blockMaxValue, byte[][] filterValues) {
         boolean isScanRequired = false;
    -    if (isDimensionPresentInCurrentBlock[0]) {
    -      for (int k = 0; k < filterValues.length; k++) {
    -        // filter value should be in range of max and min value i.e
    -        // max>filtervalue>min
    -        // so filter-max should be negative
    -        int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    -        // if any filter value is in range than this block needs to be
    -        // scanned less than equal to max range.
    -        if (maxCompare <= 0) {
    -          isScanRequired = true;
    -          break;
    -        }
    +    for (int k = 0; k < filterValues.length; k++) {
    +      // filter value should be in range of max and min value i.e
    +      // max>filtervalue>min
    +      // so filter-max should be negative
    +      int maxCompare = ByteUtil.UnsafeComparer.INSTANCE.compareTo(filterValues[k], blockMaxValue);
    +      // if any filter value is in range than this block needs to be
    +      // scanned less than equal to max range.
    +      if (maxCompare <= 0) {
    +        isScanRequired = true;
    +        break;
           }
    -    } else {
    -      isScanRequired = isDefaultValuePresentInFilter;
         }
         return isScanRequired;
       }
     
    +  private boolean isScanRequired(byte[] maxValue, byte[][] filterValue,
    +      DataType dataType) {
    +    for (int i = 0; i < filterValue.length; i++) {
    +      if (filterValue[i].length == 0 || maxValue.length == 0) {
    +        return isScanRequired(maxValue, filterValue);
    +      }
    +      switch (dataType) {
    --- End diff --
    
    Use the existing methods of `DataTypeUtil` and the comparator here as well.
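
    For this executer the same pattern reduces to a max-only check; roughly
    (toObject again a stand-in for the DataTypeUtil conversion):

    import java.nio.ByteBuffer;
    import java.util.Comparator;

    final class GrtrThanEquaPruneSketch {

      static Object toObject(byte[] bytes) {     // assumed decoder
        return ByteBuffer.wrap(bytes).getLong();
      }

      // For a "column >= v" filter, a page qualifies only if v <= page max.
      static boolean isScanRequired(byte[] maxBytes, byte[][] filterValues,
          Comparator<Object> comparator) {
        Object max = toObject(maxBytes);
        for (byte[] filterBytes : filterValues) {
          if (comparator.compare(toObject(filterBytes), max) <= 0) {
            return true;
          }
        }
        return false;
      }

      public static void main(String[] args) {
        Comparator<Object> cmp = (a, b) -> Long.compare((Long) a, (Long) b);
        byte[] max = ByteBuffer.allocate(8).putLong(50L).array();
        byte[][] filters = { ByteBuffer.allocate(8).putLong(60L).array() };
        System.out.println(isScanRequired(max, filters, cmp)); // false
      }
    }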


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191432
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/partition/PartitionFilterUtil.java ---
    @@ -76,24 +99,25 @@ public static Comparator getComparatorByDataType(DataType dataType) {
     
       static class DoubleComparator implements Comparator<Object> {
         @Override public int compare(Object key1, Object key2) {
    -      double result = (double) key1 - (double) key2;
    -      if (result < 0) {
    +      double key1Double1 = (double)key1;
    --- End diff --
    
    There is a scenario where, if one of the values is negative, key1 - key2 won't give the proper output. Better to check with the greater-than/less-than operators.
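
    A minimal illustration of the safer shape (Double.compare does the same,
    with NaN and -0.0 handled as well):

    import java.util.Comparator;

    final class DoubleComparatorSketch {

      // Subtraction-free: explicit relational checks, per the comment above.
      static final Comparator<Object> SAFE = (key1, key2) -> {
        double a = (double) key1;
        double b = (double) key2;
        if (a < b) {
          return -1;
        } else if (a > b) {
          return 1;
        }
        return 0;
      };

      public static void main(String[] args) {
        System.out.println(SAFE.compare(-3.5d, 2.0d));   // -1
        System.out.println(Double.compare(-3.5d, 2.0d)); // -1
      }
    }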


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152824
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -395,6 +440,58 @@ public static DimColumnFilterInfo getNoDictionaryValKeyMemberForFilter(
       }
     
       /**
    +   * This method will get the no dictionary data based on filters and same
    +   * will be in ColumnFilterInfo
    +   *
    +   * @param evaluateResultListFinal
    +   * @param isIncludeFilter
    +   * @return ColumnFilterInfo
    +   */
    +  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
    +      List<String> evaluateResultListFinal, boolean isIncludeFilter, DataType dataType,
    +      CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
    +    List<byte[]> filterValuesList = new ArrayList<byte[]>(20);
    +    String result = null;
    +    try {
    +      int length = evaluateResultListFinal.size();
    +      for (int i = 0; i < length; i++) {
    +        result = evaluateResultListFinal.get(i);
    +        if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
    +          filterValuesList.add(new byte[0]);
    +          continue;
    +        }
    +        // TODO have to understand what method to be used for measures.
    +        // filterValuesList
    +        //  .add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType));
    --- End diff --
    
    Is this comment required now? Please remove it if not.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153674
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
    @@ -17,65 +17,174 @@
     package org.apache.carbondata.core.scan.filter.executer;
     
     import java.io.IOException;
    +import java.math.BigDecimal;
    +import java.nio.ByteBuffer;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.ByteUtil;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class IncludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColumnEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
    --- End diff --
    
    Remove these flags and use null checks on `dimColumnExecuterInfo` and `msrColumnExecutorInfo` instead.
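
    In sketch form, the flag-free dispatch would read along these lines
    (illustrative only):

    final class NullCheckDispatchSketch {

      static String applyFilter(Object dimColumnExecuterInfo,
          Object msrColumnExecutorInfo) {
        // Whichever info the constructor initialized decides the path;
        // no separate booleans to keep in sync.
        if (dimColumnExecuterInfo != null) {
          return "dimension path";
        } else if (msrColumnExecutorInfo != null) {
          return "measure path";
        }
        return "no-op";
      }

      public static void main(String[] args) {
        System.out.println(applyFilter(null, new Object())); // measure path
      }
    }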


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    retest this please


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125191362
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/ConditionalFilterResolverImpl.java ---
    @@ -198,21 +237,31 @@ public AbsoluteTableIdentifier getTableIdentifier() {
        */
       public void getStartKey(SegmentProperties segmentProperties, long[] startKey,
           SortedMap<Integer, byte[]> setOfStartKeyByteArray, List<long[]> startKeyList) {
    -    FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
    -        segmentProperties, startKey, startKeyList);
    -    FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo,
    -        segmentProperties, setOfStartKeyByteArray);
    +    if (null != dimColResolvedFilterInfo) {
    +      FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
    +          segmentProperties, startKey, startKeyList);
    +      FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo, segmentProperties,
    +          setOfStartKeyByteArray);
    +    }
    +// else {
    +//      FilterUtil.getStartKey(dimColResolvedFilterInfo.getDimensionResolvedFilterInstance(),
    +//          segmentProperties, startKey, startKeyList);
    +//      FilterUtil.getStartKeyForNoDictionaryDimension(dimColResolvedFilterInfo, segmentProperties,
    +//          setOfStartKeyByteArray);
    +//    }
    --- End diff --
    
    Done.


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125198825
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -1042,12 +1144,17 @@ public static FilterExecuter getFilterExecuterTree(
        * @param dimension
        * @param dimColumnExecuterInfo
        */
    -  public static void prepareKeysFromSurrogates(DimColumnFilterInfo filterValues,
    +  public static void prepareKeysFromSurrogates(ColumnFilterInfo filterValues,
           SegmentProperties segmentProperties, CarbonDimension dimension,
    -      DimColumnExecuterFilterInfo dimColumnExecuterInfo) {
    -    byte[][] keysBasedOnFilter = getKeyArray(filterValues, dimension, segmentProperties);
    -    dimColumnExecuterInfo.setFilterKeys(keysBasedOnFilter);
    -
    +      DimColumnExecuterFilterInfo dimColumnExecuterInfo, CarbonMeasure measures,
    +      MeasureColumnExecuterFilterInfo msrColumnExecuterInfo) {
    +    if (null != measures) {
    --- End diff --
    
    This if check is required in order to set the filter keys on the respective measure or dimension executer info.
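
    Condensed, the branching being defended looks like this (field names
    simplified for illustration):

    final class PrepareKeysSketch {

      static final class DimInfo { byte[][] filterKeys; }
      static final class MsrInfo { byte[][] filterKeys; }

      // Exactly one of measure/dimension is expected to be non-null, and the
      // prepared keys must land on the matching executer info.
      static void prepareKeysFromSurrogates(byte[][] keys, Object dimension,
          DimInfo dimInfo, Object measure, MsrInfo msrInfo) {
        if (measure != null) {
          msrInfo.filterKeys = keys;
        } else if (dimension != null) {
          dimInfo.filterKeys = keys;
        }
      }

      public static void main(String[] args) {
        MsrInfo msr = new MsrInfo();
        prepareKeysFromSurrogates(new byte[][] { { 1 } }, null, null,
            new Object(), msr);
        System.out.println(msr.filterKeys.length); // 1
      }
    }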


---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153654
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/IncludeFilterExecuterImpl.java ---
    @@ -17,65 +17,174 @@
     package org.apache.carbondata.core.scan.filter.executer;
     
     import java.io.IOException;
    +import java.math.BigDecimal;
    +import java.nio.ByteBuffer;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.ByteUtil;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class IncludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColumnEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
       /**
        * is dimension column data is natural sorted
        */
    -  private boolean isNaturalSorted;
    +  private boolean isNaturalSorted = false;
    --- End diff --
    
    The default is false anyway; no need to add the explicit initialization.


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Success with Spark 1.6. Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/254/



---

[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152709
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -209,9 +233,29 @@ private static FilterExecuter getIncludeFilterExecuter(
        * @return
        */
       private static FilterExecuter getExcludeFilterExecuter(
    -      DimColumnResolvedFilterInfo dimColResolvedFilterInfo, SegmentProperties segmentProperties) {
    +      DimColumnResolvedFilterInfo dimColResolvedFilterInfo,
    +      MeasureColumnResolvedFilterInfo msrColResolvedFilterInfo,
    +      SegmentProperties segmentProperties) {
     
    -    if (dimColResolvedFilterInfo.getDimension().isColumnar()) {
    +    if (null != msrColResolvedFilterInfo && msrColResolvedFilterInfo.getMeasure().isColumnar()) {
    --- End diff --
    
    Even here, the `isColumnar` check is not required.


---

[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    
    Refer to this link for build results (access rights to CI server needed): 
    https://builds.apache.org/job/carbondata-pr-spark-1.6/583/

    Failed Tests: 134
    - org.apache.carbondata:carbondata-spark (7 failures): SparkDatasourceSuite
      (read and write using CarbonContext, with compression, test overwrite,
      multiple load, query using SQLContext with and without schema) and
      DataCompactionTest (check if compaction with Updates).
    - org.apache.carbondata:carbondata-spark-common-test (127 failures),
      including: FloatDataTypeTestCase, AllDataTypesTestCaseAggregate (NOT null
      filter queries), TestBigDecimal, TestBigInt, TestNullAndEmptyFields and
      TestNullAndEmptyFieldsUnsafe (is null / is not null filter queries),
      TestGlobalSortDataLoad (LOAD with DELETE/UPDATE), TestLoadDataFrame,
      TestLoadDataWithDiffTimestampFormat, and the TestLoadDataWithHiveSyntax*
      suites (special character and decimal-in-array-of-struct cases), among
      others.
 //builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/Dictionary_String_Greater_Than/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.Dictionary String Greater Than</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/Dictionary_INT_Greater_Than/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.Dictionary INT Greater Than</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/Dictionary_String_Greater_Than_equal/'><strong>org.apache.carbondata.spark.testsuite.detailquery.Cas
 tColumnTestCase.Dictionary String Greater Than equal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/Dictionary_INT_Greater_Than_equal/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.Dictionary INT Greater Than equal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/Dictionary_String_less_Than/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.Dictionary String less Than</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/Dicti
 onary_INT_Less_Than/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.Dictionary INT Less Than</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/Dictionary_String_Less_Than_equal/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.Dictionary String Less Than equal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/Dictionary_INT_Less_Than_equal/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.Dictionary INT Less Than equal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testRepo
 rt/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/NO_Dictionary_INT_Greater_Than/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.NO Dictionary INT Greater Than</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/NO_Dictionary_INT_Greater_Than_equal/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.NO Dictionary INT Greater Than equal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/NO_Dictionary_INT_Less_Than/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.NO Dictionary INT Less Than</strong></a></li><li><a href='https://builds.apache.org/jo
 b/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/CastColumnTestCase/NO_Dictionary_INT_Less_Than_equal/'><strong>org.apache.carbondata.spark.testsuite.detailquery.CastColumnTestCase.NO Dictionary INT Less Than equal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/ExpressionWithNullTestCase/test_to_check_in_expression_with_null_values/'><strong>org.apache.carbondata.spark.testsuite.detailquery.ExpressionWithNullTestCase.test to check in expression with null values</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/ExpressionWithNullTestCase/test_to_check_not_in_expression_with_null_values/'><strong
 >org.apache.carbondata.spark.testsuite.detailquery.ExpressionWithNullTestCase.test to check not in expression with null values</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/ExpressionWithNullTestCase/test_to_check_equals_expression_with_null_values/'><strong>org.apache.carbondata.spark.testsuite.detailquery.ExpressionWithNullTestCase.test to check equals expression with null values</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/ExpressionWithNullTestCase/test_to_check_not_equals_expression_with_null_values/'><strong>org.apache.carbondata.spark.testsuite.detailquery.ExpressionWithNullTestCase.test to check not equals expression with null values</strong></a></li><li><a href='https://
 builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_for_dictionary_columns_OR/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test for dictionary columns OR</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_for_dictionary_columns_OR_AND/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test for dictionary columns OR AND</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_for_dictionary_columns_OR_AND_OR/'><strong>org.apache.carbondata.spark.testsuite.de
 tailquery.RangeFilterMyTests.test for dictionary columns OR AND OR</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_measure_in_dictionary_include/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for measure in dictionary include</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_measure_in_dictionary_include_and_condition/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for measure in dictionary include and condition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache
 .carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_measure_in_dictionary_include_or_condition/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for measure in dictionary include or condition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_measure_in_dictionary_include_or_and_condition/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for measure in dictionary include or and condition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_f
 ilter_for_measure_in_dictionary_include_or_and_condition_1/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for measure in dictionary include or and condition 1</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_measure_in_dictionary_include_boundary_values/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for measure in dictionary include boundary values</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_measure_in_dictionary_include_same_values_and/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFi
 lterMyTests.test range filter for measure in dictionary include same values and</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_measure_in_dictionary_include_same_values_or/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for measure in dictionary include same values or</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_for_dictionary_exclude_columns_or_condition/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test for dictionary exclude columns or condition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/o
 rg.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_for_dictionary_exclude_columns_or_and_condition/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test for dictionary exclude columns or and condition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_for_dictionary_exclude_columns_boundary_condition/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test for dictionary exclude columns boundary condition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_multiple_columns_and_condition/'>
 <strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for multiple columns and condition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_multiple_columns_or_condition/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for multiple columns or condition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_multiplecolumns_conditions/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for multiplecolumns conditions</strong></a></li><li><a href='https://builds.apache.org/job/carbondat
 a-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_more_columns_conditions/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for more columns conditions</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_multiple_columns_and_or_combination/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for multiple columns and or combination</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterMyTests/test_range_filter_for_more_columns_boundary
 _conditions/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterMyTests.test range filter for more columns boundary conditions</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterTestCase/Range_filter_Dictionary_outside_Boundary/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterTestCase.Range filter Dictionary outside Boundary</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterTestCase/Range_filter_Dictionary_duplicate_filters2/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterTestCase.Range filter Dictionary duplicate filters2</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6
 /583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterTestCase/Range_filter_Dictionary_duplicate_filters3/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterTestCase.Range filter Dictionary duplicate filters3</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterTestCase/Range_filter_Dictionary_multiple_filters2/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterTestCase.Range filter Dictionary multiple filters2</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterTestCase/Range_filter_with_join/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFi
 lterTestCase.Range filter with join</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.detailquery/RangeFilterTestCase/Range_filter_with_join_1/'><strong>org.apache.carbondata.spark.testsuite.detailquery.RangeFilterTestCase.Range filter with join 1</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/FilterProcessorTestCase/Is_not_null_filter/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Is not null filter</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/FilterProcessorTestCase/Greater_Than_Filter/'><strong>org.apache.car
 bondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than Filter</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/FilterProcessorTestCase/Greater_Than_Filter_with_decimal/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than Filter with decimal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/FilterProcessorTestCase/Greater_Than_equal_to_Filter/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than equal to Filter</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.test
 suite.filterexpr/FilterProcessorTestCase/Greater_Than_equal_to_Filter_with_limit/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than equal to Filter with limit</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/FilterProcessorTestCase/Greater_Than_equal_to_Filter_with_aggregation_limit/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than equal to Filter with aggregation limit</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/FilterProcessorTestCase/Greater_Than_equal_to_Filter_with_decimal/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Greater Than equal to Filter with d
 ecimal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/FilterProcessorTestCase/Include_Filter/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.FilterProcessorTestCase.Include Filter</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/NullMeasureValueTestCaseFilter/select_ID_from_t3_where_salary_is_not_null/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.NullMeasureValueTestCaseFilter.select ID from t3 where salary is not null</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.filterexpr/NullMeasureValueTestCaseFilter/select_ID_from_t3_
 where_salary_is_null/'><strong>org.apache.carbondata.spark.testsuite.filterexpr.NullMeasureValueTestCaseFilter.select ID from t3 where salary is null</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/DeleteCarbonTableTestCase/delete_data_from__carbon_table_where_clause__/'><strong>org.apache.carbondata.spark.testsuite.iud.DeleteCarbonTableTestCase.delete data from  carbon table[where clause ]</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/DeleteCarbonTableTestCase/delete_data_from__carbon_table_where_numeric_condition___/'><strong>org.apache.carbondata.spark.testsuite.iud.DeleteCarbonTableTestCase.delete data from  carbon table[where numeric condition  ]</strong></a></li><li><a href='https://builds.ap
 ache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/HorizontalCompactionTestCase/test_IUD_Horizontal_Compaction_Delete/'><strong>org.apache.carbondata.spark.testsuite.iud.HorizontalCompactionTestCase.test IUD Horizontal Compaction Delete</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/HorizontalCompactionTestCase/test_IUD_Horizontal_Compaction_Update_Delete_and_Clean/'><strong>org.apache.carbondata.spark.testsuite.iud.HorizontalCompactionTestCase.test IUD Horizontal Compaction Update Delete and Clean</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/HorizontalCompactionTestCase/test_IUD_Horizontal_Compaction_Ch
 eck_Column_Cardinality/'><strong>org.apache.carbondata.spark.testsuite.iud.HorizontalCompactionTestCase.test IUD Horizontal Compaction Check Column Cardinality</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/HorizontalCompactionTestCase/test_IUD_Horizontal_Compaction_Segment_Delete_Test_Case/'><strong>org.apache.carbondata.spark.testsuite.iud.HorizontalCompactionTestCase.test IUD Horizontal Compaction Segment Delete Test Case</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/HorizontalCompactionTestCase/test_case_full_table_delete/'><strong>org.apache.carbondata.spark.testsuite.iud.HorizontalCompactionTestCase.test case full table delete</strong></a></li><li><a href='https://builds.apache.org/job/carbo
 ndata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/UpdateCarbonTableTestCase/update_carbon__special_characters__in_value__test_parsing_logic__/'><strong>org.apache.carbondata.spark.testsuite.iud.UpdateCarbonTableTestCase.update carbon [special characters  in value- test parsing logic ]</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.iud/UpdateCarbonTableTestCase/update_carbon__sub_query__between_and_existing_in_outer_condition__Customer_query____/'><strong>org.apache.carbondata.spark.testsuite.iud.UpdateCarbonTableTestCase.update carbon [sub query, between and existing in outer condition.(Customer query ) ]</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbo
 ndata.spark.testsuite.nullvalueserialization/TestNullValueSerialization/test_filter_query_on_column_is_null/'><strong>org.apache.carbondata.spark.testsuite.nullvalueserialization.TestNullValueSerialization.test filter query on column is null</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.nullvalueserialization/TestNullValueSerialization/test_filter_query_on_column_is_not_null/'><strong>org.apache.carbondata.spark.testsuite.nullvalueserialization.TestNullValueSerialization.test filter query on column is not null</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_hash_int/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionT
 able.allTypeTable_hash_int</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_hash_bigint/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_hash_bigint</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_hash_float/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_hash_float</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTab
 le/allTypeTable_hash_double/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_hash_double</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_hash_decimal/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_hash_decimal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_list_int/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_list_int</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$car
 bondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_list_bigint/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_list_bigint</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_list_float/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_list_float</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_list_double/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_list_dou
 ble</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_list_decimal/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_list_decimal</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestDataLoadingForPartitionTable/badrecords_on_partition_column/'><strong>org.apache.carbondata.spark.testsuite.partition.TestDataLoadingForPartitionTable.badrecords on partition column</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestQueryForPartitionTable/detail_que
 ry_on_partition_table__hash_table/'><strong>org.apache.carbondata.spark.testsuite.partition.TestQueryForPartitionTable.detail query on partition table: hash table</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestQueryForPartitionTable/detail_query_on_partition_table__list_partition/'><strong>org.apache.carbondata.spark.testsuite.partition.TestQueryForPartitionTable.detail query on partition table: list partition</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumns/filter_on_sort_columns_include_no_dictionary__direct_dictionary_and_dictioanry/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumns.filter on sort_columns include no-dictionary, direct-dict
 ionary and dictioanry</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumns/unsorted_table_creation__query_data_loading_with_heap_and_safe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumns.unsorted table creation, query data loading with heap and safe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumns/unsorted_table_creation__query_and_data_loading_with_heap_and_unsafe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumns.unsorted table creation, query and data loading with heap and unsafe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark
 -1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumns/unsorted_table_creation__query_and_loading_with_heap_and_inmemory_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumns.unsorted table creation, query and loading with heap and inmemory sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumns/unsorted_table_creation__query_and_data_loading_with_offheap_and_safe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumns.unsorted table creation, query and data loading with offheap and safe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.
 spark.testsuite.sortcolumns/TestSortColumns/unsorted_table_creation__query_and_data_loading_with_offheap_and_unsafe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumns.unsorted table creation, query and data loading with offheap and unsafe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumns/unsorted_table_creation__query_and_data_loading_with_offheap_and_inmemory_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumns.unsorted table creation, query and data loading with offheap and inmemory sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/filter_on_sort_colu
 mns_include_no_dictionary__direct_dictionary_and_dictioanry/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.filter on sort_columns include no-dictionary, direct-dictionary and dictioanry</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_data_loading_with_heap_and_safe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query data loading with heap and safe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_data_loading_with_heap_and_unsafe_sort_config/'><
 strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and data loading with heap and unsafe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_loading_with_heap_and_inmemory_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and loading with heap and inmemory sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_data_loading_with_offheap_and_safe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolum
 ns.TestSortColumnsWithUnsafe.unsorted table creation, query and data loading with offheap and safe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_data_loading_with_offheap_and_unsafe_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted table creation, query and data loading with offheap and unsafe sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.sortcolumns/TestSortColumnsWithUnsafe/unsorted_table_creation__query_and_data_loading_with_offheap_and_inmemory_sort_config/'><strong>org.apache.carbondata.spark.testsuite.sortcolumns.TestSortColumnsWithUnsafe.unsorted
  table creation, query and data loading with offheap and inmemory sort config</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_range_int/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_int</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_range_bigint/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_bigint</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark
 .testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_range_float/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_float</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_range_double/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_double</strong></a></li><li><a href='https://builds.apache.org/job/carbondata-pr-spark-1.6/583/org.apache.carbondata$carbondata-spark-common-test/testReport/org.apache.carbondata.spark.testsuite.partition/TestAllDataTypeForPartitionTable/allTypeTable_range_decimal/'><strong>org.apache.carbondata.spark.testsuite.partition.TestAllDataTypeForPartitionTable.allTypeTable_range_decimal</strong></a></li></ul>




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125190813
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/expression/ColumnExpression.java ---
    @@ -31,12 +32,16 @@
     
       private boolean isDimension;
     
    +  private boolean isMeasure;
    --- End diff --
    
    Just kept it in order to stay in sync with the dimension implementation.
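
    For reference, a minimal sketch of that parallel handling (the accessor
    names are assumed for illustration, not copied from the PR):

        // Sketch: ColumnExpression carries an isMeasure flag alongside the
        // existing isDimension flag, so measure columns can flow through
        // filter resolution the same way dimension columns already do.
        public class ColumnExpression {
          private boolean isDimension;
          private boolean isMeasure;

          public boolean isMeasure() {
            return isMeasure;
          }

          public void setMeasure(boolean isMeasure) {
            this.isMeasure = isMeasure;
          }
        }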



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125152979
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/executer/ExcludeFilterExecuterImpl.java ---
    @@ -18,56 +18,152 @@
     
     import java.io.IOException;
     import java.util.BitSet;
    +import java.util.Comparator;
     
     import org.apache.carbondata.core.datastore.block.SegmentProperties;
     import org.apache.carbondata.core.datastore.chunk.DimensionColumnDataChunk;
    +import org.apache.carbondata.core.datastore.chunk.MeasureColumnDataChunk;
     import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk;
    +import org.apache.carbondata.core.datastore.chunk.impl.MeasureRawColumnChunk;
    +import org.apache.carbondata.core.metadata.datatype.DataType;
     import org.apache.carbondata.core.scan.filter.FilterUtil;
    +import org.apache.carbondata.core.scan.filter.partition.PartitionFilterUtil;
     import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo;
    +import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo;
     import org.apache.carbondata.core.scan.processor.BlocksChunkHolder;
     import org.apache.carbondata.core.util.BitSetGroup;
     import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.core.util.DataTypeUtil;
     
     public class ExcludeFilterExecuterImpl implements FilterExecuter {
     
       protected DimColumnResolvedFilterInfo dimColEvaluatorInfo;
       protected DimColumnExecuterFilterInfo dimColumnExecuterInfo;
    +  protected MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo;
    +  protected MeasureColumnExecuterFilterInfo msrColumnExecutorInfo;
       protected SegmentProperties segmentProperties;
    +  protected boolean isDimensionPresentInCurrentBlock = false;
    +  protected boolean isMeasurePresentInCurrentBlock = false;
       /**
        * is dimension column data is natural sorted
        */
    -  private boolean isNaturalSorted;
    +  private boolean isNaturalSorted = false;
    +
       public ExcludeFilterExecuterImpl(DimColumnResolvedFilterInfo dimColEvaluatorInfo,
    -      SegmentProperties segmentProperties) {
    -    this.dimColEvaluatorInfo = dimColEvaluatorInfo;
    -    dimColumnExecuterInfo = new DimColumnExecuterFilterInfo();
    +      MeasureColumnResolvedFilterInfo msrColumnEvaluatorInfo, SegmentProperties segmentProperties,
    +      boolean isMeasure) {
         this.segmentProperties = segmentProperties;
    -    FilterUtil.prepareKeysFromSurrogates(dimColEvaluatorInfo.getFilterValues(), segmentProperties,
    -        dimColEvaluatorInfo.getDimension(), dimColumnExecuterInfo);
    -    isNaturalSorted = dimColEvaluatorInfo.getDimension().isUseInvertedIndex() && dimColEvaluatorInfo
    -        .getDimension().isSortColumn();
    +    if (isMeasure == false) {
    --- End diff --
    
    just use `!isMeasure`
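
    In other words, the constructor branch would read (a sketch of the
    suggested style fix only; the initialization bodies are unchanged and
    abbreviated here as comments):

        // Equivalent to `isMeasure == false`, just more idiomatic:
        if (!isMeasure) {
          // ... existing dimension filter setup, unchanged ...
        } else {
          // ... existing measure filter setup, unchanged ...
        }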



[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125153423
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -395,6 +440,58 @@ public static DimColumnFilterInfo getNoDictionaryValKeyMemberForFilter(
       }
     
       /**
    +   * This method will get the no dictionary data based on the filters, and
    +   * the same will be set in ColumnFilterInfo
    +   *
    +   * @param evaluateResultListFinal
    +   * @param isIncludeFilter
    +   * @return ColumnFilterInfo
    +   */
    +  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
    +      List<String> evaluateResultListFinal, boolean isIncludeFilter, DataType dataType,
    +      CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
    +    List<byte[]> filterValuesList = new ArrayList<byte[]>(20);
    +    String result = null;
    +    try {
    +      int length = evaluateResultListFinal.size();
    +      for (int i = 0; i < length; i++) {
    +        result = evaluateResultListFinal.get(i);
    +        if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
    +          filterValuesList.add(new byte[0]);
    +          continue;
    +        }
    +        // TODO have to understand what method to be used for measures.
    +        // filterValuesList
    +        //  .add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType));
    +
    +        filterValuesList
    +            .add(DataTypeUtil.getMeasureByteArrayBasedOnDataTypes(result, dataType, carbonMeasure));
    --- End diff --
    
    I believe we should keep the filter values as objects for measures instead of converting them to binary. That will avoid conversions later.
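
    A minimal sketch of that idea (field and method names are illustrative,
    not the PR's actual code): the executor-side filter info would hold typed
    values, and evaluation would compare them through a data-type-aware
    comparator instead of a byte[] round trip.

        // Sketch: store measure filter values as typed objects so that
        // filter evaluation can compare them directly against decoded
        // measure values, with no byte[] conversion on either side.
        public class MeasureColumnExecuterFilterInfo {
          private Object[] filterKeys;

          public void setFilterKeys(Object[] filterKeys) {
            this.filterKeys = filterKeys;
          }

          public Object[] getFilterKeys() {
            return filterKeys;
          }
        }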



[GitHub] carbondata issue #1079: [CARBONDATA-1257] Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/3294/




[GitHub] carbondata issue #1079: [WIP]Measure Filter implementation

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1079
  
    Build Failed with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/206/




[GitHub] carbondata pull request #1079: [WIP]Measure Filter implementation

Posted by sounakr <gi...@git.apache.org>.
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1079#discussion_r125190940
  
    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
    @@ -395,6 +440,58 @@ public static DimColumnFilterInfo getNoDictionaryValKeyMemberForFilter(
       }
     
       /**
    +   * This method will get the no dictionary data based on the filters, and
    +   * the same will be set in ColumnFilterInfo
    +   *
    +   * @param evaluateResultListFinal
    +   * @param isIncludeFilter
    +   * @return ColumnFilterInfo
    +   */
    +  public static ColumnFilterInfo getMeasureValKeyMemberForFilter(
    +      List<String> evaluateResultListFinal, boolean isIncludeFilter, DataType dataType,
    +      CarbonMeasure carbonMeasure) throws FilterUnsupportedException {
    +    List<byte[]> filterValuesList = new ArrayList<byte[]>(20);
    +    String result = null;
    +    try {
    +      int length = evaluateResultListFinal.size();
    +      for (int i = 0; i < length; i++) {
    +        result = evaluateResultListFinal.get(i);
    +        if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) {
    +          filterValuesList.add(new byte[0]);
    +          continue;
    +        }
    +        // TODO have to understand what method to be used for measures.
    +        // filterValuesList
    +        //  .add(DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType));
    --- End diff --
    
    Done. Removed.

