Posted to commits@metamodel.apache.org by ka...@apache.org on 2017/07/27 01:49:24 UTC

[1/7] metamodel git commit: guava is now available in core

Repository: metamodel
Updated Branches:
  refs/heads/master d841acc17 -> b08aec1dc


guava is now available in core


Project: http://git-wip-us.apache.org/repos/asf/metamodel/repo
Commit: http://git-wip-us.apache.org/repos/asf/metamodel/commit/65574d34
Tree: http://git-wip-us.apache.org/repos/asf/metamodel/tree/65574d34
Diff: http://git-wip-us.apache.org/repos/asf/metamodel/diff/65574d34

Branch: refs/heads/master
Commit: 65574d345bec251428868b24dc0f9322b3c93e14
Parents: 6f9e094
Author: Jörg Unbehauen <jo...@unbehauen.net>
Authored: Thu Jul 20 17:01:02 2017 +0200
Committer: Jörg Unbehauen <jo...@unbehauen.net>
Committed: Fri Jul 21 23:25:37 2017 +0200

----------------------------------------------------------------------
 core/pom.xml | 4 ++++
 1 file changed, 4 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/metamodel/blob/65574d34/core/pom.xml
----------------------------------------------------------------------
diff --git a/core/pom.xml b/core/pom.xml
index 3af58c9..aeee377 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -32,6 +32,10 @@ under the License.
 			<artifactId>slf4j-api</artifactId>
 		</dependency>
 		<dependency>
+			<groupId>com.google.guava</groupId>
+			<artifactId>guava</artifactId>
+		</dependency>
+		<dependency>
 			<groupId>org.slf4j</groupId>
 			<artifactId>slf4j-nop</artifactId>
 			<scope>test</scope>

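The dependency added above carries no `<version>` element. In a multi-module Maven build such as this one, the version is normally resolved from a `dependencyManagement` section in the parent POM. A minimal sketch of that arrangement (the version number below is illustrative only and is not taken from this commit):

```xml
<!-- parent pom.xml (sketch): the parent manages the version centrally,
     so child modules like core/pom.xml can omit it, as in the diff above -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>21.0</version> <!-- illustrative version only -->
    </dependency>
  </dependencies>
</dependencyManagement>
```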

[4/7] metamodel git commit: decreasing test size to accommodate travis-ci (hopefully)

Posted by ka...@apache.org.
decreasing test size to accommodate travis-ci (hopefully)


Project: http://git-wip-us.apache.org/repos/asf/metamodel/repo
Commit: http://git-wip-us.apache.org/repos/asf/metamodel/commit/88a18289
Tree: http://git-wip-us.apache.org/repos/asf/metamodel/tree/88a18289
Diff: http://git-wip-us.apache.org/repos/asf/metamodel/diff/88a18289

Branch: refs/heads/master
Commit: 88a18289532c0477f5eea086b19c94955fc40b36
Parents: ee2b916
Author: Jörg Unbehauen <jo...@unbehauen.net>
Authored: Sat Jul 22 00:34:50 2017 +0200
Committer: Jörg Unbehauen <jo...@unbehauen.net>
Committed: Sat Jul 22 00:34:50 2017 +0200

----------------------------------------------------------------------
 core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/metamodel/blob/88a18289/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
----------------------------------------------------------------------
diff --git a/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java b/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
index 50591ca..705c6f1 100644
--- a/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
+++ b/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
@@ -213,7 +213,7 @@ public class MetaModelHelperTest extends MetaModelTestCase {
     
     
     
-    private int bigDataSetSize = 10000;
+    private int bigDataSetSize = 3000;
 
     /**
      * 
@@ -380,7 +380,7 @@ public class MetaModelHelperTest extends MetaModelTestCase {
         count++;
       }
       
-      assertTrue(count == 10000);
+      assertTrue(count == bigDataSetSize);
       
       
     }

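The reduction from 10000 to 3000 rows is effective because the client-side join exercised by this test is a nested loop: its cost grows with the square of the dataset size, so the change cuts the comparison count by roughly 11x. A small sketch of that arithmetic (the cost model here is an assumption based on the join strategy in this commit series, not measured numbers):

```java
// Why shrinking bigDataSetSize helps on Travis CI: a nested loop join
// performs one comparison per (outer row, inner row) pair, i.e. n * n
// comparisons when both sides have n rows.
public class TestSizeSketch {
    public static long comparisons(long n) {
        return n * n; // one inner iteration per pair of rows
    }

    public static void main(String[] args) {
        System.out.println(comparisons(10000)); // old test size
        System.out.println(comparisons(3000));  // new test size
    }
}
```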

[6/7] metamodel git commit: METAMODEL-1144: Minor formatting and warning-elimination

Posted by ka...@apache.org.
METAMODEL-1144: Minor formatting and warning-elimination

Fixes #148

Project: http://git-wip-us.apache.org/repos/asf/metamodel/repo
Commit: http://git-wip-us.apache.org/repos/asf/metamodel/commit/ddfa13e6
Tree: http://git-wip-us.apache.org/repos/asf/metamodel/tree/ddfa13e6
Diff: http://git-wip-us.apache.org/repos/asf/metamodel/diff/ddfa13e6

Branch: refs/heads/master
Commit: ddfa13e6c73fcb03a4314994e2196a825abae4a3
Parents: dda13e2
Author: Kasper Sørensen <i....@gmail.com>
Authored: Wed Jul 26 18:46:36 2017 -0700
Committer: Kasper Sørensen <i....@gmail.com>
Committed: Wed Jul 26 18:48:02 2017 -0700

----------------------------------------------------------------------
 CHANGES.md                                      |   1 +
 .../org/apache/metamodel/MetaModelHelper.java   |  48 ++++----
 .../apache/metamodel/MetaModelHelperTest.java   | 109 +++++++++----------
 .../metamodel/jdbc/MultiJDBCDataSetTest.java    |  91 ++++++----------
 4 files changed, 105 insertions(+), 144 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/metamodel/blob/ddfa13e6/CHANGES.md
----------------------------------------------------------------------
diff --git a/CHANGES.md b/CHANGES.md
index a468ab1..2a5aa51 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -7,6 +7,7 @@
  * [METAMODEL-1139] - Employed Java 8 functional types (java.util.function) in favor of (now deprecated) Ref, Action, Func. 
  * [METAMODEL-1140] - Allowed SalesforceDataContext without a security token.
  * [METAMODEL-1141] - Added RFC 4180 compliant CSV parsing.
+ * [METAMODEL-1144] - Optimized evaluation of conditional client-side JOIN statements.
 
 ### Apache MetaModel 4.6.0
 

http://git-wip-us.apache.org/repos/asf/metamodel/blob/ddfa13e6/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
----------------------------------------------------------------------
diff --git a/core/src/main/java/org/apache/metamodel/MetaModelHelper.java b/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
index f30a08a..a2681da 100644
--- a/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
+++ b/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
@@ -175,7 +175,7 @@ public final class MetaModelHelper {
     }
 
     public static DataSet getCarthesianProduct(DataSet[] fromDataSets, Iterable<FilterItem> whereItems) {
-        assert(fromDataSets.length>0);
+        assert (fromDataSets.length > 0);
         // First check if carthesian product is even nescesary
         if (fromDataSets.length == 1) {
             return getFiltered(fromDataSets[0], whereItems);
@@ -185,76 +185,75 @@ public final class MetaModelHelper {
 
         DataSet joined = dsIter.next();
 
-        while(dsIter.hasNext()){
-            joined = nestedLoopJoin(
-                    dsIter.next(),
-                    joined,
-                    (whereItems));
+        while (dsIter.hasNext()) {
+            joined = nestedLoopJoin(dsIter.next(), joined, (whereItems));
 
         }
 
         return joined;
 
-
     }
 
     /**
-     * Executes a simple nested loop join. The innerLoopDs will be copied in an in-memory dataset.
+     * Executes a simple nested loop join. The innerLoopDs will be copied in an
+     * in-memory dataset.
      *
      */
-    public static InMemoryDataSet nestedLoopJoin(DataSet innerLoopDs,  DataSet outerLoopDs, Iterable<FilterItem> filtersIterable){
+    public static InMemoryDataSet nestedLoopJoin(DataSet innerLoopDs, DataSet outerLoopDs,
+            Iterable<FilterItem> filtersIterable) {
 
         List<FilterItem> filters = new ArrayList<>();
-        for(FilterItem fi : filtersIterable){
+        for (FilterItem fi : filtersIterable) {
             filters.add(fi);
         }
         List<Row> innerRows = innerLoopDs.toRows();
 
-
-        List<SelectItem> allItems = new ArrayList<>(Arrays.asList(outerLoopDs.getSelectItems())) ;
+        List<SelectItem> allItems = new ArrayList<>(Arrays.asList(outerLoopDs.getSelectItems()));
         allItems.addAll(Arrays.asList(innerLoopDs.getSelectItems()));
 
-        Set<FilterItem> applicableFilters = applicableFilters(filters,allItems);
+        Set<FilterItem> applicableFilters = applicableFilters(filters, allItems);
 
         DataSetHeader jointHeader = new CachingDataSetHeader(allItems);
 
         List<Row> resultRows = new ArrayList<>();
-        for(Row outerRow: outerLoopDs){
-            for(Row innerRow: innerRows){
+        for (Row outerRow : outerLoopDs) {
+            for (Row innerRow : innerRows) {
 
                 Object[] joinedRowObjects = new Object[outerRow.getValues().length + innerRow.getValues().length];
 
-                System.arraycopy(outerRow.getValues(),0,joinedRowObjects,0,outerRow.getValues().length);
-                System.arraycopy(innerRow.getValues(),0,joinedRowObjects,outerRow.getValues().length,innerRow.getValues().length);
-
-                Row joinedRow =  new DefaultRow(jointHeader,joinedRowObjects);
+                System.arraycopy(outerRow.getValues(), 0, joinedRowObjects, 0, outerRow.getValues().length);
+                System.arraycopy(innerRow.getValues(), 0, joinedRowObjects, outerRow.getValues().length, innerRow
+                        .getValues().length);
 
+                Row joinedRow = new DefaultRow(jointHeader, joinedRowObjects);
 
-                if(applicableFilters.isEmpty()|| applicableFilters.stream().allMatch(fi -> fi.accept(joinedRow))){
+                if (applicableFilters.isEmpty() || applicableFilters.stream().allMatch(fi -> fi.accept(joinedRow))) {
                     resultRows.add(joinedRow);
                 }
             }
         }
 
-        return new InMemoryDataSet(jointHeader,resultRows);
+        return new InMemoryDataSet(jointHeader, resultRows);
     }
 
     /**
      * Filters the FilterItems such that only the FilterItems are returned,
      * which contain SelectItems that are contained in selectItemList
+     * 
      * @param filters
      * @param selectItemList
      * @return
      */
-    private static  Set<FilterItem> applicableFilters(Collection<FilterItem> filters, Collection<SelectItem> selectItemList) {
+    private static Set<FilterItem> applicableFilters(Collection<FilterItem> filters,
+            Collection<SelectItem> selectItemList) {
 
         Set<SelectItem> items = new HashSet<SelectItem>(selectItemList);
 
-        return filters.stream().filter( fi -> {
+        return filters.stream().filter(fi -> {
             Collection<SelectItem> fiSelectItems = new ArrayList<>();
             fiSelectItems.add(fi.getSelectItem());
             Object operand = fi.getOperand();
-            if(operand instanceof SelectItem){
+            if (operand instanceof SelectItem) {
                 fiSelectItems.add((SelectItem) operand);
             }
 
@@ -263,7 +262,6 @@ public final class MetaModelHelper {
         }).collect(Collectors.toSet());
     }
 
-
     public static DataSet getFiltered(DataSet dataSet, Iterable<FilterItem> filterItems) {
         List<IRowFilter> filters = CollectionUtils.map(filterItems, filterItem -> {
             return filterItem;

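The `nestedLoopJoin` method reformatted above follows a classic shape: materialize the inner dataset once, then pair every outer row with every inner row, concatenating the two value arrays and keeping pairs that pass all applicable filters. A self-contained sketch of that core loop, using plain `Object[]` rows and a predicate in place of MetaModel's `Row`/`FilterItem` types (the names here are stand-ins, not MetaModel API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiPredicate;

// Minimal sketch of the nested loop join strategy: for every outer row,
// scan the (already materialized) inner rows, concatenate matching pairs.
public class NestedLoopJoinSketch {

    public static List<Object[]> join(List<Object[]> outer, List<Object[]> inner,
            BiPredicate<Object[], Object[]> predicate) {
        List<Object[]> result = new ArrayList<>();
        for (Object[] outerRow : outer) {
            for (Object[] innerRow : inner) {
                if (predicate.test(outerRow, innerRow)) {
                    // concatenate the two rows, outer values first,
                    // mirroring the System.arraycopy calls in the diff above
                    Object[] joined = new Object[outerRow.length + innerRow.length];
                    System.arraycopy(outerRow, 0, joined, 0, outerRow.length);
                    System.arraycopy(innerRow, 0, joined, outerRow.length, innerRow.length);
                    result.add(joined);
                }
            }
        }
        return result;
    }
}
```

As in the real implementation, only the inner side needs to fit in memory; the outer side is streamed a row at a time.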
http://git-wip-us.apache.org/repos/asf/metamodel/blob/ddfa13e6/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
----------------------------------------------------------------------
diff --git a/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java b/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
index 705c6f1..a84cef1 100644
--- a/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
+++ b/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
@@ -116,14 +116,17 @@ public class MetaModelHelperTest extends MetaModelTestCase {
     public void testSimpleCarthesianProduct() throws Exception {
         DataSet dataSet = MetaModelHelper.getCarthesianProduct(createDataSet1(), createDataSet2());
         List<String> results = new ArrayList<String>();
-        
-        while(dataSet.next()){
-          results.add(dataSet.getRow().toString());
+
+        while (dataSet.next()) {
+            results.add(dataSet.getRow().toString());
         }
         assertEquals(2, dataSet.getSelectItems().length);
         assertEquals(9, results.size());
         assertTrue(results.contains("Row[values=[f, b]]"));
         assertTrue(results.contains("Row[values=[f, a]]"));
+        assertTrue(results.contains("Row[values=[f, r]]"));
+        assertTrue(results.contains("Row[values=[o, b]]"));
+        assertTrue(results.contains("Row[values=[o, a]]"));
         assertTrue(results.contains("Row[values=[o, r]]"));
     }
 
@@ -182,8 +185,8 @@ public class MetaModelHelperTest extends MetaModelTestCase {
         data1.add(new Object[] { "f" });
         data1.add(new Object[] { "o" });
         data1.add(new Object[] { "o" });
-        DataSet dataSet1 = createDataSet(
-                new SelectItem[] { new SelectItem(new MutableColumn("foo", ColumnType.VARCHAR)) }, data1);
+        DataSet dataSet1 = createDataSet(new SelectItem[] { new SelectItem(new MutableColumn("foo",
+                ColumnType.VARCHAR)) }, data1);
         return dataSet1;
     }
 
@@ -200,8 +203,8 @@ public class MetaModelHelperTest extends MetaModelTestCase {
         List<Object[]> data3 = new ArrayList<Object[]>();
         data3.add(new Object[] { "w00p", true });
         data3.add(new Object[] { "yippie", false });
-        DataSet dataSet3 = createDataSet(new SelectItem[] { new SelectItem("expression", "e"),
-                new SelectItem("webish?", "w") }, data3);
+        DataSet dataSet3 = createDataSet(new SelectItem[] { new SelectItem("expression", "e"), new SelectItem("webish?",
+                "w") }, data3);
         return dataSet3;
     }
 
@@ -210,9 +213,7 @@ public class MetaModelHelperTest extends MetaModelTestCase {
         DataSet dataSet4 = createDataSet(new SelectItem[] { new SelectItem("abc", "abc") }, data4);
         return dataSet4;
     }
-    
-    
-    
+
     private int bigDataSetSize = 3000;
 
     /**
@@ -220,42 +221,33 @@ public class MetaModelHelperTest extends MetaModelTestCase {
      * @return a big dataset, mocking an employee table
      */
     private DataSet createDataSet5() {
-      List<Object[]> data5 = new ArrayList<Object[]>();
-      
-      
-      for(int i = 0; i<bigDataSetSize;i++){
-        data5.add(new Object[]{i,"Person_" + i, bigDataSetSize-(i+1) });
-      }
-      
-      DataSet dataSet5 = createDataSet(
-          new SelectItem[] { 
-              new SelectItem(new MutableColumn("nr", ColumnType.BIGINT)),
-              new SelectItem(new MutableColumn("name", ColumnType.STRING)),
-              new SelectItem(new MutableColumn("dnr", ColumnType.BIGINT))
-          }, 
-          data5);
-      return dataSet5;
-  }
-    
+        List<Object[]> data5 = new ArrayList<Object[]>();
+
+        for (int i = 0; i < bigDataSetSize; i++) {
+            data5.add(new Object[] { i, "Person_" + i, bigDataSetSize - (i + 1) });
+        }
+
+        DataSet dataSet5 = createDataSet(new SelectItem[] { new SelectItem(new MutableColumn("nr", ColumnType.BIGINT)),
+                new SelectItem(new MutableColumn("name", ColumnType.STRING)), new SelectItem(new MutableColumn("dnr",
+                        ColumnType.BIGINT)) }, data5);
+        return dataSet5;
+    }
+
     /**
      * 
      * @return a big dataset, mocking an department table
      */
     private DataSet createDataSet6() {
-      List<Object[]> data6 = new ArrayList<Object[]>();
-
-      for(int i = 0; i<bigDataSetSize;i++){
-        data6.add(new Object[]{i,"Department_" + i });
-      }
-      
-      DataSet dataSet6 = createDataSet(
-          new SelectItem[] { 
-              new SelectItem(new MutableColumn("nr", ColumnType.BIGINT)),
-              new SelectItem(new MutableColumn("name", ColumnType.STRING)),
-          }, 
-          data6);
-      return dataSet6;
-  }
+        List<Object[]> data6 = new ArrayList<Object[]>();
+
+        for (int i = 0; i < bigDataSetSize; i++) {
+            data6.add(new Object[] { i, "Department_" + i });
+        }
+
+        DataSet dataSet6 = createDataSet(new SelectItem[] { new SelectItem(new MutableColumn("nr", ColumnType.BIGINT)),
+                new SelectItem(new MutableColumn("name", ColumnType.STRING)), }, data6);
+        return dataSet6;
+    }
 
     public void testGetTables() throws Exception {
         MutableTable table1 = new MutableTable("table1");
@@ -365,23 +357,22 @@ public class MetaModelHelperTest extends MetaModelTestCase {
         assertEquals("Row[values=[1, 2, null]]", joinedDs.getRow().toString());
         assertFalse(joinedDs.next());
     }
-    
-    
-    public void testCarthesianProductScalability(){
-      
-      DataSet employees = createDataSet5();
-      DataSet departmens = createDataSet6();
-      
-      FilterItem fi = new FilterItem(employees.getSelectItems()[2], OperatorType.EQUALS_TO,departmens.getSelectItems()[0]);
-      
-      DataSet joined =  MetaModelHelper.getCarthesianProduct(new DataSet[]{employees,departmens}, fi);
-      int count = 0; 
-      while(joined.next()){
-        count++;
-      }
-      
-      assertTrue(count == bigDataSetSize);
-      
-      
+
+    public void testCarthesianProductScalability() {
+
+        DataSet employees = createDataSet5();
+        DataSet departmens = createDataSet6();
+
+        FilterItem fi = new FilterItem(employees.getSelectItems()[2], OperatorType.EQUALS_TO, departmens
+                .getSelectItems()[0]);
+
+        DataSet joined = MetaModelHelper.getCarthesianProduct(new DataSet[] { employees, departmens }, fi);
+        int count = 0;
+        while (joined.next()) {
+            count++;
+        }
+
+        assertTrue(count == bigDataSetSize);
+
     }
 }

http://git-wip-us.apache.org/repos/asf/metamodel/blob/ddfa13e6/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java
----------------------------------------------------------------------
diff --git a/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java b/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java
index d910362..0b60f95 100644
--- a/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java
+++ b/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java
@@ -36,11 +36,11 @@ import java.sql.DriverManager;
 import java.util.concurrent.TimeUnit;
 
 /**
- * A test case using two simple h2 in memory databases for executing single query over both databases.
+ * A test case using two simple h2 in memory databases for executing single
+ * query over both databases.
  */
 public class MultiJDBCDataSetTest {
 
-
     public static final String DRIVER_CLASS = "org.h2.Driver";
     public static final String EMP_URL_MEMORY_DATABASE = "jdbc:h2:mem:emp";
     public static final String DEP_URL_MEMORY_DATABASE = "jdbc:h2:mem:dep";
@@ -53,109 +53,80 @@ public class MultiJDBCDataSetTest {
 
     private int employeeSize = 10000;
     private int departmentSize = 1000;
-    int employeesPerDepartment =  employeeSize/ departmentSize;
-
+    int employeesPerDepartment = employeeSize / departmentSize;
 
     private static final Logger logger = LoggerFactory.getLogger(MultiJDBCDataSetTest.class);
 
-
     @Before
     public void setup() throws Exception {
         Class.forName(DRIVER_CLASS);
         emp_conn = DriverManager.getConnection(EMP_URL_MEMORY_DATABASE);
-        dep_conn =  DriverManager.getConnection(DEP_URL_MEMORY_DATABASE);
-
+        dep_conn = DriverManager.getConnection(DEP_URL_MEMORY_DATABASE);
 
         emp_dcon = new JdbcDataContext(emp_conn);
         dep_dcon = new JdbcDataContext(dep_conn);
 
+        emp_dcon.executeUpdate(new CreateTable(emp_dcon.getDefaultSchema(), "employee").withColumn("id").ofType(
+                ColumnType.INTEGER).asPrimaryKey().withColumn("name").ofType(ColumnType.VARCHAR).ofSize(200).withColumn(
+                        "dep_id").ofType(ColumnType.INTEGER));
 
-
-
-        emp_dcon.executeUpdate(new CreateTable(emp_dcon.getDefaultSchema(),"employee")
-                .withColumn("id").ofType(ColumnType.INTEGER).asPrimaryKey()
-                .withColumn("name").ofType(ColumnType.VARCHAR).ofSize(200)
-                .withColumn("dep_id").ofType(ColumnType.INTEGER));
-
-
-        for(int i = 0;i<employeeSize;i++){
-            emp_dcon.executeUpdate(new InsertInto(emp_dcon.getDefaultSchema().getTableByName("employee"))
-                    .value("id",i)
-                    .value("name","emp" + i)
-                    .value("dep_id",i% departmentSize));
+        for (int i = 0; i < employeeSize; i++) {
+            emp_dcon.executeUpdate(new InsertInto(emp_dcon.getDefaultSchema().getTableByName("employee")).value("id", i)
+                    .value("name", "emp" + i).value("dep_id", i % departmentSize));
         }
 
+        dep_dcon.executeUpdate(new CreateTable(dep_dcon.getDefaultSchema(), "department").withColumn("id").ofType(
+                ColumnType.INTEGER).asPrimaryKey().withColumn("name").ofType(ColumnType.VARCHAR).ofSize(200));
 
-        dep_dcon.executeUpdate(new CreateTable(dep_dcon.getDefaultSchema(),"department")
-                .withColumn("id").ofType(ColumnType.INTEGER).asPrimaryKey()
-                .withColumn("name").ofType(ColumnType.VARCHAR).ofSize(200));
-
-
-        for(int i = 0; i< departmentSize; i++){
-            dep_dcon.executeUpdate(new InsertInto(dep_dcon.getDefaultSchema().getTableByName("department"))
-                    .value("id",i)
-                    .value("name","dep" + i));
+        for (int i = 0; i < departmentSize; i++) {
+            dep_dcon.executeUpdate(new InsertInto(dep_dcon.getDefaultSchema().getTableByName("department")).value("id",
+                    i).value("name", "dep" + i));
         }
 
     }
 
-
     @After
-    public void tearDown(){
+    public void tearDown() {
         dep_dcon.executeUpdate(new DropTable("department"));
         emp_dcon.executeUpdate(new DropTable("employee"));
     }
 
-
-
     @Test
-    public void testJoin(){
+    public void testJoin() {
         Stopwatch duration = Stopwatch.createStarted();
-        CompositeDataContext compDcon = new CompositeDataContext(this.emp_dcon,this.dep_dcon );
-
-        DataSet ds = compDcon.query()
-                .from("employee")
-                .innerJoin("department")
-                .on("dep_id","id")
-                .selectAll()
-                .execute();
+        CompositeDataContext compDcon = new CompositeDataContext(this.emp_dcon, this.dep_dcon);
+
+        DataSet ds = compDcon.query().from("employee").innerJoin("department").on("dep_id", "id").selectAll().execute();
         int rowCount = 0;
-        while(ds.next()){
+        while (ds.next()) {
             Row row = ds.getRow();
+            Assert.assertNotNull(row);
             rowCount++;
         }
         duration.stop();
         logger.info("Test duration was {} ms", duration.elapsed(TimeUnit.MILLISECONDS));
 
-        Assert.assertEquals(employeeSize,rowCount);
+        Assert.assertEquals(employeeSize, rowCount);
 
     }
 
     @Test
-    public void testSelectiveJoin(){
+    public void testSelectiveJoin() {
         Stopwatch duration = Stopwatch.createStarted();
-        CompositeDataContext compDcon = new CompositeDataContext(this.emp_dcon,this.dep_dcon );
-
-        DataSet ds = compDcon.query()
-                .from("employee")
-                .innerJoin("department")
-                .on("dep_id","id")
-                .selectAll()
-                .where(compDcon.getTableByQualifiedLabel("department").getColumnByName("id")).eq(1)
-                .execute();
+        CompositeDataContext compDcon = new CompositeDataContext(this.emp_dcon, this.dep_dcon);
+
+        DataSet ds = compDcon.query().from("employee").innerJoin("department").on("dep_id", "id").selectAll().where(
+                compDcon.getTableByQualifiedLabel("department").getColumnByName("id")).eq(1).execute();
         int rowCount = 0;
-        while(ds.next()){
+        while (ds.next()) {
             Row row = ds.getRow();
+            Assert.assertNotNull(row);
             rowCount++;
         }
         duration.stop();
         logger.info("Test duration was {} ms", duration.elapsed(TimeUnit.MILLISECONDS));
 
-        Assert.assertEquals(employeesPerDepartment,rowCount);
-
+        Assert.assertEquals(employeesPerDepartment, rowCount);
     }
 
-
-
-
 }

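The two row-count assertions in `MultiJDBCDataSetTest` follow directly from how the fixture is generated: employee `i` gets `dep_id = i % departmentSize`, so every employee matches exactly one department (the unrestricted join yields `employeeSize` rows), and each department matches `employeeSize / departmentSize` employees. A sketch that derives those counts by brute force over the same modulo scheme (simplified stand-in, not the test itself):

```java
// Derives the expected cardinalities of the joins asserted in
// MultiJDBCDataSetTest, using the fixture's dep_id = i % departmentSize rule.
public class JoinCardinalitySketch {

    // rows produced by employee JOIN department ON dep_id = id
    public static int fullJoinRows(int employeeSize, int departmentSize) {
        int rows = 0;
        for (int i = 0; i < employeeSize; i++) {
            for (int d = 0; d < departmentSize; d++) {
                if (i % departmentSize == d) {
                    rows++; // each employee matches exactly one department
                }
            }
        }
        return rows;
    }

    // rows produced when the join is restricted to a single department id
    public static int selectiveJoinRows(int employeeSize, int departmentSize, int depId) {
        int rows = 0;
        for (int i = 0; i < employeeSize; i++) {
            if (i % departmentSize == depId) {
                rows++;
            }
        }
        return rows;
    }
}
```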

[2/7] metamodel git commit: Simple nested loop join implementation

Posted by ka...@apache.org.
Simple nested loop join implementation


Project: http://git-wip-us.apache.org/repos/asf/metamodel/repo
Commit: http://git-wip-us.apache.org/repos/asf/metamodel/commit/ee2b9167
Tree: http://git-wip-us.apache.org/repos/asf/metamodel/tree/ee2b9167
Diff: http://git-wip-us.apache.org/repos/asf/metamodel/diff/ee2b9167

Branch: refs/heads/master
Commit: ee2b91671d8cb6b35046aabba8426724831b7205
Parents: ef5ac06
Author: Jörg Unbehauen <jo...@unbehauen.net>
Authored: Tue May 3 12:34:03 2016 +0200
Committer: Jörg Unbehauen <jo...@unbehauen.net>
Committed: Fri Jul 21 23:25:38 2017 +0200

----------------------------------------------------------------------
 .../java/org/apache/metamodel/JoinHelper.java   | 133 +++++++++++++++++++
 .../org/apache/metamodel/MetaModelHelper.java   |  79 +++--------
 2 files changed, 152 insertions(+), 60 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/metamodel/blob/ee2b9167/core/src/main/java/org/apache/metamodel/JoinHelper.java
----------------------------------------------------------------------
diff --git a/core/src/main/java/org/apache/metamodel/JoinHelper.java b/core/src/main/java/org/apache/metamodel/JoinHelper.java
new file mode 100644
index 0000000..c8cdfa7
--- /dev/null
+++ b/core/src/main/java/org/apache/metamodel/JoinHelper.java
@@ -0,0 +1,133 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.metamodel;
+
+import com.google.common.collect.Lists;
+import org.apache.metamodel.data.*;
+import org.apache.metamodel.query.FilterItem;
+import org.apache.metamodel.query.SelectItem;
+
+import java.util.*;
+import java.util.function.Predicate;
+import java.util.stream.Collectors;
+
+/**
+ * Join Execution and related methods.
+ */
+public abstract class JoinHelper {
+
+
+    /**
+     * Executes a simple nested loop join. The innerLoopDs will be copied in an in-memory dataset.
+     *
+     * @param outerLoopDs
+     * @param innerLoopDs
+     * @param filters
+     * @return
+     */
+    public static InMemoryDataSet nestedLoopJoin( DataSet innerLoopDs,  DataSet outerLoopDs, Collection<FilterItem> filters){
+
+        List<Row> innerRows = innerLoopDs.toRows();
+
+
+        List<SelectItem> innerSelItems = Lists.newArrayList(innerLoopDs.getSelectItems());
+        List<SelectItem> outerSelItems = Lists.newArrayList(outerLoopDs.getSelectItems());
+        List<SelectItem> allItems = Lists.newArrayList(innerSelItems);
+        allItems.addAll(outerSelItems);
+
+
+        Set<FilterItem> filterAll = applicableFilters(filters, allItems);
+
+
+        DataSetHeader jointHeader = joinHeader(outerLoopDs, innerLoopDs);
+
+        List<Row> resultRows = Lists.newArrayList();
+        for(Row outerRow: outerLoopDs){
+            for(Row innerRow: innerRows){
+                Row joinedRow =  joinRow(outerRow,innerRow,jointHeader);
+                if(filterAll.isEmpty()|| filterAll.stream().allMatch(fi -> fi.accept(joinedRow))){
+                    resultRows.add(joinedRow);
+                }
+            }
+        }
+
+
+
+        return new InMemoryDataSet(jointHeader,resultRows);
+    }
+
+
+    public static  Set<FilterItem> applicableFilters(Collection<FilterItem> filters, Collection<SelectItem> selectItemList) {
+
+        Set<SelectItem> items = new HashSet<>(selectItemList);
+
+        return filters.stream().filter( fi -> {
+            Collection<SelectItem> fiSelectItems = Lists.newArrayList(fi.getSelectItem());
+            Object operand = fi.getOperand();
+            if(operand instanceof SelectItem){
+                fiSelectItems.add((SelectItem) operand);
+            }
+
+            return items.containsAll(fiSelectItems);
+
+        }).collect(Collectors.toSet());
+    }
+
+
+
+
+
+    /**
+     * joins two datasetheader.
+     * @param ds1 the headers for the left
+     * @param ds2 the tright headers
+     * @return
+     */
+    public static DataSetHeader joinHeader(DataSet ds1, DataSet ds2){
+        List<SelectItem> joinedSelectItems = Lists.newArrayList(ds1.getSelectItems());
+        joinedSelectItems.addAll(Lists.newArrayList(ds2.getSelectItems()));
+        return  new CachingDataSetHeader(joinedSelectItems);
+
+
+    }
+
+    /**
+     * Joins two rows into one.
+     *
+     * Consider parameter ordering to maintain backwards compatbility
+     *
+     * @param row1 the tuples, that will be on the left
+     * @param row2 the tuples, that will be on the right
+     * @param jointHeader
+     * @return
+     */
+    public static Row joinRow(Row row1, Row row2, DataSetHeader jointHeader){
+        Object[] joinedRow = new Object[row1.getValues().length + row2.getValues().length];
+
+        System.arraycopy(row1.getValues(),0,joinedRow,0,row1.getValues().length);
+        System.arraycopy(row2.getValues(),0,joinedRow,row1.getValues().length,row2.getValues().length);
+
+
+        return new DefaultRow(jointHeader,joinedRow);
+
+
+    }
+
+
+}

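The `applicableFilters` method above implements a simple pushdown test: a filter can be evaluated inside the join loop only if every `SelectItem` it references (its own select item, plus the operand when that operand is itself a `SelectItem`) is present in the joined header. A stripped-down sketch of the same set-containment check, representing each filter as just the list of column names it references (simplified stand-ins for `FilterItem`/`SelectItem`):

```java
import java.util.Collection;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Keep only the filters whose referenced columns are all available in the
// joined header; the rest must be deferred to a later evaluation stage.
public class ApplicableFilterSketch {

    public static Set<List<String>> applicable(Collection<List<String>> filters,
            Collection<String> availableColumns) {
        Set<String> available = new HashSet<>(availableColumns);
        return filters.stream()
                .filter(available::containsAll) // filter is fully covered by the header
                .collect(Collectors.toSet());
    }

    public static void main(String[] args) {
        // "dnr = id" references columns the joined header has; "salary > x" does not
        System.out.println(applicable(
                List.of(List.of("dnr", "id"), List.of("salary")),
                List.of("nr", "name", "dnr", "id")));
    }
}
```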
http://git-wip-us.apache.org/repos/asf/metamodel/blob/ee2b9167/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
----------------------------------------------------------------------
diff --git a/core/src/main/java/org/apache/metamodel/MetaModelHelper.java b/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
index 09d47bc..c788633 100644
--- a/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
+++ b/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
@@ -18,17 +18,10 @@
  */
 package org.apache.metamodel;
 
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.Collection;
-import java.util.Collections;
-import java.util.Comparator;
-import java.util.HashMap;
-import java.util.HashSet;
-import java.util.List;
-import java.util.Map;
+import java.util.*;
 import java.util.Map.Entry;
 
+import com.google.common.collect.Lists;
 import org.apache.metamodel.data.CachingDataSetHeader;
 import org.apache.metamodel.data.DataSet;
 import org.apache.metamodel.data.DataSetHeader;
@@ -46,6 +39,7 @@ import org.apache.metamodel.data.SubSelectionDataSet;
 import org.apache.metamodel.query.FilterItem;
 import org.apache.metamodel.query.FromItem;
 import org.apache.metamodel.query.GroupByItem;
+import org.apache.metamodel.query.OperatorType;
 import org.apache.metamodel.query.OrderByItem;
 import org.apache.metamodel.query.Query;
 import org.apache.metamodel.query.ScalarFunction;
@@ -176,71 +170,36 @@ public final class MetaModelHelper {
     public static DataSet getCarthesianProduct(DataSet... fromDataSets) {
         return getCarthesianProduct(fromDataSets, new FilterItem[0]);
     }
+    
+    
+    
 
     public static DataSet getCarthesianProduct(DataSet[] fromDataSets, Iterable<FilterItem> whereItems) {
+        assert(fromDataSets.length>0);
         // First check if carthesian product is even nescesary
         if (fromDataSets.length == 1) {
             return getFiltered(fromDataSets[0], whereItems);
         }
+        // do a nested loop join, no matter what
+        Iterator<DataSet> dsIter = Lists.newArrayList(fromDataSets).iterator();
 
-        List<SelectItem> selectItems = new ArrayList<SelectItem>();
-        for (DataSet dataSet : fromDataSets) {
-            for (int i = 0; i < dataSet.getSelectItems().length; i++) {
-                SelectItem item = dataSet.getSelectItems()[i];
-                selectItems.add(item);
-            }
-        }
+        DataSet joined = dsIter.next();
 
-        int selectItemOffset = 0;
-        List<Object[]> data = new ArrayList<Object[]>();
-        for (int fromDataSetIndex = 0; fromDataSetIndex < fromDataSets.length; fromDataSetIndex++) {
-            DataSet fromDataSet = fromDataSets[fromDataSetIndex];
-            SelectItem[] fromSelectItems = fromDataSet.getSelectItems();
-            if (fromDataSetIndex == 0) {
-                while (fromDataSet.next()) {
-                    Object[] values = fromDataSet.getRow().getValues();
-                    Object[] row = new Object[selectItems.size()];
-                    System.arraycopy(values, 0, row, selectItemOffset, values.length);
-                    data.add(row);
-                }
-                fromDataSet.close();
-            } else {
-                List<Object[]> fromDataRows = new ArrayList<Object[]>();
-                while (fromDataSet.next()) {
-                    fromDataRows.add(fromDataSet.getRow().getValues());
-                }
-                fromDataSet.close();
-                for (int i = 0; i < data.size(); i = i + fromDataRows.size()) {
-                    Object[] originalRow = data.get(i);
-                    data.remove(i);
-                    for (int j = 0; j < fromDataRows.size(); j++) {
-                        Object[] newRow = fromDataRows.get(j);
-                        System.arraycopy(newRow, 0, originalRow, selectItemOffset, newRow.length);
-                        data.add(i + j, originalRow.clone());
-                    }
-                }
-            }
-            selectItemOffset += fromSelectItems.length;
-        }
+        while(dsIter.hasNext()){
+            joined = JoinHelper.nestedLoopJoin(
+                    dsIter.next(),
+                    joined,
+                    Lists.newArrayList(whereItems));
 
-        if (data.isEmpty()) {
-            return new EmptyDataSet(selectItems);
         }
 
-        final DataSetHeader header = new CachingDataSetHeader(selectItems);
-        final List<Row> rows = new ArrayList<Row>(data.size());
-        for (Object[] objects : data) {
-            rows.add(new DefaultRow(header, objects, null));
-        }
+        return joined;
+
 
-        DataSet result = new InMemoryDataSet(header, rows);
-        if (whereItems != null) {
-            DataSet filteredResult = getFiltered(result, whereItems);
-            result = filteredResult;
-        }
-        return result;
     }
 
+    
+
     public static DataSet getCarthesianProduct(DataSet[] fromDataSets, FilterItem... filterItems) {
         return getCarthesianProduct(fromDataSets, Arrays.asList(filterItems));
     }
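[Editorial note for readers following the patch: the rewritten getCarthesianProduct above reduces the array of DataSets by repeatedly nested-loop-joining the next DataSet into the accumulated result. The sketch below illustrates that inner join step in plain Java; the Object[]-row representation and the Predicate filter are stand-ins for MetaModel's Row and FilterItem types, not the project's actual API.]

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class NestedLoopJoinSketch {

    // Pairs every outer row with every inner row, concatenates the values
    // (outer first, then inner — matching the patch's arraycopy order),
    // and keeps only the pairs that pass the filter.
    static List<Object[]> nestedLoopJoin(List<Object[]> outer, List<Object[]> inner,
                                         Predicate<Object[]> filter) {
        List<Object[]> result = new ArrayList<>();
        for (Object[] o : outer) {
            for (Object[] i : inner) {
                Object[] joined = new Object[o.length + i.length];
                System.arraycopy(o, 0, joined, 0, o.length);
                System.arraycopy(i, 0, joined, o.length, i.length);
                if (filter.test(joined)) {
                    result.add(joined);
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Object[]> emp = List.of(new Object[] { "emp0", 0 }, new Object[] { "emp1", 1 });
        List<Object[]> dep = List.of(new Object[] { 0, "dep0" }, new Object[] { 1, "dep1" });
        // Equi-join: emp's second value (index 1) must equal dep's first value,
        // which sits at index 2 of the concatenated row.
        List<Object[]> joined = nestedLoopJoin(emp, dep, row -> row[1].equals(row[2]));
        System.out.println(joined.size()); // prints 2
    }
}
```

Because the filter is applied while pairing, matching rows are kept as they are produced instead of materializing the full n×m product first — which is the point of the METAMODEL-1144 change.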


[5/7] metamodel git commit: Moved JoinHelper into Metamodelhelper,

Posted by ka...@apache.org.
Moved JoinHelper into Metamodelhelper,

removed guava


Project: http://git-wip-us.apache.org/repos/asf/metamodel/repo
Commit: http://git-wip-us.apache.org/repos/asf/metamodel/commit/dda13e2e
Tree: http://git-wip-us.apache.org/repos/asf/metamodel/tree/dda13e2e
Diff: http://git-wip-us.apache.org/repos/asf/metamodel/diff/dda13e2e

Branch: refs/heads/master
Commit: dda13e2ec930d736881b77630fec5d45196a290f
Parents: 88a1828
Author: Jörg Unbehauen <jo...@unbehauen.net>
Authored: Tue Jul 25 16:10:59 2017 +0200
Committer: Jörg Unbehauen <jo...@unbehauen.net>
Committed: Tue Jul 25 16:10:59 2017 +0200

----------------------------------------------------------------------
 core/pom.xml                                    |   4 -
 .../java/org/apache/metamodel/JoinHelper.java   | 133 -------------------
 .../org/apache/metamodel/MetaModelHelper.java   |  82 ++++++++++--
 3 files changed, 71 insertions(+), 148 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/metamodel/blob/dda13e2e/core/pom.xml
----------------------------------------------------------------------
diff --git a/core/pom.xml b/core/pom.xml
index aeee377..3af58c9 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -32,10 +32,6 @@ under the License.
 			<artifactId>slf4j-api</artifactId>
 		</dependency>
 		<dependency>
-			<groupId>com.google.guava</groupId>
-			<artifactId>guava</artifactId>
-		</dependency>
-		<dependency>
 			<groupId>org.slf4j</groupId>
 			<artifactId>slf4j-nop</artifactId>
 			<scope>test</scope>

http://git-wip-us.apache.org/repos/asf/metamodel/blob/dda13e2e/core/src/main/java/org/apache/metamodel/JoinHelper.java
----------------------------------------------------------------------
diff --git a/core/src/main/java/org/apache/metamodel/JoinHelper.java b/core/src/main/java/org/apache/metamodel/JoinHelper.java
deleted file mode 100644
index c8cdfa7..0000000
--- a/core/src/main/java/org/apache/metamodel/JoinHelper.java
+++ /dev/null
@@ -1,133 +0,0 @@
-/**
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-package org.apache.metamodel;
-
-import com.google.common.collect.Lists;
-import org.apache.metamodel.data.*;
-import org.apache.metamodel.query.FilterItem;
-import org.apache.metamodel.query.SelectItem;
-
-import java.util.*;
-import java.util.function.Predicate;
-import java.util.stream.Collectors;
-
-/**
- * Join Execution and related methods.
- */
-public abstract class JoinHelper {
-
-
-    /**
-     * Executes a simple nested loop join. The innerLoopDs will be copied in an in-memory dataset.
-     *
-     * @param outerLoopDs
-     * @param innerLoopDs
-     * @param filters
-     * @return
-     */
-    public static InMemoryDataSet nestedLoopJoin( DataSet innerLoopDs,  DataSet outerLoopDs, Collection<FilterItem> filters){
-
-        List<Row> innerRows = innerLoopDs.toRows();
-
-
-        List<SelectItem> innerSelItems = Lists.newArrayList(innerLoopDs.getSelectItems());
-        List<SelectItem> outerSelItems = Lists.newArrayList(outerLoopDs.getSelectItems());
-        List<SelectItem> allItems = Lists.newArrayList(innerSelItems);
-        allItems.addAll(outerSelItems);
-
-
-        Set<FilterItem> filterAll = applicableFilters(filters, allItems);
-
-
-        DataSetHeader jointHeader = joinHeader(outerLoopDs, innerLoopDs);
-
-        List<Row> resultRows = Lists.newArrayList();
-        for(Row outerRow: outerLoopDs){
-            for(Row innerRow: innerRows){
-                Row joinedRow =  joinRow(outerRow,innerRow,jointHeader);
-                if(filterAll.isEmpty()|| filterAll.stream().allMatch(fi -> fi.accept(joinedRow))){
-                    resultRows.add(joinedRow);
-                }
-            }
-        }
-
-
-
-        return new InMemoryDataSet(jointHeader,resultRows);
-    }
-
-
-    public static  Set<FilterItem> applicableFilters(Collection<FilterItem> filters, Collection<SelectItem> selectItemList) {
-
-        Set<SelectItem> items = new HashSet<>(selectItemList);
-
-        return filters.stream().filter( fi -> {
-            Collection<SelectItem> fiSelectItems = Lists.newArrayList(fi.getSelectItem());
-            Object operand = fi.getOperand();
-            if(operand instanceof SelectItem){
-                fiSelectItems.add((SelectItem) operand);
-            }
-
-            return items.containsAll(fiSelectItems);
-
-        }).collect(Collectors.toSet());
-    }
-
-
-
-
-
-    /**
-     * joins two datasetheader.
-     * @param ds1 the headers for the left
-     * @param ds2 the tright headers
-     * @return
-     */
-    public static DataSetHeader joinHeader(DataSet ds1, DataSet ds2){
-        List<SelectItem> joinedSelectItems = Lists.newArrayList(ds1.getSelectItems());
-        joinedSelectItems.addAll(Lists.newArrayList(ds2.getSelectItems()));
-        return  new CachingDataSetHeader(joinedSelectItems);
-
-
-    }
-
-    /**
-     * Joins two rows into one.
-     *
-     * Consider parameter ordering to maintain backwards compatbility
-     *
-     * @param row1 the tuples, that will be on the left
-     * @param row2 the tuples, that will be on the right
-     * @param jointHeader
-     * @return
-     */
-    public static Row joinRow(Row row1, Row row2, DataSetHeader jointHeader){
-        Object[] joinedRow = new Object[row1.getValues().length + row2.getValues().length];
-
-        System.arraycopy(row1.getValues(),0,joinedRow,0,row1.getValues().length);
-        System.arraycopy(row2.getValues(),0,joinedRow,row1.getValues().length,row2.getValues().length);
-
-
-        return new DefaultRow(jointHeader,joinedRow);
-
-
-    }
-
-
-}

http://git-wip-us.apache.org/repos/asf/metamodel/blob/dda13e2e/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
----------------------------------------------------------------------
diff --git a/core/src/main/java/org/apache/metamodel/MetaModelHelper.java b/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
index c788633..f30a08a 100644
--- a/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
+++ b/core/src/main/java/org/apache/metamodel/MetaModelHelper.java
@@ -20,8 +20,8 @@ package org.apache.metamodel;
 
 import java.util.*;
 import java.util.Map.Entry;
+import java.util.stream.Collectors;
 
-import com.google.common.collect.Lists;
 import org.apache.metamodel.data.CachingDataSetHeader;
 import org.apache.metamodel.data.DataSet;
 import org.apache.metamodel.data.DataSetHeader;
@@ -39,7 +39,6 @@ import org.apache.metamodel.data.SubSelectionDataSet;
 import org.apache.metamodel.query.FilterItem;
 import org.apache.metamodel.query.FromItem;
 import org.apache.metamodel.query.GroupByItem;
-import org.apache.metamodel.query.OperatorType;
 import org.apache.metamodel.query.OrderByItem;
 import org.apache.metamodel.query.Query;
 import org.apache.metamodel.query.ScalarFunction;
@@ -170,9 +169,10 @@ public final class MetaModelHelper {
     public static DataSet getCarthesianProduct(DataSet... fromDataSets) {
         return getCarthesianProduct(fromDataSets, new FilterItem[0]);
     }
-    
-    
-    
+
+    public static DataSet getCarthesianProduct(DataSet[] fromDataSets, FilterItem... filterItems) {
+        return getCarthesianProduct(fromDataSets, Arrays.asList(filterItems));
+    }
 
     public static DataSet getCarthesianProduct(DataSet[] fromDataSets, Iterable<FilterItem> whereItems) {
         assert(fromDataSets.length>0);
@@ -181,15 +181,15 @@ public final class MetaModelHelper {
             return getFiltered(fromDataSets[0], whereItems);
         }
         // do a nested loop join, no matter what
-        Iterator<DataSet> dsIter = Lists.newArrayList(fromDataSets).iterator();
+        Iterator<DataSet> dsIter = Arrays.asList(fromDataSets).iterator();
 
         DataSet joined = dsIter.next();
 
         while(dsIter.hasNext()){
-            joined = JoinHelper.nestedLoopJoin(
+            joined = nestedLoopJoin(
                     dsIter.next(),
                     joined,
-                    Lists.newArrayList(whereItems));
+                    (whereItems));
 
         }
 
@@ -198,12 +198,72 @@ public final class MetaModelHelper {
 
     }
 
-    
+    /**
+     * Executes a simple nested loop join. The innerLoopDs will be copied in an in-memory dataset.
+     *
+     */
+    public static InMemoryDataSet nestedLoopJoin(DataSet innerLoopDs,  DataSet outerLoopDs, Iterable<FilterItem> filtersIterable){
 
-    public static DataSet getCarthesianProduct(DataSet[] fromDataSets, FilterItem... filterItems) {
-        return getCarthesianProduct(fromDataSets, Arrays.asList(filterItems));
+        List<FilterItem> filters = new ArrayList<>();
+        for(FilterItem fi : filtersIterable){
+            filters.add(fi);
+        }
+        List<Row> innerRows = innerLoopDs.toRows();
+
+
+        List<SelectItem> allItems = new ArrayList<>(Arrays.asList(outerLoopDs.getSelectItems())) ;
+        allItems.addAll(Arrays.asList(innerLoopDs.getSelectItems()));
+
+        Set<FilterItem> applicableFilters = applicableFilters(filters,allItems);
+
+        DataSetHeader jointHeader = new CachingDataSetHeader(allItems);
+
+        List<Row> resultRows = new ArrayList<>();
+        for(Row outerRow: outerLoopDs){
+            for(Row innerRow: innerRows){
+
+                Object[] joinedRowObjects = new Object[outerRow.getValues().length + innerRow.getValues().length];
+
+                System.arraycopy(outerRow.getValues(),0,joinedRowObjects,0,outerRow.getValues().length);
+                System.arraycopy(innerRow.getValues(),0,joinedRowObjects,outerRow.getValues().length,innerRow.getValues().length);
+
+                Row joinedRow =  new DefaultRow(jointHeader,joinedRowObjects);
+
+
+                if(applicableFilters.isEmpty()|| applicableFilters.stream().allMatch(fi -> fi.accept(joinedRow))){
+                    resultRows.add(joinedRow);
+                }
+            }
+        }
+
+        return new InMemoryDataSet(jointHeader,resultRows);
     }
 
+    /**
+     * Filters the FilterItems such that only the FilterItems are returned,
+     * which contain SelectItems that are contained in selectItemList
+     * @param filters
+     * @param selectItemList
+     * @return
+     */
+    private static  Set<FilterItem> applicableFilters(Collection<FilterItem> filters, Collection<SelectItem> selectItemList) {
+
+        Set<SelectItem> items = new HashSet<SelectItem>(selectItemList);
+
+        return filters.stream().filter( fi -> {
+            Collection<SelectItem> fiSelectItems = new ArrayList<>();
+            fiSelectItems.add(fi.getSelectItem());
+            Object operand = fi.getOperand();
+            if(operand instanceof SelectItem){
+                fiSelectItems.add((SelectItem) operand);
+            }
+
+            return items.containsAll(fiSelectItems);
+
+        }).collect(Collectors.toSet());
+    }
+
+
     public static DataSet getFiltered(DataSet dataSet, Iterable<FilterItem> filterItems) {
         List<IRowFilter> filters = CollectionUtils.map(filterItems, filterItem -> {
             return filterItem;
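[Editorial note: the applicableFilters method added above is the pushdown check — a filter may only be evaluated on a joined row once every column it references is present in that row's select items. The sketch below mirrors that logic with simplified stand-in types; the Filter record and String column names are illustrations, not MetaModel's FilterItem/SelectItem API.]

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class FilterPushdownSketch {

    // Stand-in for FilterItem: a left-hand column plus an operand that is
    // either a literal value or (when it is a String here) another column.
    record Filter(String selectItem, Object operand) {}

    // Keeps only the filters whose referenced columns all appear in 'available',
    // mirroring the stream/filter/collect shape of applicableFilters above.
    static Set<Filter> applicableFilters(Collection<Filter> filters, Collection<String> available) {
        Set<String> items = new HashSet<>(available);
        return filters.stream().filter(f -> {
            List<String> referenced = new ArrayList<>();
            referenced.add(f.selectItem());
            if (f.operand() instanceof String column) {
                referenced.add(column); // column-to-column comparison, e.g. a join condition
            }
            return items.containsAll(referenced);
        }).collect(Collectors.toSet());
    }

    public static void main(String[] args) {
        Filter joinCondition = new Filter("emp.dep_id", "dep.id"); // needs dep.id
        Filter literalFilter = new Filter("emp.id", 42);           // needs only emp.id
        Set<Filter> applicable =
                applicableFilters(List.of(joinCondition, literalFilter), List.of("emp.dep_id", "emp.id"));
        System.out.println(applicable.size()); // prints 1 — the join condition must wait for dep's columns
    }
}
```

This is what lets the nested-loop join apply a join condition at the innermost point where both sides' columns are finally available, while deferring filters that mention columns from DataSets not yet joined in.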


[3/7] metamodel git commit: Test cases for the join implementation

Posted by ka...@apache.org.
Test cases for the join implementation


Project: http://git-wip-us.apache.org/repos/asf/metamodel/repo
Commit: http://git-wip-us.apache.org/repos/asf/metamodel/commit/ef5ac06f
Tree: http://git-wip-us.apache.org/repos/asf/metamodel/tree/ef5ac06f
Diff: http://git-wip-us.apache.org/repos/asf/metamodel/diff/ef5ac06f

Branch: refs/heads/master
Commit: ef5ac06f17937365603e2be2eb1333c9d1407062
Parents: 65574d3
Author: Jörg Unbehauen <jo...@unbehauen.net>
Authored: Tue Jul 18 15:28:30 2017 +0200
Committer: Jörg Unbehauen <jo...@unbehauen.net>
Committed: Fri Jul 21 23:25:38 2017 +0200

----------------------------------------------------------------------
 .../apache/metamodel/MetaModelHelperTest.java   |  88 ++++++++--
 .../metamodel/jdbc/MultiJDBCDataSetTest.java    | 161 +++++++++++++++++++
 2 files changed, 235 insertions(+), 14 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/metamodel/blob/ef5ac06f/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
----------------------------------------------------------------------
diff --git a/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java b/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
index 540aa95..50591ca 100644
--- a/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
+++ b/core/src/test/java/org/apache/metamodel/MetaModelHelperTest.java
@@ -115,21 +115,16 @@ public class MetaModelHelperTest extends MetaModelTestCase {
 
     public void testSimpleCarthesianProduct() throws Exception {
         DataSet dataSet = MetaModelHelper.getCarthesianProduct(createDataSet1(), createDataSet2());
-
+        List<String> results = new ArrayList<String>();
+        
+        while(dataSet.next()){
+          results.add(dataSet.getRow().toString());
+        }
         assertEquals(2, dataSet.getSelectItems().length);
-        assertTrue(dataSet.next());
-        assertEquals("Row[values=[f, b]]", dataSet.getRow().toString());
-        assertTrue(dataSet.next());
-        assertEquals("Row[values=[f, a]]", dataSet.getRow().toString());
-        assertTrue(dataSet.next());
-        assertTrue(dataSet.next());
-        assertTrue(dataSet.next());
-        assertTrue(dataSet.next());
-        assertTrue(dataSet.next());
-        assertTrue(dataSet.next());
-        assertTrue(dataSet.next());
-        assertEquals("Row[values=[o, r]]", dataSet.getRow().toString());
-        assertFalse(dataSet.next());
+        assertEquals(9, results.size());
+        assertTrue(results.contains("Row[values=[f, b]]"));
+        assertTrue(results.contains("Row[values=[f, a]]"));
+        assertTrue(results.contains("Row[values=[o, r]]"));
     }
 
     public void testTripleCarthesianProduct() throws Exception {
@@ -215,6 +210,52 @@ public class MetaModelHelperTest extends MetaModelTestCase {
         DataSet dataSet4 = createDataSet(new SelectItem[] { new SelectItem("abc", "abc") }, data4);
         return dataSet4;
     }
+    
+    
+    
+    private int bigDataSetSize = 10000;
+
+    /**
+     * 
+     * @return a big dataset, mocking an employee table
+     */
+    private DataSet createDataSet5() {
+      List<Object[]> data5 = new ArrayList<Object[]>();
+      
+      
+      for(int i = 0; i<bigDataSetSize;i++){
+        data5.add(new Object[]{i,"Person_" + i, bigDataSetSize-(i+1) });
+      }
+      
+      DataSet dataSet5 = createDataSet(
+          new SelectItem[] { 
+              new SelectItem(new MutableColumn("nr", ColumnType.BIGINT)),
+              new SelectItem(new MutableColumn("name", ColumnType.STRING)),
+              new SelectItem(new MutableColumn("dnr", ColumnType.BIGINT))
+          }, 
+          data5);
+      return dataSet5;
+  }
+    
+    /**
+     * 
+     * @return a big dataset, mocking an department table
+     */
+    private DataSet createDataSet6() {
+      List<Object[]> data6 = new ArrayList<Object[]>();
+
+      for(int i = 0; i<bigDataSetSize;i++){
+        data6.add(new Object[]{i,"Department_" + i });
+      }
+      
+      DataSet dataSet6 = createDataSet(
+          new SelectItem[] { 
+              new SelectItem(new MutableColumn("nr", ColumnType.BIGINT)),
+              new SelectItem(new MutableColumn("name", ColumnType.STRING)),
+          }, 
+          data6);
+      return dataSet6;
+  }
 
     public void testGetTables() throws Exception {
         MutableTable table1 = new MutableTable("table1");
@@ -324,4 +365,23 @@ public class MetaModelHelperTest extends MetaModelTestCase {
         assertEquals("Row[values=[1, 2, null]]", joinedDs.getRow().toString());
         assertFalse(joinedDs.next());
     }
+    
+    
+    public void testCarthesianProductScalability(){
+      
+      DataSet employees = createDataSet5();
+      DataSet departmens = createDataSet6();
+      
+      FilterItem fi = new FilterItem(employees.getSelectItems()[2], OperatorType.EQUALS_TO,departmens.getSelectItems()[0]);
+      
+      DataSet joined =  MetaModelHelper.getCarthesianProduct(new DataSet[]{employees,departmens}, fi);
+      int count = 0; 
+      while(joined.next()){
+        count++;
+      }
+      
+      assertTrue(count == 10000);
+      
+      
+    }
 }
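[Editorial note: testCarthesianProductScalability expects exactly 10000 joined rows because employee i carries dnr = bigDataSetSize - (i + 1), so each employee matches exactly one of the bigDataSetSize department numbers. A scaled-down counting sketch (illustrative only, independent of the test's DataSet types) makes that arithmetic concrete:]

```java
public class JoinCountSketch {
    public static void main(String[] args) {
        int n = 100; // scaled-down stand-in for the test's bigDataSetSize of 10000
        int matches = 0;
        // Mirror the test data: employee i has dnr = n - (i + 1);
        // department numbers run 0..n-1, so each dnr hits exactly one nr.
        for (int i = 0; i < n; i++) {
            int dnr = n - (i + 1);
            for (int nr = 0; nr < n; nr++) {
                if (dnr == nr) {
                    matches++;
                }
            }
        }
        System.out.println(matches); // prints 100 — one match per employee, n in total
    }
}
```

The filtered result is n rows, while the unfiltered cross product would be n² — the old implementation materialized all n² rows before filtering, which is exactly what this scalability test guards against.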

http://git-wip-us.apache.org/repos/asf/metamodel/blob/ef5ac06f/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java
----------------------------------------------------------------------
diff --git a/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java b/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java
new file mode 100644
index 0000000..d910362
--- /dev/null
+++ b/jdbc/src/test/java/org/apache/metamodel/jdbc/MultiJDBCDataSetTest.java
@@ -0,0 +1,161 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.metamodel.jdbc;
+
+import com.google.common.base.Stopwatch;
+import org.apache.metamodel.CompositeDataContext;
+import org.apache.metamodel.UpdateableDataContext;
+import org.apache.metamodel.create.CreateTable;
+import org.apache.metamodel.data.DataSet;
+import org.apache.metamodel.data.Row;
+import org.apache.metamodel.drop.DropTable;
+import org.apache.metamodel.insert.InsertInto;
+import org.apache.metamodel.schema.ColumnType;
+import org.junit.*;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A test case using two simple h2 in memory databases for executing single query over both databases.
+ */
+public class MultiJDBCDataSetTest {
+
+
+    public static final String DRIVER_CLASS = "org.h2.Driver";
+    public static final String EMP_URL_MEMORY_DATABASE = "jdbc:h2:mem:emp";
+    public static final String DEP_URL_MEMORY_DATABASE = "jdbc:h2:mem:dep";
+
+    private Connection dep_conn;
+    private UpdateableDataContext dep_dcon;
+
+    private Connection emp_conn;
+    private UpdateableDataContext emp_dcon;
+
+    private int employeeSize = 10000;
+    private int departmentSize = 1000;
+    int employeesPerDepartment =  employeeSize/ departmentSize;
+
+
+    private static final Logger logger = LoggerFactory.getLogger(MultiJDBCDataSetTest.class);
+
+
+    @Before
+    public void setup() throws Exception {
+        Class.forName(DRIVER_CLASS);
+        emp_conn = DriverManager.getConnection(EMP_URL_MEMORY_DATABASE);
+        dep_conn =  DriverManager.getConnection(DEP_URL_MEMORY_DATABASE);
+
+
+        emp_dcon = new JdbcDataContext(emp_conn);
+        dep_dcon = new JdbcDataContext(dep_conn);
+
+
+
+
+        emp_dcon.executeUpdate(new CreateTable(emp_dcon.getDefaultSchema(),"employee")
+                .withColumn("id").ofType(ColumnType.INTEGER).asPrimaryKey()
+                .withColumn("name").ofType(ColumnType.VARCHAR).ofSize(200)
+                .withColumn("dep_id").ofType(ColumnType.INTEGER));
+
+
+        for(int i = 0;i<employeeSize;i++){
+            emp_dcon.executeUpdate(new InsertInto(emp_dcon.getDefaultSchema().getTableByName("employee"))
+                    .value("id",i)
+                    .value("name","emp" + i)
+                    .value("dep_id",i% departmentSize));
+        }
+
+
+        dep_dcon.executeUpdate(new CreateTable(dep_dcon.getDefaultSchema(),"department")
+                .withColumn("id").ofType(ColumnType.INTEGER).asPrimaryKey()
+                .withColumn("name").ofType(ColumnType.VARCHAR).ofSize(200));
+
+
+        for(int i = 0; i< departmentSize; i++){
+            dep_dcon.executeUpdate(new InsertInto(dep_dcon.getDefaultSchema().getTableByName("department"))
+                    .value("id",i)
+                    .value("name","dep" + i));
+        }
+
+    }
+
+
+    @After
+    public void tearDown(){
+        dep_dcon.executeUpdate(new DropTable("department"));
+        emp_dcon.executeUpdate(new DropTable("employee"));
+    }
+
+
+
+    @Test
+    public void testJoin(){
+        Stopwatch duration = Stopwatch.createStarted();
+        CompositeDataContext compDcon = new CompositeDataContext(this.emp_dcon,this.dep_dcon );
+
+        DataSet ds = compDcon.query()
+                .from("employee")
+                .innerJoin("department")
+                .on("dep_id","id")
+                .selectAll()
+                .execute();
+        int rowCount = 0;
+        while(ds.next()){
+            Row row = ds.getRow();
+            rowCount++;
+        }
+        duration.stop();
+        logger.info("Test duration was {} ms", duration.elapsed(TimeUnit.MILLISECONDS));
+
+        Assert.assertEquals(employeeSize,rowCount);
+
+    }
+
+    @Test
+    public void testSelectiveJoin(){
+        Stopwatch duration = Stopwatch.createStarted();
+        CompositeDataContext compDcon = new CompositeDataContext(this.emp_dcon,this.dep_dcon );
+
+        DataSet ds = compDcon.query()
+                .from("employee")
+                .innerJoin("department")
+                .on("dep_id","id")
+                .selectAll()
+                .where(compDcon.getTableByQualifiedLabel("department").getColumnByName("id")).eq(1)
+                .execute();
+        int rowCount = 0;
+        while(ds.next()){
+            Row row = ds.getRow();
+            rowCount++;
+        }
+        duration.stop();
+        logger.info("Test duration was {} ms", duration.elapsed(TimeUnit.MILLISECONDS));
+
+        Assert.assertEquals(employeesPerDepartment,rowCount);
+
+    }
+
+
+
+
+}
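[Editorial note: in the test setup above, employees are spread over departments with dep_id = i % departmentSize, so testSelectiveJoin's expected row count is employeeSize / departmentSize = 10. A small standalone check of that distribution (illustrative, no JDBC involved):]

```java
public class DepartmentDistributionSketch {
    public static void main(String[] args) {
        int employeeSize = 10000;
        int departmentSize = 1000;
        // Same assignment as the test's insert loop: dep_id = i % departmentSize.
        // Count how many employees land in department 1.
        int countForDep1 = 0;
        for (int i = 0; i < employeeSize; i++) {
            if (i % departmentSize == 1) {
                countForDep1++;
            }
        }
        System.out.println(countForDep1); // prints 10, i.e. employeeSize / departmentSize
    }
}
```

Since the modulo spreads employees evenly, the `where department.id = 1` query over the composite context must return exactly employeesPerDepartment rows regardless of which side of the client-side join the filter is evaluated on.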


[7/7] metamodel git commit: Merge branch 'master' into feature/fasterJoin

Posted by ka...@apache.org.
Merge branch 'master' into feature/fasterJoin

Conflicts:
	CHANGES.md


Project: http://git-wip-us.apache.org/repos/asf/metamodel/repo
Commit: http://git-wip-us.apache.org/repos/asf/metamodel/commit/b08aec1d
Tree: http://git-wip-us.apache.org/repos/asf/metamodel/tree/b08aec1d
Diff: http://git-wip-us.apache.org/repos/asf/metamodel/diff/b08aec1d

Branch: refs/heads/master
Commit: b08aec1dcc54dc0bcb4ff42e0cd1411d6ec0e15b
Parents: ddfa13e d841acc
Author: Kasper Sørensen <i....@gmail.com>
Authored: Wed Jul 26 18:49:03 2017 -0700
Committer: Kasper Sørensen <i....@gmail.com>
Committed: Wed Jul 26 18:49:03 2017 -0700

----------------------------------------------------------------------
 CHANGES.md                                      |   1 +
 README.md                                       |  78 ++++++-------
 .../factory/DataContextPropertiesImpl.java      |   3 +
 jdbc/pom.xml                                    |   4 +
 .../metamodel/jdbc/JdbcMetadataLoader.java      |  92 +++++++++++----
 .../jdbc/dialects/DefaultQueryRewriter.java     |  24 ++--
 .../jdbc/dialects/HsqldbQueryRewriter.java      |  13 +++
 .../apache/metamodel/jdbc/H2databaseTest.java   | 104 +++++++++++------
 .../org/apache/metamodel/jdbc/HsqldbTest.java   | 115 ++++++++++++++-----
 .../apache/metamodel/pojo/PojoDataContext.java  |   4 +-
 .../metamodel/pojo/PojoDataContextFactory.java  |  71 ++++++++++++
 ....apache.metamodel.factory.DataContextFactory |   1 +
 pom.xml                                         |  84 ++++++++++++--
 spring/pom.xml                                  |  12 --
 14 files changed, 452 insertions(+), 154 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/metamodel/blob/b08aec1d/CHANGES.md
----------------------------------------------------------------------
diff --cc CHANGES.md
index 2a5aa51,1b8361c..f5cb59a
--- a/CHANGES.md
+++ b/CHANGES.md
@@@ -7,7 -7,7 +7,8 @@@
   * [METAMODEL-1139] - Employed Java 8 functional types (java.util.function) in favor of (now deprecated) Ref, Action, Func. 
   * [METAMODEL-1140] - Allowed SalesforceDataContext without a security token.
   * [METAMODEL-1141] - Added RFC 4180 compliant CSV parsing.
 + * [METAMODEL-1144] - Optimized evaluation of conditional client-side JOIN statements.
+  * [METAMODEL-1145] - Fixed bug with modelling JDBC table relationships when there are multiple keys involved in the relationship.
  
  ### Apache MetaModel 4.6.0
  

http://git-wip-us.apache.org/repos/asf/metamodel/blob/b08aec1d/README.md
----------------------------------------------------------------------
diff --cc README.md
index e5cf17a,e5cf17a..3156d74
--- a/README.md
+++ b/README.md
@@@ -1,40 -1,40 +1,40 @@@
--## Apache MetaModel
--
--MetaModel is a data access framework, providing a common interface for exploration and querying of different types of datastores.
--
--<div>
--<img src="http://metamodel.apache.org/img/logo.png" style="float: right; margin-left: 20px;" alt="MetaModel logo" />
--</div>
--
--### Mailing lists
--
-- * Developer list:  dev@metamodel.apache.org
-- * User list:  user@metamodel.apache.org
-- * Commits list:    commits@metamodel.apache.org
--
--### Website
--
--http://metamodel.apache.org/
--
--### Documentation
--
--Please check out our [wiki for user documentation](https://cwiki.apache.org/confluence/display/METAMODEL).
--
--### Building the code
--
--MetaModel uses maven as it's build tool. Code can be built with:
--
--```
--mvn clean install
--```
--
--### Running the integration tests
--
-- 1. Copy the file 'example-metamodel-integrationtest-configuration.properties' to your user home.
-- 2. Remove the 'example-' prefix from its filename
-- 3. Modify the file to enable properties of the integration tests that you're interested in.
-- 4. Re-run "mvn clean install".
--
--### Contributing
--
++## Apache MetaModel
++
++MetaModel is a data access framework, providing a common interface for exploration and querying of different types of datastores.
++
++<div>
++<img src="http://metamodel.apache.org/img/logo.png" style="float: right; margin-left: 20px;" alt="MetaModel logo" />
++</div>
++
++### Mailing lists
++
++ * Developer list:  dev@metamodel.apache.org
++ * User list:  user@metamodel.apache.org
++ * Commits list:    commits@metamodel.apache.org
++
++### Website
++
++http://metamodel.apache.org/
++
++### Documentation
++
++Please check out our [wiki for user documentation](https://cwiki.apache.org/confluence/display/METAMODEL).
++
++### Building the code
++
++MetaModel uses maven as it's build tool. Code can be built with:
++
++```
++mvn clean install
++```
++
++### Running the integration tests
++
++ 1. Copy the file 'example-metamodel-integrationtest-configuration.properties' to your user home.
++ 2. Remove the 'example-' prefix from its filename
++ 3. Modify the file to enable properties of the integration tests that you're interested in.
++ 4. Re-run "mvn clean install".
++
++### Contributing
++
  Please see [CONTRIBUTE.md](CONTRIBUTE.md)