Posted to commits@superset.apache.org by yj...@apache.org on 2020/11/30 05:30:34 UTC

[incubator-superset] branch table-chart-new-api updated (a1e87a8 -> 0fedbfc)

This is an automated email from the ASF dual-hosted git repository.

yjc pushed a change to branch table-chart-new-api
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git.


    omit a1e87a8  refactor: migrate table chart to new API
     add f121107  ci(docker): build image on release (#11828)
     add 43c69d5  fix: Adding and removing annotations (#11811)
     add ec01691  fix: wrong key used  FEATURE_FLAGS[LISTVIEW_DEFAULT_CARD_VIEW]  (#11824)
     add 671ff6c  fix: improve Avatar text size consistency (#11837)
     add e974b0b  Change subtabs order for Data tab (#11834)
     add 19e5ce0  chore: Move styles of Dashboard and FilterScopeSelector modals to Emotion (#11779)
     new 0fedbfc  refactor: migrate table chart to new API

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (a1e87a8)
            \
             N -- N -- N   refs/heads/table-chart-new-api (0fedbfc)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/docker.yml                       |  9 ++-
 .github/workflows/docker_build_push.sh             |  3 +
 .../src/components/FacePile/index.tsx              | 23 +++---
 superset-frontend/src/components/ModalTrigger.jsx  |  1 +
 .../dashboard/components/DeleteComponentModal.jsx  | 80 --------------------
 .../dashboard/components/RefreshIntervalModal.jsx  | 10 ++-
 .../components/filterscope/FilterScopeModal.jsx    | 12 ++-
 .../components/filterscope/FilterScopeSelector.jsx | 24 +++++-
 .../src/dashboard/stylesheets/dashboard.less       | 36 ---------
 .../stylesheets/filter-scope-selector.less         |  7 --
 .../components/controls/AnnotationLayer.jsx        | 87 ++++++++++++----------
 .../components/controls/AnnotationLayerControl.jsx | 53 +++++++------
 superset-frontend/src/views/CRUD/data/common.ts    | 12 +--
 superset/config.py                                 |  4 +-
 14 files changed, 149 insertions(+), 212 deletions(-)
 delete mode 100644 superset-frontend/src/dashboard/components/DeleteComponentModal.jsx


[incubator-superset] 01/01: refactor: migrate table chart to new API

Posted by yj...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

yjc pushed a commit to branch table-chart-new-api
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 0fedbfc6640d218c5349596b8b478116850c7bb4
Author: Jesse Yang <je...@airbnb.com>
AuthorDate: Thu Jul 9 09:26:58 2020 -0700

    refactor: migrate table chart to new API
---
 .../integration/dashboard/url_params.test.js       |  3 +-
 .../spec/javascripts/explore/controlUtils_spec.jsx | 12 -------
 .../spec/javascripts/explore/store_spec.jsx        |  2 --
 superset-frontend/src/explore/controlUtils.js      |  5 +--
 superset-frontend/src/explore/store.js             |  3 --
 superset/common/query_context.py                   |  2 ++
 superset/connectors/sqla/models.py                 |  6 ++--
 superset/db_engine_specs/base.py                   | 32 +++++++----------
 superset/result_set.py                             |  4 +--
 superset/utils/core.py                             | 20 +++++++++--
 tests/sqla_models_tests.py                         | 42 +++++++++++-----------
 11 files changed, 60 insertions(+), 71 deletions(-)

diff --git a/superset-frontend/cypress-base/cypress/integration/dashboard/url_params.test.js b/superset-frontend/cypress-base/cypress/integration/dashboard/url_params.test.js
index ae98be1..4300a26 100644
--- a/superset-frontend/cypress-base/cypress/integration/dashboard/url_params.test.js
+++ b/superset-frontend/cypress-base/cypress/integration/dashboard/url_params.test.js
@@ -36,7 +36,7 @@ describe('Dashboard form data', () => {
     });
   });
 
-  it('should apply url params and queryFields to slice requests', () => {
+  it('should apply url params to slice requests', () => {
     const aliases = getChartAliases(dashboard.slices);
     // wait and verify one-by-one
     cy.wait(aliases).then(requests => {
@@ -48,7 +48,6 @@ describe('Dashboard form data', () => {
           if (isLegacyResponse(responseBody)) {
             const requestFormData = xhr.request.body;
             const requestParams = JSON.parse(requestFormData.get('form_data'));
-            expect(requestParams).to.have.property('queryFields');
             expect(requestParams.url_params).deep.eq(urlParams);
           } else {
             xhr.request.body.queries.forEach(query => {
diff --git a/superset-frontend/spec/javascripts/explore/controlUtils_spec.jsx b/superset-frontend/spec/javascripts/explore/controlUtils_spec.jsx
index e52f70d..137e670 100644
--- a/superset-frontend/spec/javascripts/explore/controlUtils_spec.jsx
+++ b/superset-frontend/spec/javascripts/explore/controlUtils_spec.jsx
@@ -198,18 +198,6 @@ describe('controlUtils', () => {
     });
   });
 
-  describe('queryFields', () => {
-    it('in formData', () => {
-      const controlsState = getAllControlsState('table', 'table', {}, {});
-      const formData = getFormDataFromControls(controlsState);
-      expect(formData.queryFields).toEqual({
-        all_columns: 'columns',
-        metric: 'metrics',
-        metrics: 'metrics',
-      });
-    });
-  });
-
   describe('findControlItem', () => {
     it('find control as a string', () => {
       const controlItem = findControlItem(
diff --git a/superset-frontend/spec/javascripts/explore/store_spec.jsx b/superset-frontend/spec/javascripts/explore/store_spec.jsx
index 56293f0..c6fb98b 100644
--- a/superset-frontend/spec/javascripts/explore/store_spec.jsx
+++ b/superset-frontend/spec/javascripts/explore/store_spec.jsx
@@ -72,12 +72,10 @@ describe('store', () => {
       const inputFormData = {
         datasource: '11_table',
         viz_type: 'table',
-        queryFields: staleQueryFields,
         this_should_no_be_here: true,
       };
       const outputFormData = applyDefaultFormData(inputFormData);
       expect(outputFormData.this_should_no_be_here).toBe(undefined);
-      expect(outputFormData.queryFields).not.toBe(staleQueryFields);
     });
   });
 });
diff --git a/superset-frontend/src/explore/controlUtils.js b/superset-frontend/src/explore/controlUtils.js
index c05e4de..7b83ea6 100644
--- a/superset-frontend/src/explore/controlUtils.js
+++ b/superset-frontend/src/explore/controlUtils.js
@@ -22,13 +22,10 @@ import { expandControlConfig } from '@superset-ui/chart-controls';
 import * as SECTIONS from './controlPanels/sections';
 
 export function getFormDataFromControls(controlsState) {
-  const formData = { queryFields: {} };
+  const formData = {};
   Object.keys(controlsState).forEach(controlName => {
     const control = controlsState[controlName];
     formData[controlName] = control.value;
-    if (control.hasOwnProperty('queryField')) {
-      formData.queryFields[controlName] = control.queryField;
-    }
   });
   return formData;
 }
diff --git a/superset-frontend/src/explore/store.js b/superset-frontend/src/explore/store.js
index 083ed58..bfef052 100644
--- a/superset-frontend/src/explore/store.js
+++ b/superset-frontend/src/explore/store.js
@@ -76,9 +76,6 @@ export function applyDefaultFormData(inputFormData) {
       formData[controlName] = inputFormData[controlName];
     }
   });
-
-  // always use dynamically generated queryFields
-  formData.queryFields = controlFormData.queryFields;
   return formData;
 }
 
diff --git a/superset/common/query_context.py b/superset/common/query_context.py
index 19e6668..e203e75 100644
--- a/superset/common/query_context.py
+++ b/superset/common/query_context.py
@@ -162,6 +162,8 @@ class QueryContext:
         df = payload["df"]
         status = payload["status"]
         if status != utils.QueryStatus.FAILED:
+            payload["columns"] = list(df.columns)
+            payload["column_types"] = utils.serialize_dtypes(df.dtypes)
             payload["data"] = self.get_data(df)
         del payload["df"]
 
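The two new payload fields can be sketched outside of Superset with a plain DataFrame. The one-line `serialize_dtypes` here is a simplified stand-in for the helper this commit adds in `superset/utils/core.py`; the integer codes follow its dtype-to-type mapping:

```python
import pandas as pd

# Simplified stand-in for utils.serialize_dtypes (the real mapping lives in
# the superset/utils/core.py hunk of this commit)
def serialize_dtypes(dtypes):
    mapping = {"object": 1, "int64": 0, "datetime64[ns]": 2, "bool": 3}
    return [mapping.get(str(dtype), 1) for dtype in dtypes]

df = pd.DataFrame({"name": ["a", "b"], "num": [1, 2]})
payload = {"df": df, "status": "success"}

# Mirrors the new branch in the QueryContext payload handling: on success,
# attach the column names and their serialized types before dropping the df
if payload["status"] != "failed":
    payload["columns"] = list(df.columns)
    payload["column_types"] = serialize_dtypes(df.dtypes)
    payload["data"] = df.to_dict(orient="records")
del payload["df"]

print(payload["columns"])       # ['name', 'num']
print(payload["column_types"])  # [1, 0] -- STRING, NUMERIC
```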
diff --git a/superset/connectors/sqla/models.py b/superset/connectors/sqla/models.py
index e19c1b9..f4f099a 100644
--- a/superset/connectors/sqla/models.py
+++ b/superset/connectors/sqla/models.py
@@ -189,7 +189,7 @@ class TableColumn(Model, BaseColumn):
         """
         db_engine_spec = self.table.database.db_engine_spec
         return db_engine_spec.is_db_column_type_match(
-            self.type, utils.DbColumnType.NUMERIC
+            self.type, utils.GenericColumnType.NUMERIC
         )
 
     @property
@@ -199,7 +199,7 @@ class TableColumn(Model, BaseColumn):
         """
         db_engine_spec = self.table.database.db_engine_spec
         return db_engine_spec.is_db_column_type_match(
-            self.type, utils.DbColumnType.STRING
+            self.type, utils.GenericColumnType.STRING
         )
 
     @property
@@ -214,7 +214,7 @@ class TableColumn(Model, BaseColumn):
             return self.is_dttm
         db_engine_spec = self.table.database.db_engine_spec
         return db_engine_spec.is_db_column_type_match(
-            self.type, utils.DbColumnType.TEMPORAL
+            self.type, utils.GenericColumnType.TEMPORAL
         )
 
     def get_sqla_col(self, label: Optional[str] = None) -> Column:
diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py
index 0aee290..e86a676 100644
--- a/superset/db_engine_specs/base.py
+++ b/superset/db_engine_specs/base.py
@@ -157,34 +157,28 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
     max_column_name_length = 0
     try_remove_schema_from_table_name = True  # pylint: disable=invalid-name
 
-    # default matching patterns for identifying column types
-    db_column_types: Dict[utils.DbColumnType, Tuple[Pattern[Any], ...]] = {
-        utils.DbColumnType.NUMERIC: (
+    # default matching patterns to convert database specific column types to
+    # more generic types
+    db_column_types: Dict[utils.GenericColumnType, Tuple[Pattern[Any], ...]] = {
+        utils.GenericColumnType.NUMERIC: (
             re.compile(r"BIT", re.IGNORECASE),
-            re.compile(r".*DOUBLE.*", re.IGNORECASE),
-            re.compile(r".*FLOAT.*", re.IGNORECASE),
-            re.compile(r".*INT.*", re.IGNORECASE),
-            re.compile(r".*NUMBER.*", re.IGNORECASE),
+            re.compile(
+                r".*(DOUBLE|FLOAT|INT|NUMBER|REAL|NUMERIC|DECIMAL|MONEY).*",
+                re.IGNORECASE,
+            ),
             re.compile(r".*LONG$", re.IGNORECASE),
-            re.compile(r".*REAL.*", re.IGNORECASE),
-            re.compile(r".*NUMERIC.*", re.IGNORECASE),
-            re.compile(r".*DECIMAL.*", re.IGNORECASE),
-            re.compile(r".*MONEY.*", re.IGNORECASE),
         ),
-        utils.DbColumnType.STRING: (
-            re.compile(r".*CHAR.*", re.IGNORECASE),
-            re.compile(r".*STRING.*", re.IGNORECASE),
-            re.compile(r".*TEXT.*", re.IGNORECASE),
+        utils.GenericColumnType.STRING: (
+            re.compile(r".*(CHAR|STRING|TEXT).*", re.IGNORECASE),
         ),
-        utils.DbColumnType.TEMPORAL: (
-            re.compile(r".*DATE.*", re.IGNORECASE),
-            re.compile(r".*TIME.*", re.IGNORECASE),
+        utils.GenericColumnType.TEMPORAL: (
+            re.compile(r".*(DATE|TIME).*", re.IGNORECASE),
         ),
     }
 
     @classmethod
     def is_db_column_type_match(
-        cls, db_column_type: Optional[str], target_column_type: utils.DbColumnType
+        cls, db_column_type: Optional[str], target_column_type: utils.GenericColumnType
     ) -> bool:
         """
         Check if a column type satisfies a pattern in a collection of regexes found in
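The consolidated alternations classify the same type strings as the old per-pattern list. A minimal sketch of the matching, where `NUMERIC_PATTERNS` and `is_numeric` are hypothetical stand-ins for `db_column_types` and `is_db_column_type_match`, assuming simple `re.match` semantics:

```python
import re

# The consolidated NUMERIC patterns from the hunk above
NUMERIC_PATTERNS = (
    re.compile(r"BIT", re.IGNORECASE),
    re.compile(
        r".*(DOUBLE|FLOAT|INT|NUMBER|REAL|NUMERIC|DECIMAL|MONEY).*",
        re.IGNORECASE,
    ),
    re.compile(r".*LONG$", re.IGNORECASE),
)

def is_numeric(db_column_type):
    """True if any pattern matches the raw database type string."""
    return db_column_type is not None and any(
        pattern.match(db_column_type) for pattern in NUMERIC_PATTERNS
    )

print(is_numeric("BIGINT"))         # True  (INT in the combined alternation)
print(is_numeric("DECIMAL(10,2)"))  # True
print(is_numeric("TEXT"))           # False (handled by the STRING patterns)
```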
diff --git a/superset/result_set.py b/superset/result_set.py
index d0a325b..367b04f 100644
--- a/superset/result_set.py
+++ b/superset/result_set.py
@@ -123,7 +123,7 @@ class SupersetResultSet:
                 if pa.types.is_nested(pa_data[i].type):
                     # TODO: revisit nested column serialization once nested types
                     #  are added as a natively supported column type in Superset
-                    #  (superset.utils.core.DbColumnType).
+                    #  (superset.utils.core.GenericColumnType).
                     stringified_arr = stringify_values(array[column])
                     pa_data[i] = pa.array(stringified_arr.tolist())
 
@@ -182,7 +182,7 @@ class SupersetResultSet:
 
     def is_temporal(self, db_type_str: Optional[str]) -> bool:
         return self.db_engine_spec.is_db_column_type_match(
-            db_type_str, utils.DbColumnType.TEMPORAL
+            db_type_str, utils.GenericColumnType.TEMPORAL
         )
 
     def data_type(self, col_name: str, pa_dtype: pa.DataType) -> Optional[str]:
diff --git a/superset/utils/core.py b/superset/utils/core.py
index b4e3ef5..b1ef721 100644
--- a/superset/utils/core.py
+++ b/superset/utils/core.py
@@ -37,7 +37,7 @@ from email.mime.image import MIMEImage
 from email.mime.multipart import MIMEMultipart
 from email.mime.text import MIMEText
 from email.utils import formatdate
-from enum import Enum
+from enum import Enum, IntEnum
 from time import struct_time
 from timeit import default_timer
 from types import TracebackType
@@ -1424,6 +1424,19 @@ def get_column_names_from_metrics(metrics: List[Metric]) -> List[str]:
             columns.append(column_name)
     return columns
 
+def serialize_dtypes(dtypes: List[np.dtype]) -> List[GenericColumnType]:
+    """Serialize pandas/numpy dtypes to generic column types for the frontend"""
+    mapping = {
+        'object': GenericColumnType.STRING,
+        'category': GenericColumnType.STRING,
+        'datetime64[ns]': GenericColumnType.TEMPORAL,
+        'int64': GenericColumnType.NUMERIC,
+    'int32': GenericColumnType.NUMERIC,
+        'float64': GenericColumnType.NUMERIC,
+        'float32': GenericColumnType.NUMERIC,
+        'bool': GenericColumnType.BOOLEAN,
+    }
+    return [mapping.get(str(x), GenericColumnType.STRING) for x in dtypes]
 
 def indexed(
     items: List[Any], key: Union[str, Callable[[Any], Any]]
@@ -1483,14 +1496,15 @@ class QuerySource(Enum):
     SQL_LAB = 2
 
 
-class DbColumnType(Enum):
+class GenericColumnType(IntEnum):
     """
-    Generic database column type
+    Generic database column type that fits both frontend and backend.
     """
 
     NUMERIC = 0
     STRING = 1
     TEMPORAL = 2
+    BOOLEAN = 3
 
 
 class QueryMode(str, LenientEnum):
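Because `GenericColumnType` is now an `IntEnum`, its members serialize to plain integers in JSON with no custom encoder, which is what lets the frontend consume the same values. A standalone sketch, with the enum and helper re-declared from the hunk above:

```python
import json
from enum import IntEnum

import numpy as np

class GenericColumnType(IntEnum):
    NUMERIC = 0
    STRING = 1
    TEMPORAL = 2
    BOOLEAN = 3

def serialize_dtypes(dtypes):
    """Map numpy/pandas dtypes to GenericColumnType; unknowns fall back to STRING."""
    mapping = {
        "object": GenericColumnType.STRING,
        "category": GenericColumnType.STRING,
        "datetime64[ns]": GenericColumnType.TEMPORAL,
        "int64": GenericColumnType.NUMERIC,
        "int32": GenericColumnType.NUMERIC,
        "float64": GenericColumnType.NUMERIC,
        "float32": GenericColumnType.NUMERIC,
        "bool": GenericColumnType.BOOLEAN,
    }
    return [mapping.get(str(dtype), GenericColumnType.STRING) for dtype in dtypes]

dtypes = [np.dtype("O"), np.dtype("int64"), np.dtype("datetime64[ns]"), np.dtype("bool")]
print(json.dumps(serialize_dtypes(dtypes)))  # [1, 0, 2, 3]
```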
diff --git a/tests/sqla_models_tests.py b/tests/sqla_models_tests.py
index f6bb4fc..4657dce 100644
--- a/tests/sqla_models_tests.py
+++ b/tests/sqla_models_tests.py
@@ -26,7 +26,7 @@ from superset.connectors.sqla.models import SqlaTable, TableColumn
 from superset.db_engine_specs.druid import DruidEngineSpec
 from superset.exceptions import QueryObjectValidationError
 from superset.models.core import Database
-from superset.utils.core import DbColumnType, get_example_database, FilterOperator
+from superset.utils.core import GenericColumnType, get_example_database, FilterOperator
 
 from .base_tests import SupersetTestCase
 
@@ -74,34 +74,34 @@ class TestDatabaseModel(SupersetTestCase):
         col.is_dttm = True
         assert col.is_temporal is True
 
-    def test_db_column_types(self):
-        test_cases: Dict[str, DbColumnType] = {
+    def test_generic_column_types(self):
+        test_cases: Dict[str, GenericColumnType] = {
             # string
-            "CHAR": DbColumnType.STRING,
-            "VARCHAR": DbColumnType.STRING,
-            "NVARCHAR": DbColumnType.STRING,
-            "STRING": DbColumnType.STRING,
-            "TEXT": DbColumnType.STRING,
-            "NTEXT": DbColumnType.STRING,
+            "CHAR": GenericColumnType.STRING,
+            "VARCHAR": GenericColumnType.STRING,
+            "NVARCHAR": GenericColumnType.STRING,
+            "STRING": GenericColumnType.STRING,
+            "TEXT": GenericColumnType.STRING,
+            "NTEXT": GenericColumnType.STRING,
             # numeric
-            "INT": DbColumnType.NUMERIC,
-            "BIGINT": DbColumnType.NUMERIC,
-            "FLOAT": DbColumnType.NUMERIC,
-            "DECIMAL": DbColumnType.NUMERIC,
-            "MONEY": DbColumnType.NUMERIC,
+            "INT": GenericColumnType.NUMERIC,
+            "BIGINT": GenericColumnType.NUMERIC,
+            "FLOAT": GenericColumnType.NUMERIC,
+            "DECIMAL": GenericColumnType.NUMERIC,
+            "MONEY": GenericColumnType.NUMERIC,
             # temporal
-            "DATE": DbColumnType.TEMPORAL,
-            "DATETIME": DbColumnType.TEMPORAL,
-            "TIME": DbColumnType.TEMPORAL,
-            "TIMESTAMP": DbColumnType.TEMPORAL,
+            "DATE": GenericColumnType.TEMPORAL,
+            "DATETIME": GenericColumnType.TEMPORAL,
+            "TIME": GenericColumnType.TEMPORAL,
+            "TIMESTAMP": GenericColumnType.TEMPORAL,
         }
 
         tbl = SqlaTable(table_name="col_type_test_tbl", database=get_example_database())
         for str_type, db_col_type in test_cases.items():
             col = TableColumn(column_name="foo", type=str_type, table=tbl)
-            self.assertEqual(col.is_temporal, db_col_type == DbColumnType.TEMPORAL)
-            self.assertEqual(col.is_numeric, db_col_type == DbColumnType.NUMERIC)
-            self.assertEqual(col.is_string, db_col_type == DbColumnType.STRING)
+            self.assertEqual(col.is_temporal, db_col_type == GenericColumnType.TEMPORAL)
+            self.assertEqual(col.is_numeric, db_col_type == GenericColumnType.NUMERIC)
+            self.assertEqual(col.is_string, db_col_type == GenericColumnType.STRING)
 
     @patch("superset.jinja_context.g")
     def test_extra_cache_keys(self, flask_g):