Posted to commits@superset.apache.org by vi...@apache.org on 2020/01/04 19:52:01 UTC

[incubator-superset] branch 0.35 updated (62c2e15 -> bb5f5fc)

This is an automated email from the ASF dual-hosted git repository.

villebro pushed a change to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git.


    from 62c2e15  Bump version to 0.35.1
     new 5493709  docs: add a note to RELEASING about Slack Channel (#8419)
     new ee35bd5  spelling fix (#8442)
     new ba8bb6b  explain the need to enable async queries (#8444)
     new e941bda  introduce a space in command line option (#8438)
     new ddeb436  Split up tests/db_engine_test.py (#8449)
     new 3e6a206  [setup] Fix, download_url (#8439)
     new 811b0e9  [Datasource Editor] A few small UI changes in modal to prevent accidental edits (#8471)
     new 87d481b  Update UPDATING.md (#8465)
     new 27612a1  [datasource editor] Only one click target for edit action (#8495)
     new 240a1c4  Default page: remove title attribute to fall back on appbuilder.app_name (#8427)
     new 8e38ee2  [Dashboard Import] Log error on dashboard import failure (#8550)
     new 7db5283  Math.max(...array) considered harmful (#8575)
     new b9941df  Fix error when templateParams is undefined (#8581)
     new f92bc2c  Bump pyarrow to 0.15.1 due to CVE (#8583)
     new 611fd02  Fix for BigQuery connection checks and CSV uploads (#8511)
     new 946f5c1  [SECURITY] bump packages with security vulnerabilities (#8573)
     new 8e47b11  fix: default missing values to zero on area chart (#8678)
     new f416f91  [fix] Druid IS NULL/IS NOT NULL filters (#8714)
     new b7cc619  Migrate filter_immune_slice_fields (#8718)
     new ff092e2  fix: don't show filter popover on explore view load (#8729)
     new 6f102bb  Bump viz plugins for bug bash (#8759)
     new bb5f5fc  [database] [log] Fix, Limit the amount of info on response (#8918)

The 22 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 RELEASING/README.md                                |   5 +
 UPDATING.md                                        |  10 +-
 docs/index.rst                                     |   2 +-
 docs/installation.rst                              |   5 +-
 requirements.txt                                   |   2 +-
 setup.py                                           |   6 +-
 .../cypress/integration/explore/control.test.js    |   3 +-
 superset/assets/package-lock.json                  | 103 +--
 superset/assets/package.json                       |   5 +-
 .../datasource/DatasourceEditor_spec.jsx           |   1 +
 superset/assets/src/SqlLab/actions/sqlLab.js       |   2 +-
 .../components/FilterableTable/FilterableTable.jsx |   7 +-
 .../assets/src/datasource/DatasourceEditor.jsx     |  52 +-
 superset/assets/src/datasource/main.css            |   9 +
 .../src/explore/components/AdhocFilterOption.jsx   |   3 +-
 .../src/explore/components/AdhocMetricOption.jsx   |   4 +-
 .../components/controls/DatasourceControl.css}     |  30 +-
 .../components/controls/DatasourceControl.jsx      |  71 +-
 superset/connectors/druid/models.py                |   4 +-
 superset/db_engine_specs/base.py                   |  11 +-
 superset/db_engine_specs/bigquery.py               |   8 +
 superset/db_engine_specs/hive.py                   |   8 +-
 superset/models/core.py                            |  12 +
 .../templates/superset/models/database/macros.html |   1 +
 superset/views/core.py                             |   5 +-
 superset/views/database/api.py                     |   2 +
 superset/views/database/views.py                   |   8 +-
 superset/views/log/api.py                          |   1 +
 superset/viz.py                                    |   7 +-
 tests/db_engine_specs/base_engine_spec_tests.py    | 204 ++++++
 .../db_engine_specs/base_tests.py                  |  17 +-
 tests/db_engine_specs/bigquery_tests.py            |  39 +
 tests/db_engine_specs/hive_tests.py                | 152 ++++
 tests/db_engine_specs/mssql_tests.py               |  71 ++
 .../db_engine_specs/mysql_tests.py                 |  32 +-
 tests/db_engine_specs/oracle_tests.py              |  36 +
 .../db_engine_specs/pinot_tests.py                 |  24 +-
 tests/db_engine_specs/postgres_tests.py            |  72 ++
 tests/db_engine_specs/presto_tests.py              | 343 +++++++++
 tests/db_engine_specs_test.py                      | 810 ---------------------
 tests/druid_func_tests.py                          |  26 +
 41 files changed, 1199 insertions(+), 1014 deletions(-)
 copy superset/assets/{stylesheets/explore.css => src/explore/components/controls/DatasourceControl.css} (76%)
 create mode 100644 tests/db_engine_specs/base_engine_spec_tests.py
 copy superset/db_engine_specs/gsheets.py => tests/db_engine_specs/base_tests.py (60%)
 create mode 100644 tests/db_engine_specs/bigquery_tests.py
 create mode 100644 tests/db_engine_specs/hive_tests.py
 create mode 100644 tests/db_engine_specs/mssql_tests.py
 copy superset/migrations/versions/a65458420354_add_result_backend_time_logging.py => tests/db_engine_specs/mysql_tests.py (59%)
 create mode 100644 tests/db_engine_specs/oracle_tests.py
 copy superset/bin/superset => tests/db_engine_specs/pinot_tests.py (56%)
 mode change 100755 => 100644
 create mode 100644 tests/db_engine_specs/postgres_tests.py
 create mode 100644 tests/db_engine_specs/presto_tests.py
 delete mode 100644 tests/db_engine_specs_test.py


[incubator-superset] 14/22: Bump pyarrow to 0.15.1 due to CVE (#8583)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit f92bc2cb8c83293badf60a6eeffbd4596060c4ae
Author: Rob DiCiuccio <ro...@users.noreply.github.com>
AuthorDate: Fri Nov 15 17:08:23 2019 +0000

    Bump pyarrow to 0.15.1 due to CVE (#8583)
---
 requirements.txt | 2 +-
 setup.py         | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/requirements.txt b/requirements.txt
index c9d0d4c..95cf101 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -58,7 +58,7 @@ pathlib2==2.3.4
 polyline==1.4.0
 prison==0.1.2             # via flask-appbuilder
 py==1.8.0                 # via retry
-pyarrow==0.14.1
+pyarrow==0.15.1
 pycparser==2.19           # via cffi
 pyjwt==1.7.1              # via flask-appbuilder, flask-jwt-extended
 pyrsistent==0.15.4        # via jsonschema
diff --git a/setup.py b/setup.py
index a07cd7b..e854876 100644
--- a/setup.py
+++ b/setup.py
@@ -97,7 +97,7 @@ setup(
         "python-dateutil",
         "python-dotenv",
         "python-geohash",
-        "pyarrow>=0.14.1, <0.15.0",
+        "pyarrow>=0.15.1, <0.16.0",
         "pyyaml>=5.1",
         "retry>=0.9.2",
         "selenium>=3.141.0",
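The commit tightens both the exact pin in requirements.txt and the range in setup.py. As a quick sanity check (not part of the commit), the new setup.py range ">=0.15.1, <0.16.0" can be expressed as a plain tuple comparison to confirm which versions it admits:

```python
# Sketch only: the setup.py constraint ">=0.15.1, <0.16.0" as a tuple
# comparison, to show which pyarrow versions the bumped range accepts.
def in_range(version: str) -> bool:
    v = tuple(int(part) for part in version.split("."))
    return (0, 15, 1) <= v < (0, 16, 0)

print(in_range("0.15.1"))  # True: the release addressing the CVE
print(in_range("0.14.1"))  # False: the previously pinned, vulnerable line
print(in_range("0.16.0"))  # False: the next minor is excluded
```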


[incubator-superset] 11/22: [Dashboard Import] Log error on dashboard import failure (#8550)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 8e38ee28c6b63e922c88bf80a687521bad164ebe
Author: Erik Ritter <er...@airbnb.com>
AuthorDate: Wed Nov 13 09:41:48 2019 -0800

    [Dashboard Import] Log error on dashboard import failure (#8550)
---
 superset/views/core.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/superset/views/core.py b/superset/views/core.py
index afe5609..8ea8c0b 100755
--- a/superset/views/core.py
+++ b/superset/views/core.py
@@ -1111,7 +1111,8 @@ class Superset(BaseSupersetView):
                     ),
                     "danger",
                 )
-            except Exception:
+            except Exception as e:
+                logging.exception(e)
                 flash(
                     _(
                         "An unknown error occurred. "
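The patch keeps the generic "unknown error" flash but first records the exception with `logging.exception`, which logs the active exception together with its traceback. A minimal standalone sketch (the failing import function is a stand-in, not Superset code):

```python
import io
import logging

# Route log records to a string buffer so the effect is visible.
buf = io.StringIO()
logging.basicConfig(stream=buf, level=logging.ERROR, force=True)

def import_dashboard():
    # Hypothetical stand-in for the dashboard import that fails.
    raise ValueError("malformed dashboard export")

try:
    import_dashboard()
except Exception as e:
    # As in the patch: log the error (with full traceback) before the
    # user-facing "unknown error" message is shown.
    logging.exception(e)

output = buf.getvalue()
print("ValueError" in output)  # True: the exception type is logged
print("Traceback" in output)   # True: the stack trace is included
```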


[incubator-superset] 15/22: Fix for BigQuery connection checks and CSV uploads (#8511)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 611fd02a5f372da39a6729c9c74ec2f91dbf5b45
Author: Will Barrett <wi...@preset.io>
AuthorDate: Tue Nov 19 14:50:47 2019 -0800

    Fix for BigQuery connection checks and CSV uploads (#8511)
    
    * Fix for BigQuery connection checks and CSV uploads
    
    * Don't assume encrypted_extra will be populated
    
    * Fix undefined method error
    
    * Refactor to avoid circular import strangeness
---
 superset/db_engine_specs/base.py                        | 11 +++++------
 superset/db_engine_specs/bigquery.py                    |  8 ++++++++
 superset/db_engine_specs/hive.py                        |  8 ++++----
 superset/templates/superset/models/database/macros.html |  1 +
 superset/views/core.py                                  |  1 +
 superset/views/database/views.py                        |  8 ++++++--
 6 files changed, 25 insertions(+), 12 deletions(-)

diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py
index 7060887..91fc360 100644
--- a/superset/db_engine_specs/base.py
+++ b/superset/db_engine_specs/base.py
@@ -27,7 +27,6 @@ import sqlparse
 from flask import g
 from flask_babel import lazy_gettext as _
 from sqlalchemy import column, DateTime, select
-from sqlalchemy.engine import create_engine
 from sqlalchemy.engine.base import Engine
 from sqlalchemy.engine.interfaces import Compiled, Dialect
 from sqlalchemy.engine.reflection import Inspector
@@ -52,9 +51,6 @@ class TimeGrain(NamedTuple):  # pylint: disable=too-few-public-methods
     duration: Optional[str]
 
 
-config = app.config
-
-
 QueryStatus = utils.QueryStatus
 config = app.config
 
@@ -388,12 +384,13 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         df.to_sql(**kwargs)
 
     @classmethod
-    def create_table_from_csv(cls, form) -> None:
+    def create_table_from_csv(cls, form, database) -> None:
         """
         Create table from contents of a csv. Note: this method does not create
         metadata for the table.
 
         :param form: Parameters defining how to process data
+        :param database: Database model object for the target database
         """
 
         def _allowed_file(filename: str) -> bool:
@@ -422,10 +419,12 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         }
         df = cls.csv_to_df(**csv_to_df_kwargs)
 
+        engine = cls.get_engine(database)
+
         df_to_sql_kwargs = {
             "df": df,
             "name": form.name.data,
-            "con": create_engine(form.con.data.sqlalchemy_uri_decrypted, echo=False),
+            "con": engine,
             "schema": form.schema.data,
             "if_exists": form.if_exists.data,
             "index": form.index.data,
diff --git a/superset/db_engine_specs/bigquery.py b/superset/db_engine_specs/bigquery.py
index db9f4e5..4571b9d 100644
--- a/superset/db_engine_specs/bigquery.py
+++ b/superset/db_engine_specs/bigquery.py
@@ -178,6 +178,7 @@ class BigQueryEngineSpec(BaseEngineSpec):
         """
         try:
             import pandas_gbq
+            from google.oauth2 import service_account
         except ImportError:
             raise Exception(
                 "Could not import the library `pandas_gbq`, which is "
@@ -187,10 +188,17 @@ class BigQueryEngineSpec(BaseEngineSpec):
 
         if not ("name" in kwargs and "schema" in kwargs):
             raise Exception("name and schema need to be defined in kwargs")
+
         gbq_kwargs = {}
         gbq_kwargs["project_id"] = kwargs["con"].engine.url.host
         gbq_kwargs["destination_table"] = f"{kwargs.pop('schema')}.{kwargs.pop('name')}"
 
+        # add credentials if they are set on the SQLAlchemy Dialect:
+        creds = kwargs["con"].dialect.credentials_info
+        if creds:
+            credentials = service_account.Credentials.from_service_account_info(creds)
+            gbq_kwargs["credentials"] = credentials
+
         # Only pass through supported kwargs
         supported_kwarg_keys = {"if_exists"}
         for key in supported_kwarg_keys:
diff --git a/superset/db_engine_specs/hive.py b/superset/db_engine_specs/hive.py
index 1c5319e..aea9663 100644
--- a/superset/db_engine_specs/hive.py
+++ b/superset/db_engine_specs/hive.py
@@ -23,7 +23,6 @@ from typing import Any, Dict, List, Optional, Tuple
 from urllib import parse
 
 from sqlalchemy import Column
-from sqlalchemy.engine import create_engine
 from sqlalchemy.engine.base import Engine
 from sqlalchemy.engine.reflection import Inspector
 from sqlalchemy.engine.url import make_url
@@ -98,7 +97,9 @@ class HiveEngineSpec(PrestoEngineSpec):
             return []
 
     @classmethod
-    def create_table_from_csv(cls, form) -> None:  # pylint: disable=too-many-locals
+    def create_table_from_csv(  # pylint: disable=too-many-locals
+        cls, form, database
+    ) -> None:
         """Uploads a csv file and creates a superset datasource in Hive."""
 
         def convert_to_hive_type(col_type):
@@ -174,8 +175,7 @@ class HiveEngineSpec(PrestoEngineSpec):
             ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS
             TEXTFILE LOCATION '{location}'
             tblproperties ('skip.header.line.count'='1')"""
-        logging.info(form.con.data)
-        engine = create_engine(form.con.data.sqlalchemy_uri_decrypted)
+        engine = cls.get_engine(database)
         engine.execute(sql)
 
     @classmethod
diff --git a/superset/templates/superset/models/database/macros.html b/superset/templates/superset/models/database/macros.html
index 2087a37..bc4427b 100644
--- a/superset/templates/superset/models/database/macros.html
+++ b/superset/templates/superset/models/database/macros.html
@@ -40,6 +40,7 @@
           name: $('#database_name').val(),
           impersonate_user: $('#impersonate_user').is(':checked'),
           extras: JSON.parse($("#extra").val()),
+          encrypted_extra: JSON.parse($("#encrypted_extra").val()),
         })
       } catch(parse_error){
         alert("Malformed JSON in the extras field: " + parse_error);
diff --git a/superset/views/core.py b/superset/views/core.py
index 8ea8c0b..68b2c0e 100755
--- a/superset/views/core.py
+++ b/superset/views/core.py
@@ -1696,6 +1696,7 @@ class Superset(BaseSupersetView):
                 # extras is sent as json, but required to be a string in the Database model
                 extra=json.dumps(request.json.get("extras", {})),
                 impersonate_user=request.json.get("impersonate_user"),
+                encrypted_extra=json.dumps(request.json.get("encrypted_extra", {})),
             )
             database.set_sqlalchemy_uri(uri)
 
diff --git a/superset/views/database/views.py b/superset/views/database/views.py
index bc32c19..4fa7063 100644
--- a/superset/views/database/views.py
+++ b/superset/views/database/views.py
@@ -120,8 +120,12 @@ class CsvToDatabaseView(SimpleFormView):
             utils.ensure_path_exists(config["UPLOAD_FOLDER"])
             csv_file.save(path)
             table_name = form.name.data
-            database = form.data.get("con")
-            database.db_engine_spec.create_table_from_csv(form)
+
+            con = form.data.get("con")
+            database = (
+                db.session.query(models.Database).filter_by(id=con.data.get("id")).one()
+            )
+            database.db_engine_spec.create_table_from_csv(form, database)
 
             table = (
                 db.session.query(SqlaTable)
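The core refactor replaces an ad-hoc `create_engine(form.con.data.sqlalchemy_uri_decrypted, ...)` with `cls.get_engine(database)`, so the CSV upload runs against an engine built from the target Database model (which carries credentials such as BigQuery's `encrypted_extra`). A rough stand-in for the flow, using sqlite3 in place of a SQLAlchemy engine (all names here are illustrative, not Superset's):

```python
import csv
import io
import sqlite3

# Parse an uploaded CSV (in-memory here) into rows.
csv_file = io.StringIO("name,value\nalpha,1\nbeta,2\n")
rows = list(csv.DictReader(csv_file))

# This connection plays the role of cls.get_engine(database): one engine
# derived from the target database object, rather than a throwaway engine
# created from a form field.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE uploaded (name TEXT, value INTEGER)")
con.executemany(
    "INSERT INTO uploaded (name, value) VALUES (:name, :value)", rows
)

print(con.execute("SELECT COUNT(*) FROM uploaded").fetchone()[0])  # 2
```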


[incubator-superset] 19/22: Migrate filter_immune_slice_fields (#8718)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit b7cc619142b76a64f7213776304ac7f4117dcb7d
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Tue Dec 3 10:26:54 2019 -0800

    Migrate filter_immune_slice_fields (#8718)
---
 superset/models/core.py | 12 ++++++++++++
 1 file changed, 12 insertions(+)

diff --git a/superset/models/core.py b/superset/models/core.py
index b919c1e..1601b7d 100755
--- a/superset/models/core.py
+++ b/superset/models/core.py
@@ -571,6 +571,7 @@ class Dashboard(Model, AuditMixinNullable, ImportMixin):
         slices = copy(dashboard_to_import.slices)
         old_to_new_slc_id_dict = {}
         new_filter_immune_slices = []
+        new_filter_immune_slice_fields = {}
         new_timed_refresh_immune_slices = []
         new_expanded_slices = {}
         i_params_dict = dashboard_to_import.params_dict
@@ -597,6 +598,13 @@ class Dashboard(Model, AuditMixinNullable, ImportMixin):
             ):
                 new_filter_immune_slices.append(new_slc_id_str)
             if (
+                "filter_immune_slice_fields" in i_params_dict
+                and old_slc_id_str in i_params_dict["filter_immune_slice_fields"]
+            ):
+                new_filter_immune_slice_fields[new_slc_id_str] = i_params_dict[
+                    "filter_immune_slice_fields"
+                ][old_slc_id_str]
+            if (
                 "timed_refresh_immune_slices" in i_params_dict
                 and old_slc_id_str in i_params_dict["timed_refresh_immune_slices"]
             ):
@@ -632,6 +640,10 @@ class Dashboard(Model, AuditMixinNullable, ImportMixin):
             dashboard_to_import.alter_params(
                 filter_immune_slices=new_filter_immune_slices
             )
+        if new_filter_immune_slice_fields:
+            dashboard_to_import.alter_params(
+                filter_immune_slice_fields=new_filter_immune_slice_fields
+            )
         if new_timed_refresh_immune_slices:
             dashboard_to_import.alter_params(
                 timed_refresh_immune_slices=new_timed_refresh_immune_slices
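Imported slices receive fresh ids, so `filter_immune_slice_fields`, which is keyed by slice id, must have its keys rewritten the same way the other per-slice params already were. The remapping the commit adds boils down to (ids below are made up for illustration):

```python
# Illustrative id remapping: filter_immune_slice_fields maps slice id ->
# fields that ignore dashboard filters; imported slices get new ids.
old_to_new_slc_id = {"101": "7", "102": "8"}
i_params = {"filter_immune_slice_fields": {"101": ["region"], "999": ["year"]}}

new_fields = {}
for old_id, new_id in old_to_new_slc_id.items():
    if old_id in i_params.get("filter_immune_slice_fields", {}):
        new_fields[new_id] = i_params["filter_immune_slice_fields"][old_id]

# Entries whose slice id is not part of the import are dropped.
print(new_fields)  # {'7': ['region']}
```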


[incubator-superset] 04/22: introduce a space in command line option (#8438)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit e941bda7aa887daba0b1c09af8529062e281faad
Author: Christoph Lingg <ch...@lingg.eu>
AuthorDate: Fri Oct 25 01:00:33 2019 +0200

    introduce a space in command line option (#8438)
    
    see https://docs.celeryproject.org/en/latest/userguide/optimizing.html
---
 docs/installation.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/installation.rst b/docs/installation.rst
index b454e1d..9c714db 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -843,7 +843,7 @@ have the same configuration.
 
 * To start a Celery worker to leverage the configuration run: ::
 
-    celery worker --app=superset.tasks.celery_app:app --pool=prefork -Ofair -c 4
+    celery worker --app=superset.tasks.celery_app:app --pool=prefork -O fair -c 4
 
 * To start a job which schedules periodic background jobs, run ::
 


[incubator-superset] 17/22: fix: default missing values to zero on area chart (#8678)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 8e47b1120e279a996953bbbedbf7ee0531ca698f
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Mon Dec 2 18:53:21 2019 +0200

    fix: default missing values to zero on area chart (#8678)
    
    * Add fill_value to area chart pivot
    
    * Only fill for area chart
    
    * Linting
---
 superset/viz.py | 7 ++++++-
 1 file changed, 6 insertions(+), 1 deletion(-)

diff --git a/superset/viz.py b/superset/viz.py
index 0d11a9d..d1c4ddd 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -1085,6 +1085,7 @@ class NVD3TimeSeriesViz(NVD3Viz):
     verbose_name = _("Time Series - Line Chart")
     sort_series = False
     is_timeseries = True
+    pivot_fill_value: Optional[int] = None
 
     def to_series(self, df, classed="", title_suffix=""):
         cols = []
@@ -1157,7 +1158,10 @@ class NVD3TimeSeriesViz(NVD3Viz):
             )
         else:
             df = df.pivot_table(
-                index=DTTM_ALIAS, columns=fd.get("groupby"), values=self.metric_labels
+                index=DTTM_ALIAS,
+                columns=fd.get("groupby"),
+                values=self.metric_labels,
+                fill_value=self.pivot_fill_value,
             )
 
         rule = fd.get("resample_rule")
@@ -1443,6 +1447,7 @@ class NVD3TimeSeriesStackedViz(NVD3TimeSeriesViz):
     viz_type = "area"
     verbose_name = _("Time Series - Stacked")
     sort_series = True
+    pivot_fill_value = 0
 
 
 class DistributionPieViz(NVD3Viz):
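The fix works by threading a `pivot_fill_value` class attribute through the shared pivot: `None` (the pandas default, leaving gaps as NaN) for line charts, `0` for the stacked area chart so missing points stack as zero. The pandas behavior the patch relies on, in isolation (sample data invented for the demo):

```python
import pandas as pd

# Group "b" has no row for 2019-01-02; without fill_value the pivot yields
# NaN there, which renders as a gap. The area chart stacks series, so the
# commit fills missing cells with zero instead.
df = pd.DataFrame(
    {
        "__timestamp": ["2019-01-01", "2019-01-01", "2019-01-02"],
        "group": ["a", "b", "a"],
        "metric": [1, 2, 3],
    }
)

pivoted = df.pivot_table(
    index="__timestamp", columns="group", values="metric", fill_value=0
)
print(pivoted.loc["2019-01-02", "b"])  # 0 (filled), not NaN
```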


[incubator-superset] 22/22: [database] [log] Fix, Limit the amount of info on response (#8918)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit bb5f5fc335d270e13c96214cab4bf334cfdab8f5
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Fri Jan 3 16:35:12 2020 +0000

    [database] [log] Fix, Limit the amount of info on response (#8918)
---
 superset/views/database/api.py | 2 ++
 superset/views/log/api.py      | 1 +
 2 files changed, 3 insertions(+)

diff --git a/superset/views/database/api.py b/superset/views/database/api.py
index 54448b7..493de1d 100644
--- a/superset/views/database/api.py
+++ b/superset/views/database/api.py
@@ -52,6 +52,8 @@ class DatabaseRestApi(DatabaseMixin, ModelRestApi):
         "allows_cost_estimate",
         "backend",
     ]
+    show_columns = list_columns
+
     # Removes the local limit for the page size
     max_page_size = -1
     validators_columns = {"sqlalchemy_uri": sqlalchemy_uri_validator}
diff --git a/superset/views/log/api.py b/superset/views/log/api.py
index 0ebbd5d..8658e32 100644
--- a/superset/views/log/api.py
+++ b/superset/views/log/api.py
@@ -38,6 +38,7 @@ class LogRestApi(LogMixin, ModelRestApi):
     resource_name = "log"
     allow_browser_login = True
     list_columns = ("user.username", "action", "dttm")
+    show_columns = list_columns
 
 
 if (
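With `show_columns` unset, Flask-AppBuilder's detail (show) endpoint can serialize every model field; assigning `show_columns = list_columns` restricts the single-item response to the same whitelist as the list endpoint. The effect on a response payload reduces to a simple projection (the `referrer` field below is a hypothetical example of data that would otherwise leak):

```python
# The whitelist from LogRestApi in the patch.
list_columns = ("user.username", "action", "dttm")

# A hypothetical full log record as the unrestricted show endpoint
# might serialize it.
record = {
    "user.username": "admin",
    "action": "dashboard_view",
    "dttm": "2020-01-03T16:35:12",
    "referrer": "https://internal.example/path",  # extra info, now excluded
}

# show_columns = list_columns means the detail response is projected
# onto the same columns as the list response.
response = {col: record[col] for col in list_columns if col in record}
print(sorted(response))  # ['action', 'dttm', 'user.username']
```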


[incubator-superset] 18/22: [fix] Druid IS NULL/IS NOT NULL filters (#8714)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit f416f910c1092858fef7ffdd815c2712a33a3e58
Author: John Bodley <45...@users.noreply.github.com>
AuthorDate: Mon Dec 2 20:38:42 2019 -0800

    [fix] Druid IS NULL/IS NOT NULL filters (#8714)
---
 superset/connectors/druid/models.py |  4 ++--
 tests/druid_func_tests.py           | 26 ++++++++++++++++++++++++++
 2 files changed, 28 insertions(+), 2 deletions(-)

diff --git a/superset/connectors/druid/models.py b/superset/connectors/druid/models.py
index 8165351..218fd1c 100644
--- a/superset/connectors/druid/models.py
+++ b/superset/connectors/druid/models.py
@@ -1559,9 +1559,9 @@ class DruidDatasource(Model, BaseDatasource):
                     alphaNumeric=is_numeric_col,
                 )
             elif op == "IS NULL":
-                cond = Dimension(col) is None
+                cond = Filter(dimension=col, value="")
             elif op == "IS NOT NULL":
-                cond = Dimension(col) is not None
+                cond = ~Filter(dimension=col, value="")
 
             if filters:
                 filters = Filter(type="and", fields=[cond, filters])
diff --git a/tests/druid_func_tests.py b/tests/druid_func_tests.py
index 3afbfac..601cb99 100644
--- a/tests/druid_func_tests.py
+++ b/tests/druid_func_tests.py
@@ -224,6 +224,32 @@ class DruidFuncTestCase(unittest.TestCase):
     @unittest.skipUnless(
         SupersetTestCase.is_module_installed("pydruid"), "pydruid not installed"
     )
+    def test_get_filters_is_null_filter(self):
+        filtr = {"col": "A", "op": "IS NULL"}
+        col = DruidColumn(column_name="A")
+        column_dict = {"A": col}
+        res = DruidDatasource.get_filters([filtr], [], column_dict)
+        self.assertEqual("selector", res.filter["filter"]["type"])
+        self.assertEqual("", res.filter["filter"]["value"])
+
+    @unittest.skipUnless(
+        SupersetTestCase.is_module_installed("pydruid"), "pydruid not installed"
+    )
+    def test_get_filters_is_not_null_filter(self):
+        filtr = {"col": "A", "op": "IS NOT NULL"}
+        col = DruidColumn(column_name="A")
+        column_dict = {"A": col}
+        res = DruidDatasource.get_filters([filtr], [], column_dict)
+        self.assertEqual("not", res.filter["filter"]["type"])
+        self.assertIn("field", res.filter["filter"])
+        self.assertEqual(
+            "selector", res.filter["filter"]["field"].filter["filter"]["type"]
+        )
+        self.assertEqual("", res.filter["filter"]["field"].filter["filter"]["value"])
+
+    @unittest.skipUnless(
+        SupersetTestCase.is_module_installed("pydruid"), "pydruid not installed"
+    )
     def test_get_filters_constructs_regex_filter(self):
         filtr = {"col": "A", "op": "regex", "val": "[abc]"}
         col = DruidColumn(column_name="A")
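The bug here is a classic: `Dimension(col) is None` is an identity test against a freshly constructed object, so it always evaluates to `False`, and the IS NULL / IS NOT NULL conditions silently degraded to booleans instead of Druid filters. The fix builds a selector filter matching the empty string (Druid's legacy representation of null), negated for IS NOT NULL. A sketch without pydruid (the `Dimension` class below is a minimal stand-in):

```python
# Minimal stand-in for pydruid's Dimension, enough to show the bug.
class Dimension:
    def __init__(self, name):
        self.name = name

# The original code: an identity comparison against a live object
# is always False, so this never produced a usable filter.
cond = Dimension("A") is None
print(cond)  # False

# The fix, expressed as the filter dicts the patched code produces via
# pydruid's Filter: a selector on the empty string, negated for NOT NULL.
is_null = {"type": "selector", "dimension": "A", "value": ""}
is_not_null = {"type": "not", "field": is_null}
print(is_not_null["field"]["type"])  # selector
```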


[incubator-superset] 02/22: spelling fix (#8442)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit ee35bd587394725d169bf08e6b3ea6ccfbb64584
Author: Austin Pray <au...@austinpray.com>
AuthorDate: Thu Oct 24 11:32:51 2019 -0500

    spelling fix (#8442)
---
 docs/index.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/index.rst b/docs/index.rst
index fbbe57f..a6a487d 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -96,7 +96,7 @@ Features
 Databases
 ---------
 
-The following RDBMS are currently suppored:
+The following RDBMS are currently supported:
 
 - `Amazon Athena <https://aws.amazon.com/athena/>`_
 - `Amazon Redshift <https://aws.amazon.com/redshift/>`_


[incubator-superset] 21/22: Bump viz plugins for bug bash (#8759)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 6f102bb9423ffed132f99cdd355a78849f88d93b
Author: Erik Ritter <er...@airbnb.com>
AuthorDate: Wed Dec 4 17:27:21 2019 -0800

    Bump viz plugins for bug bash (#8759)
---
 superset/assets/package-lock.json | 77 +++++++++++++++++++++++----------------
 superset/assets/package.json      |  4 +-
 2 files changed, 48 insertions(+), 33 deletions(-)

diff --git a/superset/assets/package-lock.json b/superset/assets/package-lock.json
index 7f05b87..c1b1045 100644
--- a/superset/assets/package-lock.json
+++ b/superset/assets/package-lock.json
@@ -3800,9 +3800,9 @@
       }
     },
     "@superset-ui/legacy-plugin-chart-paired-t-test": {
-      "version": "0.11.0",
-      "resolved": "https://registry.npmjs.org/@superset-ui/legacy-plugin-chart-paired-t-test/-/legacy-plugin-chart-paired-t-test-0.11.0.tgz",
-      "integrity": "sha512-XQjObNWy0lKQAwRP6hO6BrvAoaWZHe1IxfsAmELB7ikCYf7HSwWjWIR86JtLWQrz2fGq/42OH2A/pzuPATogkw==",
+      "version": "0.11.11",
+      "resolved": "https://registry.npmjs.org/@superset-ui/legacy-plugin-chart-paired-t-test/-/legacy-plugin-chart-paired-t-test-0.11.11.tgz",
+      "integrity": "sha512-mBi/3/AGQ6HM2tv+0h/NrcBDvN6/PV3OfZcxOGDZmxHFbkdjwX2obxf8MgVkvTlH6GcHEn2jfvtoIXOOwmQodA==",
       "requires": {
         "distributions": "^1.0.0",
         "prop-types": "^15.6.2",
@@ -3950,11 +3950,11 @@
       }
     },
     "@superset-ui/legacy-preset-chart-nvd3": {
-      "version": "0.11.5",
-      "resolved": "https://registry.npmjs.org/@superset-ui/legacy-preset-chart-nvd3/-/legacy-preset-chart-nvd3-0.11.5.tgz",
-      "integrity": "sha512-OgcivObF02xOOgThOB4iMykjp1Q2WudFAf2H94+TPGlX4vx3opcXInvtxju44bx70ywhtlgp3oYxt1nusE/taQ==",
+      "version": "0.11.11",
+      "resolved": "https://registry.npmjs.org/@superset-ui/legacy-preset-chart-nvd3/-/legacy-preset-chart-nvd3-0.11.11.tgz",
+      "integrity": "sha512-IWNNHZDQ8l2SAOfGupDpOYok88w5bdwBEDUjXuMfM7usgWUCLeM642XRX6hYYCnnVtxdX2kfyGYnvdbC9VXU6w==",
       "requires": {
-        "@data-ui/xy-chart": "^0.0.82",
+        "@data-ui/xy-chart": "^0.0.84",
         "d3": "^3.5.17",
         "d3-tip": "^0.9.1",
         "dompurify": "^2.0.6",
@@ -3968,11 +3968,11 @@
       },
       "dependencies": {
         "@data-ui/shared": {
-          "version": "0.0.82",
-          "resolved": "https://registry.npmjs.org/@data-ui/shared/-/shared-0.0.82.tgz",
-          "integrity": "sha512-Orh1Be8Mwgjr5Q42AH9BOLZ6wV3C/gEbzsRYqkGOaCtWTYJOTmAxlmnlvZe5CAyo7QDXQHiV2Xvk90FQcQ/NYQ==",
+          "version": "0.0.84",
+          "resolved": "https://registry.npmjs.org/@data-ui/shared/-/shared-0.0.84.tgz",
+          "integrity": "sha512-MsDLsFzBHFEREr/eF2/RX1o/cXioEg+VQTsM8gViW5ywGQ7Xo5+EqUOaBSrwqKAkvp3e8PaEZVkchPC54IBhrA==",
           "requires": {
-            "@data-ui/theme": "^0.0.82",
+            "@data-ui/theme": "^0.0.84",
             "@vx/event": "^0.0.165",
             "@vx/group": "^0.0.165",
             "@vx/shape": "^0.0.168",
@@ -3998,17 +3998,17 @@
           }
         },
         "@data-ui/theme": {
-          "version": "0.0.82",
-          "resolved": "https://registry.npmjs.org/@data-ui/theme/-/theme-0.0.82.tgz",
-          "integrity": "sha512-h7lE+jRmkIfKhePvfbpHy+2TcfxTVcffxqV4LubcdzAT7pue/mR90T9NBXwidMSGs2eo2fBl2QJGNICDFT7miA=="
+          "version": "0.0.84",
+          "resolved": "https://registry.npmjs.org/@data-ui/theme/-/theme-0.0.84.tgz",
+          "integrity": "sha512-jIoHftC/5c/LVJYF4VSBjjVjrjc0yj4mLkGe8p0eVO7qUYKVvlWx7PrpM7ucyefvuAaKIwlr+Nh2xPGPdADjaA=="
         },
         "@data-ui/xy-chart": {
-          "version": "0.0.82",
-          "resolved": "https://registry.npmjs.org/@data-ui/xy-chart/-/xy-chart-0.0.82.tgz",
-          "integrity": "sha512-pzxXv38UfOIMznqO5+iaxLV8+Wra8Rw5MpKnJdOtVdkB2LSgvfRyRyvDF72FsIYIMUmrOIy7XaiEVOsoTZRa9Q==",
+          "version": "0.0.84",
+          "resolved": "https://registry.npmjs.org/@data-ui/xy-chart/-/xy-chart-0.0.84.tgz",
+          "integrity": "sha512-4mRWEGfeQJ2kFXmQ81k1gDPx2zdkty6lt0+srui4zleSyhnBv1dmm9J03dq+qwr7+bpzjfq77nINV5HXWb31Bg==",
           "requires": {
-            "@data-ui/shared": "^0.0.82",
-            "@data-ui/theme": "^0.0.82",
+            "@data-ui/shared": "^0.0.84",
+            "@data-ui/theme": "^0.0.84",
             "@vx/axis": "^0.0.175",
             "@vx/curve": "^0.0.165",
             "@vx/event": "^0.0.165",
@@ -4018,11 +4018,11 @@
             "@vx/group": "^0.0.165",
             "@vx/pattern": "^0.0.165",
             "@vx/point": "^0.0.165",
-            "@vx/responsive": "^0.0.165",
+            "@vx/responsive": "^0.0.192",
             "@vx/scale": "^0.0.165",
             "@vx/shape": "^0.0.165",
             "@vx/stats": "^0.0.165",
-            "@vx/text": "0.0.183",
+            "@vx/text": "^0.0.192",
             "@vx/threshold": "0.0.170",
             "@vx/tooltip": "^0.0.165",
             "@vx/voronoi": "^0.0.165",
@@ -4180,9 +4180,9 @@
           "integrity": "sha512-spoHilhjcWNgccrSzBUPw+PXV81tYxeyEWBkgr35aGVU4m7YT86Ywvfemwp7AVVGPn+XJHrhB0ujAhDoyqFPoA=="
         },
         "@vx/responsive": {
-          "version": "0.0.165",
-          "resolved": "https://registry.npmjs.org/@vx/responsive/-/responsive-0.0.165.tgz",
-          "integrity": "sha512-b5PYEzsjgTGuH4qN2ujghq2uKQsPGBEtOAO1791WdA0j6rr0zbVsHVmJeEhvoOg0b3xhdNN1mXAzQr4K9lDaDw==",
+          "version": "0.0.192",
+          "resolved": "https://registry.npmjs.org/@vx/responsive/-/responsive-0.0.192.tgz",
+          "integrity": "sha512-HaXVwhSJXUfRbzRV+glxsX0ki2Hi1mdpz42iuGArVQgDPJEmBHjkXyoiXU8U6v66M7FAH+OyKgtc5j2bfhyYzA==",
           "requires": {
             "lodash": "^4.17.10",
             "prop-types": "^15.6.1",
@@ -4212,16 +4212,26 @@
           }
         },
         "@vx/text": {
-          "version": "0.0.183",
-          "resolved": "https://registry.npmjs.org/@vx/text/-/text-0.0.183.tgz",
-          "integrity": "sha512-SM97C6I2Oy3FdbjM0zb2oZ8xgPskQE3r0FdGHZgq6Dk1b3lYwuW3KqdXn598BRl3iL9jfSyR6vFN9z6NV0FFww==",
+          "version": "0.0.192",
+          "resolved": "https://registry.npmjs.org/@vx/text/-/text-0.0.192.tgz",
+          "integrity": "sha512-lyy7eXfmQ8SJF7Qx+bCRcaEgvVSa18Lp6eRMo3GMANumUh9kSe7LwgqRFSdBJ85WkPqX+UOkJVyCH7AOlt0IWA==",
           "requires": {
-            "@babel/core": "^7.0.0",
-            "babel-plugin-lodash": "^3.3.2",
             "classnames": "^2.2.5",
-            "lodash": "^4.17.4",
-            "prop-types": "^15.6.2",
+            "lodash": "^4.17.15",
+            "prop-types": "^15.7.2",
             "reduce-css-calc": "^1.3.0"
+          },
+          "dependencies": {
+            "prop-types": {
+              "version": "15.7.2",
+              "resolved": "https://registry.npmjs.org/prop-types/-/prop-types-15.7.2.tgz",
+              "integrity": "sha512-8QQikdH7//R2vurIJSutZ1smHYTcLpRWEOlHnzcWHmBYrOGUysKwSsrC89BCiFj3CbrfJ/nXFdJepOVrY1GCHQ==",
+              "requires": {
+                "loose-envify": "^1.4.0",
+                "object-assign": "^4.1.1",
+                "react-is": "^16.8.1"
+              }
+            }
           }
         },
         "@vx/tooltip": {
@@ -4233,6 +4243,11 @@
             "classnames": "^2.2.5",
             "prop-types": "^15.5.10"
           }
+        },
+        "react-is": {
+          "version": "16.12.0",
+          "resolved": "https://registry.npmjs.org/react-is/-/react-is-16.12.0.tgz",
+          "integrity": "sha512-rPCkf/mWBtKc97aLL9/txD8DZdemK0vkA3JMLShjlJB3Pj3s+lpf1KaBzMfQrAmhMQB0n1cU/SUGgKKBCe837Q=="
         }
       }
     },
diff --git a/superset/assets/package.json b/superset/assets/package.json
index 5796bf7..5fdc096 100644
--- a/superset/assets/package.json
+++ b/superset/assets/package.json
@@ -64,7 +64,7 @@
     "@superset-ui/legacy-plugin-chart-iframe": "^0.11.0",
     "@superset-ui/legacy-plugin-chart-map-box": "^0.11.0",
     "@superset-ui/legacy-plugin-chart-markup": "^0.11.0",
-    "@superset-ui/legacy-plugin-chart-paired-t-test": "^0.11.0",
+    "@superset-ui/legacy-plugin-chart-paired-t-test": "^0.11.11",
     "@superset-ui/legacy-plugin-chart-parallel-coordinates": "^0.11.0",
     "@superset-ui/legacy-plugin-chart-partition": "^0.11.0",
     "@superset-ui/legacy-plugin-chart-pivot-table": "^0.11.0",
@@ -77,7 +77,7 @@
     "@superset-ui/legacy-plugin-chart-world-map": "^0.11.0",
     "@superset-ui/legacy-preset-chart-big-number": "^0.11.0",
     "@superset-ui/legacy-preset-chart-deckgl": "^0.1.0",
-    "@superset-ui/legacy-preset-chart-nvd3": "^0.11.5",
+    "@superset-ui/legacy-preset-chart-nvd3": "^0.11.11",
     "@superset-ui/number-format": "^0.12.1",
     "@superset-ui/plugin-chart-table": "^0.11.0",
     "@superset-ui/preset-chart-xy": "^0.11.0",


[incubator-superset] 20/22: fix: don't show filter popover on explore view load (#8729)

Posted by vi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit ff092e20f53ef44a52e0a6d1b35efa9635a3595f
Author: Maxime Beauchemin <ma...@gmail.com>
AuthorDate: Mon Dec 9 21:59:53 2019 -0800

    fix: don't show filter popover on explore view load (#8729)
    
    * fix: don't show filter popover on explore view load
    
    There's a confusing "feature", which I initially took for a bug, that
    shows the filter popover already open when entering the explore view
    whenever the filter comes from an active dashboard filter, based on
    the filter's "fromFormData" attribute.
    
    The popover is confusing, and when it overflows it often renders
    misaligned with the element it's supposed to float over.
    
    * warn
    
    * fix cypress
    
    * also shut off metrics
---
 superset/assets/cypress/integration/explore/control.test.js  | 3 ++-
 superset/assets/src/explore/components/AdhocFilterOption.jsx | 3 +--
 superset/assets/src/explore/components/AdhocMetricOption.jsx | 4 ++--
 3 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/superset/assets/cypress/integration/explore/control.test.js b/superset/assets/cypress/integration/explore/control.test.js
index d20cb46..86f6183 100644
--- a/superset/assets/cypress/integration/explore/control.test.js
+++ b/superset/assets/cypress/integration/explore/control.test.js
@@ -164,7 +164,7 @@ describe('AdhocFilters', () => {
         .trigger('mousedown')
         .click();
     });
-
+    cy.get('.adhoc-filter-option').click({ force: true });
     cy.get('#filter-edit-popover').within(() => {
       cy.get('[data-test=adhoc-filter-simple-value]').within(() => {
         cy.get('div.select-input').click({ force: true });
@@ -197,6 +197,7 @@ describe('AdhocFilters', () => {
         .click();
     });
 
+    cy.get('.adhoc-filter-option').click({ force: true });
     cy.get('#filter-edit-popover').within(() => {
       cy.get('#adhoc-filter-edit-tabs-tab-SQL').click();
       cy.get('.ace_content').click();
diff --git a/superset/assets/src/explore/components/AdhocFilterOption.jsx b/superset/assets/src/explore/components/AdhocFilterOption.jsx
index 195bbe4..d027737 100644
--- a/superset/assets/src/explore/components/AdhocFilterOption.jsx
+++ b/superset/assets/src/explore/components/AdhocFilterOption.jsx
@@ -43,7 +43,7 @@ export default class AdhocFilterOption extends React.PureComponent {
     this.onPopoverResize = this.onPopoverResize.bind(this);
     this.onOverlayEntered = this.onOverlayEntered.bind(this);
     this.onOverlayExited = this.onOverlayExited.bind(this);
-    this.state = { overlayShown: !this.props.adhocFilter.fromFormData };
+    this.state = { overlayShown: false };
   }
 
   onPopoverResize() {
@@ -88,7 +88,6 @@ export default class AdhocFilterOption extends React.PureComponent {
         overlay={overlay}
         rootClose
         shouldUpdatePosition
-        defaultOverlayShown={!adhocFilter.fromFormData}
         onEntered={this.onOverlayEntered}
         onExited={this.onOverlayExited}
       >
diff --git a/superset/assets/src/explore/components/AdhocMetricOption.jsx b/superset/assets/src/explore/components/AdhocMetricOption.jsx
index a1ca1ed..9d80b80 100644
--- a/superset/assets/src/explore/components/AdhocMetricOption.jsx
+++ b/superset/assets/src/explore/components/AdhocMetricOption.jsx
@@ -39,7 +39,7 @@ export default class AdhocMetricOption extends React.PureComponent {
     this.onOverlayEntered = this.onOverlayEntered.bind(this);
     this.onOverlayExited = this.onOverlayExited.bind(this);
     this.onPopoverResize = this.onPopoverResize.bind(this);
-    this.state = { overlayShown: !this.props.adhocMetric.fromFormData };
+    this.state = { overlayShown: false };
   }
 
   onPopoverResize() {
@@ -47,7 +47,7 @@ export default class AdhocMetricOption extends React.PureComponent {
   }
 
   onOverlayEntered() {
-    this.setState({ overlayShown: true });
+    this.setState({ overlayShown: false });
   }
 
   onOverlayExited() {
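The change above boils down to one rule: the popover's initial visibility no longer depends on where the filter or metric came from. A minimal sketch of the before/after behavior (the helper name is hypothetical, not Superset code):

```javascript
// Hypothetical helper illustrating the change. Before the fix, options
// injected from a dashboard filter (fromFormData === false) opened their
// popover as soon as the explore view loaded; after it, the popover
// always starts hidden and only opens on user interaction.
function initialOverlayState(option, { legacy = false } = {}) {
  if (legacy) {
    return { overlayShown: !option.fromFormData }; // old behavior
  }
  return { overlayShown: false }; // new behavior
}

console.log(initialOverlayState({ fromFormData: false }).overlayShown); // false
```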


[incubator-superset] 10/22: Default page: remove title attribute to fall back on appbuilder.app_name (#8427)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 240a1c490ac686cfe6504094689c68157cac2a3c
Author: Juliette Tisseyre <ju...@deepomatic.com>
AuthorDate: Wed Nov 6 22:10:39 2019 +0100

    Default page: remove title attribute to fall back on appbuilder.app_name (#8427)
---
 superset/views/core.py | 1 -
 1 file changed, 1 deletion(-)

diff --git a/superset/views/core.py b/superset/views/core.py
index 7483e55..afe5609 100755
--- a/superset/views/core.py
+++ b/superset/views/core.py
@@ -2911,7 +2911,6 @@ class Superset(BaseSupersetView):
         return self.render_template(
             "superset/basic.html",
             entry="welcome",
-            title="Superset",
             bootstrap_data=json.dumps(payload, default=utils.json_iso_dttm_ser),
         )
 


[incubator-superset] 08/22: Update UPDATING.md (#8465)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 87d481b0a0ac38dbce9bdded363b48fc2bec29d8
Author: John Bodley <45...@users.noreply.github.com>
AuthorDate: Tue Oct 29 18:01:49 2019 -0700

    Update UPDATING.md (#8465)
---
 UPDATING.md | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index b4aa9e5..bfeb546 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -69,16 +69,18 @@ which adds missing non-nullable fields to the `datasources` table. Depending on
 the integrity of the data, manual intervention may be required.
 
 * [5452](https://github.com/apache/incubator-superset/pull/5452): a change
-which adds missing non-nullable fields and uniqueness constraints to the
-`columns`and `table_columns` tables. Depending on the integrity of the data,
-manual intervention may be required.
+which adds missing non-nullable fields and uniqueness constraints (which may be 
+case insensitive depending on your database configuration) to the `columns`and 
+`table_columns` tables. Depending on the integrity of the data, manual 
+intervention may be required.
 * `fabmanager` command line is deprecated since Flask-AppBuilder 2.0.0, use
 the new `flask fab <command>` integrated with *Flask cli*.
 * `SUPERSET_UPDATE_PERMS` environment variable was replaced by
 `FAB_UPDATE_PERMS` config boolean key. To disable automatic
 creation of permissions set `FAB_UPDATE_PERMS = False` on config.
 * [5453](https://github.com/apache/incubator-superset/pull/5453): a change
-which adds missing non-nullable fields and uniqueness constraints to the metrics
+which adds missing non-nullable fields and uniqueness constraints (which may be 
+case insensitive depending on your database configuration) to the metrics
 and sql_metrics tables. Depending on the integrity of the data, manual
 intervention may be required.
 * [7616](https://github.com/apache/incubator-superset/pull/7616): this bug fix


[incubator-superset] 13/22: Fix error when templateParams is undefined (#8581)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit b9941df132962f6d31158269a39e72f366810210
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Thu Nov 14 20:22:06 2019 -0800

    Fix error when templateParams is undefined (#8581)
---
 superset/assets/src/SqlLab/actions/sqlLab.js | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/superset/assets/src/SqlLab/actions/sqlLab.js b/superset/assets/src/SqlLab/actions/sqlLab.js
index 8f36eca..81f940b 100644
--- a/superset/assets/src/SqlLab/actions/sqlLab.js
+++ b/superset/assets/src/SqlLab/actions/sqlLab.js
@@ -177,7 +177,7 @@ export function estimateQueryCost(query) {
     dispatch({ type: COST_ESTIMATE_STARTED, query }),
     SupersetClient.post({
       endpoint,
-      postPayload: { sql, templateParams: JSON.parse(templateParams) },
+      postPayload: { sql, templateParams: JSON.parse(templateParams || '{}') },
     })
       .then(({ json }) => dispatch({ type: COST_ESTIMATE_RETURNED, query, json }))
       .catch(response =>
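The one-line fix works because `JSON.parse` coerces its argument to a string, so `JSON.parse(undefined)` tries to parse the literal text `undefined` and throws a `SyntaxError`. A standalone sketch of the guarded call (`parseTemplateParams` is an illustrative name, not a Superset function):

```javascript
// JSON.parse(undefined) throws: the argument is coerced to the string
// "undefined", which is not valid JSON. Falling back to '{}' returns an
// empty params object for queries with no template parameters.
function parseTemplateParams(templateParams) {
  return JSON.parse(templateParams || '{}');
}

console.log(parseTemplateParams('{"limit": 10}').limit); // 10
console.log(parseTemplateParams(undefined));             // {}
```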


[incubator-superset] 16/22: [SECURITY] bump packages with security vulnerabilities (#8573)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 946f5c17b6dc91a9e748e3aacc1fa4dc1dc13f75
Author: ʈᵃᵢ <td...@gmail.com>
AuthorDate: Tue Nov 19 23:34:13 2019 -0800

    [SECURITY] bump packages with security vulnerabilities (#8573)
    
    * bump packages with security vulnerabilities
    
    * bring back cypress
    
    * remove cypress
---
 superset/assets/package-lock.json | 26 +++++++++++++-------------
 superset/assets/package.json      |  1 +
 2 files changed, 14 insertions(+), 13 deletions(-)

diff --git a/superset/assets/package-lock.json b/superset/assets/package-lock.json
index 8296a36..7f05b87 100644
--- a/superset/assets/package-lock.json
+++ b/superset/assets/package-lock.json
@@ -12253,9 +12253,9 @@
       "dev": true
     },
     "handlebars": {
-      "version": "4.3.1",
-      "resolved": "https://registry.npmjs.org/handlebars/-/handlebars-4.3.1.tgz",
-      "integrity": "sha512-c0HoNHzDiHpBt4Kqe99N8tdLPKAnGCQ73gYMPWtAYM4PwGnf7xl8PBUHJqh9ijlzt2uQKaSRxbXRt+rZ7M2/kA==",
+      "version": "4.5.2",
+      "resolved": "https://registry.npmjs.org/handlebars/-/handlebars-4.5.2.tgz",
+      "integrity": "sha512-29Zxv/cynYB7mkT1rVWQnV7mGX6v7H/miQ6dbEpYTKq5eJBN7PsRB+ViYJlcT6JINTSu4dVB9kOqEun78h6Exg==",
       "dev": true,
       "requires": {
         "neo-async": "^2.6.0",
@@ -16011,9 +16011,9 @@
       "integrity": "sha512-8xOcRHvCjnocdS5cpwXQXVzmmh5e5+saE2QGoeQmbKmRS6J3VQppPOIt0MnmE+4xlZoumy0GPG0D0MVIQbNA1A=="
     },
     "lodash-es": {
-      "version": "4.17.11",
-      "resolved": "https://registry.npmjs.org/lodash-es/-/lodash-es-4.17.11.tgz",
-      "integrity": "sha512-DHb1ub+rMjjrxqlB3H56/6MXtm1lSksDp2rA2cNWjG8mlDUYFhUj3Di2Zn5IwSU87xLv8tNIQ7sSwE/YOX/D/Q=="
+      "version": "4.17.15",
+      "resolved": "https://registry.npmjs.org/lodash-es/-/lodash-es-4.17.15.tgz",
+      "integrity": "sha512-rlrc3yU3+JNOpZ9zj5pQtxnx2THmvRykwL4Xlxoa8I9lHBlVbbyPhgyPMioxVZ4NqyxaVVtaJnzsyOidQIhyyQ=="
     },
     "lodash.curry": {
       "version": "4.1.1",
@@ -24007,20 +24007,20 @@
       "integrity": "sha512-T3PVJ6uz8i0HzPxOF9SWzWAlfN/DavlpQqepn22xgve/5QecC+XMCAtmUNnY7C9StehaV6exjUCI801lOI7QlQ=="
     },
     "uglify-js": {
-      "version": "3.6.0",
-      "resolved": "https://registry.npmjs.org/uglify-js/-/uglify-js-3.6.0.tgz",
-      "integrity": "sha512-W+jrUHJr3DXKhrsS7NUVxn3zqMOFn0hL/Ei6v0anCIMoKC93TjcflTagwIHLW7SfMFfiQuktQyFVCFHGUE0+yg==",
+      "version": "3.6.9",
+      "resolved": "https://registry.npmjs.org/uglify-js/-/uglify-js-3.6.9.tgz",
+      "integrity": "sha512-pcnnhaoG6RtrvHJ1dFncAe8Od6Nuy30oaJ82ts6//sGSXOP5UjBMEthiProjXmMNHOfd93sqlkztifFMcb+4yw==",
       "dev": true,
       "optional": true,
       "requires": {
-        "commander": "~2.20.0",
+        "commander": "~2.20.3",
         "source-map": "~0.6.1"
       },
       "dependencies": {
         "commander": {
-          "version": "2.20.0",
-          "resolved": "https://registry.npmjs.org/commander/-/commander-2.20.0.tgz",
-          "integrity": "sha512-7j2y+40w61zy6YC2iRNpUe/NwhNyoXrYpHMrSunaMG64nRnaf96zO/KMQR4OyN/UnE5KLyEBnKHd4aG3rskjpQ==",
+          "version": "2.20.3",
+          "resolved": "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz",
+          "integrity": "sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==",
           "dev": true,
           "optional": true
         },
diff --git a/superset/assets/package.json b/superset/assets/package.json
index cefd21d..5796bf7 100644
--- a/superset/assets/package.json
+++ b/superset/assets/package.json
@@ -104,6 +104,7 @@
     "jquery": "^3.4.1",
     "json-bigint": "^0.3.0",
     "lodash": "^4.17.15",
+    "lodash-es": "^4.17.14",
     "mathjs": "^3.20.2",
     "moment": "^2.20.1",
     "mousetrap": "^1.6.1",


[incubator-superset] 07/22: [Datasource Editor] A few small UI changes in modal to prevent accidental edits (#8471)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 811b0e935f5b2d9c500298a992a6804978f16e8f
Author: Grace Guo <gr...@airbnb.com>
AuthorDate: Tue Oct 29 15:26:11 2019 -0700

    [Datasource Editor] A few small UI changes in modal to prevent accidental edits (#8471)
---
 .../datasource/DatasourceEditor_spec.jsx           |  1 +
 .../assets/src/datasource/DatasourceEditor.jsx     | 52 ++++++++++++----------
 superset/assets/src/datasource/main.css            |  9 ++++
 .../components/controls/DatasourceControl.jsx      | 12 ++---
 4 files changed, 44 insertions(+), 30 deletions(-)

diff --git a/superset/assets/spec/javascripts/datasource/DatasourceEditor_spec.jsx b/superset/assets/spec/javascripts/datasource/DatasourceEditor_spec.jsx
index 61d2f66..c23ff43 100644
--- a/superset/assets/spec/javascripts/datasource/DatasourceEditor_spec.jsx
+++ b/superset/assets/spec/javascripts/datasource/DatasourceEditor_spec.jsx
@@ -91,6 +91,7 @@ describe('DatasourceEditor', () => {
   });
 
   it('renders isSqla fields', () => {
+    wrapper.setState({ activeTabKey: 4 });
     expect(wrapper.state('isSqla')).toBe(true);
     expect(wrapper.find(Field).find({ fieldKey: 'fetch_values_predicate' }).exists()).toBe(true);
   });
diff --git a/superset/assets/src/datasource/DatasourceEditor.jsx b/superset/assets/src/datasource/DatasourceEditor.jsx
index 10a0861..71fa9af 100644
--- a/superset/assets/src/datasource/DatasourceEditor.jsx
+++ b/superset/assets/src/datasource/DatasourceEditor.jsx
@@ -565,30 +565,20 @@ export class DatasourceEditor extends React.PureComponent {
   }
 
   render() {
-    const datasource = this.state.datasource;
+    const { datasource, activeTabKey } = this.state;
     return (
       <div className="Datasource">
         {this.renderErrors()}
         <Tabs
           id="table-tabs"
           onSelect={this.handleTabSelect}
-          defaultActiveKey={1}
+          defaultActiveKey={activeTabKey}
         >
-          <Tab eventKey={1} title={t('Settings')}>
-            {this.state.activeTabKey === 1 &&
-              <div>
-                <Col md={6}>
-                  <FormContainer>
-                    {this.renderSettingsFieldset()}
-                  </FormContainer>
-                </Col>
-                <Col md={6}>
-                  <FormContainer>
-                    {this.renderAdvancedFieldset()}
-                  </FormContainer>
-                </Col>
-              </div>
-            }
+          <Tab
+            title={<CollectionTabTitle collection={datasource.metrics} title={t('Metrics')} />}
+            eventKey={1}
+          >
+            {activeTabKey === 1 && this.renderMetricCollection()}
           </Tab>
           <Tab
             title={
@@ -596,7 +586,7 @@ export class DatasourceEditor extends React.PureComponent {
             }
             eventKey={2}
           >
-            {this.state.activeTabKey === 2 &&
+            {activeTabKey === 2 &&
               <div>
                 <ColumnCollectionTable
                   columns={this.state.databaseColumns}
@@ -623,7 +613,7 @@ export class DatasourceEditor extends React.PureComponent {
               />}
             eventKey={3}
           >
-            {this.state.activeTabKey === 3 &&
+            {activeTabKey === 3 &&
               <ColumnCollectionTable
                 columns={this.state.calculatedColumns}
                 onChange={calculatedColumns => this.setColumns({ calculatedColumns })}
@@ -641,11 +631,25 @@ export class DatasourceEditor extends React.PureComponent {
               />
             }
           </Tab>
-          <Tab
-            title={<CollectionTabTitle collection={datasource.metrics} title={t('Metrics')} />}
-            eventKey={4}
-          >
-            {this.state.activeTabKey === 4 && this.renderMetricCollection()}
+          <Tab eventKey={4} title={t('Settings')}>
+            {activeTabKey === 4 &&
+            <div>
+              <div className="change-warning well">
+                <span className="bold">{t('Be careful.')} </span>
+                {t('Changing these settings will affect all charts using this datasource, including charts owned by other people.')}
+              </div>
+              <Col md={6}>
+                <FormContainer>
+                  {this.renderSettingsFieldset()}
+                </FormContainer>
+              </Col>
+              <Col md={6}>
+                <FormContainer>
+                  {this.renderAdvancedFieldset()}
+                </FormContainer>
+              </Col>
+            </div>
+            }
           </Tab>
         </Tabs>
       </div>
diff --git a/superset/assets/src/datasource/main.css b/superset/assets/src/datasource/main.css
index f551f7b..6081433 100644
--- a/superset/assets/src/datasource/main.css
+++ b/superset/assets/src/datasource/main.css
@@ -20,3 +20,12 @@
     height: 600px;
     overflow: auto;
 }
+
+.Datasource .change-warning {
+    margin: 16px 10px 0;
+    color: #FE4A49;
+}
+
+.Datasource .change-warning .bold {
+    font-weight: bold;
+}
diff --git a/superset/assets/src/explore/components/controls/DatasourceControl.jsx b/superset/assets/src/explore/components/controls/DatasourceControl.jsx
index 910a5fd..fc04ee9 100644
--- a/superset/assets/src/explore/components/controls/DatasourceControl.jsx
+++ b/superset/assets/src/explore/components/controls/DatasourceControl.jsx
@@ -124,11 +124,11 @@ class DatasourceControl extends React.PureComponent {
           <OverlayTrigger
             placement="right"
             overlay={
-              <Tooltip id={'error-tooltip'}>{t('Click to edit the datasource')}</Tooltip>
+              <Tooltip id={'error-tooltip'}>{t('Click to change the datasource')}</Tooltip>
             }
           >
             <div className="btn-group">
-              <Label onClick={this.toggleEditDatasourceModal} className="label-btn-label">
+              <Label onClick={this.toggleChangeDatasourceModal} className="label-btn-label">
                 {datasource.name}
               </Label>
             </div>
@@ -145,9 +145,9 @@ class DatasourceControl extends React.PureComponent {
           >
             <MenuItem
               eventKey="3"
-              onClick={this.toggleEditDatasourceModal}
+              onClick={this.toggleChangeDatasourceModal}
             >
-              {t('Edit Datasource')}
+              {t('Change Datasource')}
             </MenuItem>
             {datasource.type === 'table' &&
               <MenuItem
@@ -160,9 +160,9 @@ class DatasourceControl extends React.PureComponent {
               </MenuItem>}
             <MenuItem
               eventKey="3"
-              onClick={this.toggleChangeDatasourceModal}
+              onClick={this.toggleEditDatasourceModal}
             >
-              {t('Change Datasource')}
+              {t('Edit Datasource')}
             </MenuItem>
           </DropdownButton>
           <OverlayTrigger


[incubator-superset] 01/22: docs: add a note to RELEASING about Slack Channel (#8419)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 5493709e50e4087935fe410d0c35fa617e5c7220
Author: Maxime Beauchemin <ma...@gmail.com>
AuthorDate: Tue Oct 22 21:42:12 2019 -0700

    docs: add a note to RELEASING about Slack Channel (#8419)
---
 RELEASING/README.md | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/RELEASING/README.md b/RELEASING/README.md
index c89f745..cc2cced 100644
--- a/RELEASING/README.md
+++ b/RELEASING/README.md
@@ -22,6 +22,11 @@ under the License.
 You'll probably want to run these commands manually and understand what
 they do prior to doing so.
 
+For coordinating on releases, on more operational topics that require more
+synchronous communications, we tend to use the `#apache-releases` channel
+on the Superset Slack. People crafting releases and those interested in
+partaking in the process should join the channel.
+
 ## Release setup
 
 First you need to setup a few things. This is a one-off and doesn't


[incubator-superset] 05/22: Split up tests/db_engine_test.py (#8449)


villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit ddeb43677f8d95255e78f7a371b5c0d14661e2a7
Author: Will Barrett <wi...@preset.io>
AuthorDate: Thu Oct 24 20:46:45 2019 -0700

    Split up tests/db_engine_test.py (#8449)
    
    * Split up db_engine_specs_test.py into a number of targeted files
    
    * Remove db_engine_specs_test.py
    
    * isort
---
 tests/db_engine_specs/base_engine_spec_tests.py | 204 ++++++
 tests/db_engine_specs/base_tests.py             |  28 +
 tests/db_engine_specs/bigquery_tests.py         |  39 ++
 tests/db_engine_specs/hive_tests.py             | 152 +++++
 tests/db_engine_specs/mssql_tests.py            |  71 +++
 tests/db_engine_specs/mysql_tests.py            |  30 +
 tests/db_engine_specs/oracle_tests.py           |  36 ++
 tests/db_engine_specs/pinot_tests.py            |  33 +
 tests/db_engine_specs/postgres_tests.py         |  72 +++
 tests/db_engine_specs/presto_tests.py           | 343 ++++++++++
 tests/db_engine_specs_test.py                   | 810 ------------------------
 11 files changed, 1008 insertions(+), 810 deletions(-)

diff --git a/tests/db_engine_specs/base_engine_spec_tests.py b/tests/db_engine_specs/base_engine_spec_tests.py
new file mode 100644
index 0000000..13f7b67
--- /dev/null
+++ b/tests/db_engine_specs/base_engine_spec_tests.py
@@ -0,0 +1,204 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from unittest import mock
+
+from superset import app
+from superset.db_engine_specs import engines
+from superset.db_engine_specs.base import BaseEngineSpec, builtin_time_grains
+from superset.db_engine_specs.sqlite import SqliteEngineSpec
+from superset.utils.core import get_example_database
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class DbEngineSpecsTests(DbEngineSpecTestCase):
+    def test_extract_limit_from_query(self, engine_spec_class=BaseEngineSpec):
+        q0 = "select * from table"
+        q1 = "select * from mytable limit 10"
+        q2 = "select * from (select * from my_subquery limit 10) where col=1 limit 20"
+        q3 = "select * from (select * from my_subquery limit 10);"
+        q4 = "select * from (select * from my_subquery limit 10) where col=1 limit 20;"
+        q5 = "select * from mytable limit 20, 10"
+        q6 = "select * from mytable limit 10 offset 20"
+        q7 = "select * from mytable limit"
+        q8 = "select * from mytable limit 10.0"
+        q9 = "select * from mytable limit x"
+        q10 = "select * from mytable limit 20, x"
+        q11 = "select * from mytable limit x offset 20"
+
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q0), None)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q1), 10)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q2), 20)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q3), None)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q4), 20)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q5), 10)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q6), 10)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q7), None)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q8), None)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q9), None)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q10), None)
+        self.assertEqual(engine_spec_class.get_limit_from_sql(q11), None)
+
+    def test_wrapped_semi_tabs(self):
+        self.sql_limit_regex(
+            "SELECT * FROM a  \t \n   ; \t  \n  ", "SELECT * FROM a\nLIMIT 1000"
+        )
+
+    def test_simple_limit_query(self):
+        self.sql_limit_regex("SELECT * FROM a", "SELECT * FROM a\nLIMIT 1000")
+
+    def test_modify_limit_query(self):
+        self.sql_limit_regex("SELECT * FROM a LIMIT 9999", "SELECT * FROM a LIMIT 1000")
+
+    def test_limit_query_with_limit_subquery(self):  # pylint: disable=invalid-name
+        self.sql_limit_regex(
+            "SELECT * FROM (SELECT * FROM a LIMIT 10) LIMIT 9999",
+            "SELECT * FROM (SELECT * FROM a LIMIT 10) LIMIT 1000",
+        )
+
+    def test_limit_with_expr(self):
+        self.sql_limit_regex(
+            """
+            SELECT
+                'LIMIT 777' AS a
+                , b
+            FROM
+            table
+            LIMIT 99990""",
+            """SELECT
+                'LIMIT 777' AS a
+                , b
+            FROM
+            table
+            LIMIT 1000""",
+        )
+
+    def test_limit_expr_and_semicolon(self):
+        self.sql_limit_regex(
+            """
+                SELECT
+                    'LIMIT 777' AS a
+                    , b
+                FROM
+                table
+                LIMIT         99990            ;""",
+            """SELECT
+                    'LIMIT 777' AS a
+                    , b
+                FROM
+                table
+                LIMIT         1000""",
+        )
+
+    def test_get_datatype(self):
+        self.assertEqual("VARCHAR", BaseEngineSpec.get_datatype("VARCHAR"))
+
+    def test_limit_with_implicit_offset(self):
+        self.sql_limit_regex(
+            """
+                SELECT
+                    'LIMIT 777' AS a
+                    , b
+                FROM
+                table
+                LIMIT 99990, 999999""",
+            """SELECT
+                    'LIMIT 777' AS a
+                    , b
+                FROM
+                table
+                LIMIT 99990, 1000""",
+        )
+
+    def test_limit_with_explicit_offset(self):
+        self.sql_limit_regex(
+            """
+                SELECT
+                    'LIMIT 777' AS a
+                    , b
+                FROM
+                table
+                LIMIT 99990
+                OFFSET 999999""",
+            """SELECT
+                    'LIMIT 777' AS a
+                    , b
+                FROM
+                table
+                LIMIT 1000
+                OFFSET 999999""",
+        )
+
+    def test_limit_with_non_token_limit(self):
+        self.sql_limit_regex(
+            """SELECT 'LIMIT 777'""", """SELECT 'LIMIT 777'\nLIMIT 1000"""
+        )
+
+    def test_time_grain_blacklist(self):
+        with app.app_context():
+            app.config["TIME_GRAIN_BLACKLIST"] = ["PT1M"]
+            time_grain_functions = SqliteEngineSpec.get_time_grain_functions()
+            self.assertNotIn("PT1M", time_grain_functions)
+
+    def test_time_grain_addons(self):
+        with app.app_context():
+            app.config["TIME_GRAIN_ADDONS"] = {"PTXM": "x seconds"}
+            app.config["TIME_GRAIN_ADDON_FUNCTIONS"] = {
+                "sqlite": {"PTXM": "ABC({col})"}
+            }
+            time_grains = SqliteEngineSpec.get_time_grains()
+            time_grain_addon = time_grains[-1]
+            self.assertEqual("PTXM", time_grain_addon.duration)
+            self.assertEqual("x seconds", time_grain_addon.label)
+
+    def test_engine_time_grain_validity(self):
+        time_grains = set(builtin_time_grains.keys())
+        # loop over all subclasses of BaseEngineSpec
+        for engine in engines.values():
+            if engine is not BaseEngineSpec:
+                # make sure time grain functions have been defined
+                self.assertGreater(len(engine.get_time_grain_functions()), 0)
+                # make sure all defined time grains are supported
+                defined_grains = {grain.duration for grain in engine.get_time_grains()}
+                intersection = time_grains.intersection(defined_grains)
+                self.assertSetEqual(defined_grains, intersection, engine)
+
+    def test_get_table_names(self):
+        inspector = mock.Mock()
+        inspector.get_table_names = mock.Mock(return_value=["schema.table", "table_2"])
+        inspector.get_foreign_table_names = mock.Mock(return_value=["table_3"])
+
+        """ Make sure base engine spec removes schema name from table name
+        ie. when try_remove_schema_from_table_name == True. """
+        base_result_expected = ["table", "table_2"]
+        base_result = BaseEngineSpec.get_table_names(
+            database=mock.ANY, schema="schema", inspector=inspector
+        )
+        self.assertListEqual(base_result_expected, base_result)
+
+    def test_column_datatype_to_string(self):
+        example_db = get_example_database()
+        sqla_table = example_db.get_table("energy_usage")
+        dialect = example_db.get_dialect()
+        col_names = [
+            example_db.db_engine_spec.column_datatype_to_string(c.type, dialect)
+            for c in sqla_table.columns
+        ]
+        if example_db.backend == "postgresql":
+            expected = ["VARCHAR(255)", "VARCHAR(255)", "DOUBLE PRECISION"]
+        else:
+            expected = ["VARCHAR(255)", "VARCHAR(255)", "FLOAT"]
+        self.assertEqual(col_names, expected)
diff --git a/tests/db_engine_specs/base_tests.py b/tests/db_engine_specs/base_tests.py
new file mode 100644
index 0000000..812e6b8
--- /dev/null
+++ b/tests/db_engine_specs/base_tests.py
@@ -0,0 +1,28 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from superset.db_engine_specs.mysql import MySQLEngineSpec
+from superset.models.core import Database
+from tests.base_tests import SupersetTestCase
+
+
+class DbEngineSpecTestCase(SupersetTestCase):
+    def sql_limit_regex(
+        self, sql, expected_sql, engine_spec_class=MySQLEngineSpec, limit=1000
+    ):
+        main = Database(database_name="test_database", sqlalchemy_uri="sqlite://")
+        limited = engine_spec_class.apply_limit_to_sql(sql, limit, main)
+        self.assertEqual(expected_sql, limited)
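As a rough illustration of what `sql_limit_regex` exercises, here is a dependency-free sketch of a LIMIT-capping rewrite. It covers only the plain trailing-LIMIT cases from the tests above (including the implicit-offset `LIMIT n, m` form); OFFSET clauses, comments, and dialect quirks are left to the real `apply_limit_to_sql`:

```python
import re

def apply_limit_sketch(sql: str, limit: int) -> str:
    """Toy version: cap a trailing LIMIT, or append one if absent.

    Only handles the simple trailing-clause cases; a quoted string such
    as 'LIMIT 777' never matches because the closing quote follows the
    digits, so string literals are left untouched.
    """
    sql = sql.strip().rstrip(";").rstrip()
    # Match a LIMIT clause at the very end (optionally "LIMIT n, m").
    match = re.search(r"\bLIMIT\s+(?:\d+\s*,\s*)?\d+\s*$", sql, re.IGNORECASE)
    if match:
        clause = match.group(0)
        # Replace only the last number (the row count) with the cap.
        capped = re.sub(r"\d+\s*$", str(limit), clause)
        return sql[: match.start()] + capped
    return f"{sql}\nLIMIT {limit}"
```

Note how the trailing semicolon is stripped first, mirroring `test_limit_expr_and_semicolon` above.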
diff --git a/tests/db_engine_specs/bigquery_tests.py b/tests/db_engine_specs/bigquery_tests.py
new file mode 100644
index 0000000..ec23e86
--- /dev/null
+++ b/tests/db_engine_specs/bigquery_tests.py
@@ -0,0 +1,39 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from sqlalchemy import column
+
+from superset.db_engine_specs.bigquery import BigQueryEngineSpec
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class BigQueryTestCase(DbEngineSpecTestCase):
+    def test_bigquery_sqla_column_label(self):
+        label = BigQueryEngineSpec.make_label_compatible(column("Col").name)
+        label_expected = "Col"
+        self.assertEqual(label, label_expected)
+
+        label = BigQueryEngineSpec.make_label_compatible(column("SUM(x)").name)
+        label_expected = "SUM_x__5f110"
+        self.assertEqual(label, label_expected)
+
+        label = BigQueryEngineSpec.make_label_compatible(column("SUM[x]").name)
+        label_expected = "SUM_x__7ebe1"
+        self.assertEqual(label, label_expected)
+
+        label = BigQueryEngineSpec.make_label_compatible(column("12345_col").name)
+        label_expected = "_12345_col_8d390"
+        self.assertEqual(label, label_expected)
diff --git a/tests/db_engine_specs/hive_tests.py b/tests/db_engine_specs/hive_tests.py
new file mode 100644
index 0000000..94a474d
--- /dev/null
+++ b/tests/db_engine_specs/hive_tests.py
@@ -0,0 +1,152 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from unittest import mock
+
+from superset.db_engine_specs.hive import HiveEngineSpec
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class HiveTests(DbEngineSpecTestCase):
+    def test_0_progress(self):
+        log = """
+            17/02/07 18:26:27 INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
+            17/02/07 18:26:27 INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
+        """.split(
+            "\n"
+        )
+        self.assertEqual(0, HiveEngineSpec.progress(log))
+
+    def test_number_of_jobs_progress(self):
+        log = """
+            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
+        """.split(
+            "\n"
+        )
+        self.assertEqual(0, HiveEngineSpec.progress(log))
+
+    def test_job_1_launched_progress(self):
+        log = """
+            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
+            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
+        """.split(
+            "\n"
+        )
+        self.assertEqual(0, HiveEngineSpec.progress(log))
+
+    def test_job_1_launched_stage_1(self):
+        log = """
+            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
+            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
+        """.split(
+            "\n"
+        )
+        self.assertEqual(0, HiveEngineSpec.progress(log))
+
+    def test_job_1_launched_stage_1_map_40_progress(
+        self
+    ):  # pylint: disable=invalid-name
+        log = """
+            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
+            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 40%,  reduce = 0%
+        """.split(
+            "\n"
+        )
+        self.assertEqual(10, HiveEngineSpec.progress(log))
+
+    def test_job_1_launched_stage_1_map_80_reduce_40_progress(
+        self
+    ):  # pylint: disable=invalid-name
+        log = """
+            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
+            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 40%,  reduce = 0%
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 80%,  reduce = 40%
+        """.split(
+            "\n"
+        )
+        self.assertEqual(30, HiveEngineSpec.progress(log))
+
+    def test_job_1_launched_stage_2_stages_progress(
+        self
+    ):  # pylint: disable=invalid-name
+        log = """
+            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
+            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 40%,  reduce = 0%
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 80%,  reduce = 40%
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-2 map = 0%,  reduce = 0%
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 100%,  reduce = 0%
+        """.split(
+            "\n"
+        )
+        self.assertEqual(12, HiveEngineSpec.progress(log))
+
+    def test_job_2_launched_stage_2_stages_progress(
+        self
+    ):  # pylint: disable=invalid-name
+        log = """
+            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
+            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 100%,  reduce = 0%
+            17/02/07 19:15:55 INFO ql.Driver: Launching Job 2 out of 2
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
+            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 40%,  reduce = 0%
+        """.split(
+            "\n"
+        )
+        self.assertEqual(60, HiveEngineSpec.progress(log))
+
+    def test_hive_error_msg(self):
+        msg = (
+            '{...} errorMessage="Error while compiling statement: FAILED: '
+            "SemanticException [Error 10001]: Line 4"
+            ":5 Table not found 'fact_ridesfdslakj'\", statusCode=3, "
+            "sqlState='42S02', errorCode=10001)){...}"
+        )
+        self.assertEqual(
+            (
+                "hive error: Error while compiling statement: FAILED: "
+                "SemanticException [Error 10001]: Line 4:5 "
+                "Table not found 'fact_ridesfdslakj'"
+            ),
+            HiveEngineSpec.extract_error_message(Exception(msg)),
+        )
+
+        e = Exception("Some string that doesn't match the regex")
+        self.assertEqual(f"hive error: {e}", HiveEngineSpec.extract_error_message(e))
+
+        msg = (
+            "errorCode=10001, "
+            'errorMessage="Error while compiling statement"), operationHandle'
+            '=None)"'
+        )
+        self.assertEqual(
+            ("hive error: Error while compiling statement"),
+            HiveEngineSpec.extract_error_message(Exception(msg)),
+        )
+
+    def test_hive_get_view_names_return_empty_list(
+        self
+    ):  # pylint: disable=invalid-name
+        self.assertEqual(
+            [], HiveEngineSpec.get_view_names(mock.ANY, mock.ANY, mock.ANY)
+        )
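The expected percentages in the progress tests above are consistent with averaging map/reduce progress per stage, stages per job, and jobs overall. A minimal sketch of that arithmetic (log parsing omitted; the equal per-phase weighting is an assumption inferred from the expected values, not a transcription of `HiveEngineSpec.progress`):

```python
def hive_progress_sketch(total_jobs, current_job, stage_progress):
    """Estimate overall query progress as an integer percentage.

    stage_progress maps stage number -> (map_pct, reduce_pct) for the
    job currently running; jobs that already finished count as 100%.
    """
    stages = stage_progress.values()
    # Average map and reduce within a stage, then average the stages.
    job_pct = (
        sum((m + r) / 2 for m, r in stages) / len(stages) if stages else 0
    )
    # Completed jobs contribute 100% each; divide by the job count.
    return int((100 * (current_job - 1) + job_pct) / total_jobs)
```

For example, with 2 total jobs and stage 1 of job 1 at map 40% / reduce 0%, this yields 10, matching `test_job_1_launched_stage_1_map_40_progress`.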
diff --git a/tests/db_engine_specs/mssql_tests.py b/tests/db_engine_specs/mssql_tests.py
new file mode 100644
index 0000000..989fa8c
--- /dev/null
+++ b/tests/db_engine_specs/mssql_tests.py
@@ -0,0 +1,71 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from sqlalchemy import column, table
+from sqlalchemy.dialects import mssql
+from sqlalchemy.sql import select
+from sqlalchemy.types import String, UnicodeText
+
+from superset.db_engine_specs.mssql import MssqlEngineSpec
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class MssqlEngineSpecTest(DbEngineSpecTestCase):
+    def test_mssql_column_types(self):
+        def assert_type(type_string, type_expected):
+            type_assigned = MssqlEngineSpec.get_sqla_column_type(type_string)
+            if type_expected is None:
+                self.assertIsNone(type_assigned)
+            else:
+                self.assertIsInstance(type_assigned, type_expected)
+
+        assert_type("INT", None)
+        assert_type("STRING", String)
+        assert_type("CHAR(10)", String)
+        assert_type("VARCHAR(10)", String)
+        assert_type("TEXT", String)
+        assert_type("NCHAR(10)", UnicodeText)
+        assert_type("NVARCHAR(10)", UnicodeText)
+        assert_type("NTEXT", UnicodeText)
+
+    def test_where_clause_n_prefix(self):
+        dialect = mssql.dialect()
+        spec = MssqlEngineSpec
+        str_col = column("col", type_=spec.get_sqla_column_type("VARCHAR(10)"))
+        unicode_col = column("unicode_col", type_=spec.get_sqla_column_type("NTEXT"))
+        tbl = table("tbl")
+        sel = (
+            select([str_col, unicode_col])
+            .select_from(tbl)
+            .where(str_col == "abc")
+            .where(unicode_col == "abc")
+        )
+
+        query = str(
+            sel.compile(dialect=dialect, compile_kwargs={"literal_binds": True})
+        )
+        query_expected = (
+            "SELECT col, unicode_col \n"
+            "FROM tbl \n"
+            "WHERE col = 'abc' AND unicode_col = N'abc'"
+        )
+        self.assertEqual(query, query_expected)
+
+    def test_time_exp_mixd_case_col_1y(self):
+        col = column("MixedCase")
+        expr = MssqlEngineSpec.get_timestamp_expr(col, None, "P1Y")
+        result = str(expr.compile(None, dialect=mssql.dialect()))
+        self.assertEqual(result, "DATEADD(year, DATEDIFF(year, 0, [MixedCase]), 0)")
diff --git a/tests/db_engine_specs/mysql_tests.py b/tests/db_engine_specs/mysql_tests.py
new file mode 100644
index 0000000..22205a8
--- /dev/null
+++ b/tests/db_engine_specs/mysql_tests.py
@@ -0,0 +1,30 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+import unittest
+
+from superset.db_engine_specs.mysql import MySQLEngineSpec
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class MySQLEngineSpecsTestCase(DbEngineSpecTestCase):
+    @unittest.skipUnless(
+        DbEngineSpecTestCase.is_module_installed("MySQLdb"), "mysqlclient not installed"
+    )
+    def test_get_datatype_mysql(self):
+        """Tests related to datatype mapping for MySQL"""
+        self.assertEqual("TINY", MySQLEngineSpec.get_datatype(1))
+        self.assertEqual("VARCHAR", MySQLEngineSpec.get_datatype(15))
diff --git a/tests/db_engine_specs/oracle_tests.py b/tests/db_engine_specs/oracle_tests.py
new file mode 100644
index 0000000..285f616
--- /dev/null
+++ b/tests/db_engine_specs/oracle_tests.py
@@ -0,0 +1,36 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from sqlalchemy import column
+from sqlalchemy.dialects import oracle
+
+from superset.db_engine_specs.oracle import OracleEngineSpec
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class OracleTestCase(DbEngineSpecTestCase):
+    def test_oracle_sqla_column_name_length_exceeded(self):
+        col = column("This_Is_32_Character_Column_Name")
+        label = OracleEngineSpec.make_label_compatible(col.name)
+        self.assertEqual(label.quote, True)
+        label_expected = "3b26974078683be078219674eeb8f5"
+        self.assertEqual(label, label_expected)
+
+    def test_oracle_time_expression_reserved_keyword_1m_grain(self):
+        col = column("decimal")
+        expr = OracleEngineSpec.get_timestamp_expr(col, None, "P1M")
+        result = str(expr.compile(dialect=oracle.dialect()))
+        self.assertEqual(result, "TRUNC(CAST(\"decimal\" as DATE), 'MONTH')")
diff --git a/tests/db_engine_specs/pinot_tests.py b/tests/db_engine_specs/pinot_tests.py
new file mode 100644
index 0000000..a96e9c1
--- /dev/null
+++ b/tests/db_engine_specs/pinot_tests.py
@@ -0,0 +1,33 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from sqlalchemy import column
+
+from superset.db_engine_specs.pinot import PinotEngineSpec
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class PinotTestCase(DbEngineSpecTestCase):
+    """ Tests pertaining to our Pinot database support """
+
+    def test_pinot_time_expression_sec_one_1m_grain(self):
+        col = column("tstamp")
+        expr = PinotEngineSpec.get_timestamp_expr(col, "epoch_s", "P1M")
+        result = str(expr.compile())
+        self.assertEqual(
+            result,
+            'DATETIMECONVERT(tstamp, "1:SECONDS:EPOCH", "1:SECONDS:EPOCH", "1:MONTHS")',
+        )  # noqa
diff --git a/tests/db_engine_specs/postgres_tests.py b/tests/db_engine_specs/postgres_tests.py
new file mode 100644
index 0000000..3204c53
--- /dev/null
+++ b/tests/db_engine_specs/postgres_tests.py
@@ -0,0 +1,72 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from unittest import mock
+
+from sqlalchemy import column, literal_column
+from sqlalchemy.dialects import postgresql
+
+from superset.db_engine_specs.postgres import PostgresEngineSpec
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class PostgresTests(DbEngineSpecTestCase):
+    def test_get_table_names(self):
+        """ Make sure postgres doesn't try to remove schema name from table name
+        ie. when try_remove_schema_from_table_name == False. """
+        inspector = mock.Mock()
+        inspector.get_table_names = mock.Mock(return_value=["schema.table", "table_2"])
+        inspector.get_foreign_table_names = mock.Mock(return_value=["table_3"])
+
+        pg_result_expected = ["schema.table", "table_2", "table_3"]
+        pg_result = PostgresEngineSpec.get_table_names(
+            database=mock.ANY, schema="schema", inspector=inspector
+        )
+        self.assertListEqual(pg_result_expected, pg_result)
+
+    def test_time_exp_literal_no_grain(self):
+        col = literal_column("COALESCE(a, b)")
+        expr = PostgresEngineSpec.get_timestamp_expr(col, None, None)
+        result = str(expr.compile(None, dialect=postgresql.dialect()))
+        self.assertEqual(result, "COALESCE(a, b)")
+
+    def test_time_exp_literal_1y_grain(self):
+        col = literal_column("COALESCE(a, b)")
+        expr = PostgresEngineSpec.get_timestamp_expr(col, None, "P1Y")
+        result = str(expr.compile(None, dialect=postgresql.dialect()))
+        self.assertEqual(result, "DATE_TRUNC('year', COALESCE(a, b))")
+
+    def test_time_ex_lowr_col_no_grain(self):
+        col = column("lower_case")
+        expr = PostgresEngineSpec.get_timestamp_expr(col, None, None)
+        result = str(expr.compile(None, dialect=postgresql.dialect()))
+        self.assertEqual(result, "lower_case")
+
+    def test_time_exp_lowr_col_sec_1y(self):
+        col = column("lower_case")
+        expr = PostgresEngineSpec.get_timestamp_expr(col, "epoch_s", "P1Y")
+        result = str(expr.compile(None, dialect=postgresql.dialect()))
+        self.assertEqual(
+            result,
+            "DATE_TRUNC('year', "
+            "(timestamp 'epoch' + lower_case * interval '1 second'))",
+        )
+
+    def test_time_exp_mixd_case_col_1y(self):
+        col = column("MixedCase")
+        expr = PostgresEngineSpec.get_timestamp_expr(col, None, "P1Y")
+        result = str(expr.compile(None, dialect=postgresql.dialect()))
+        self.assertEqual(result, "DATE_TRUNC('year', \"MixedCase\")")
diff --git a/tests/db_engine_specs/presto_tests.py b/tests/db_engine_specs/presto_tests.py
new file mode 100644
index 0000000..b727310
--- /dev/null
+++ b/tests/db_engine_specs/presto_tests.py
@@ -0,0 +1,343 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from unittest import mock, skipUnless
+
+import pandas as pd
+from sqlalchemy.engine.result import RowProxy
+from sqlalchemy.sql import select
+
+from superset.db_engine_specs.presto import PrestoEngineSpec
+from tests.db_engine_specs.base_tests import DbEngineSpecTestCase
+
+
+class PrestoTests(DbEngineSpecTestCase):
+    @skipUnless(
+        DbEngineSpecTestCase.is_module_installed("pyhive"), "pyhive not installed"
+    )
+    def test_get_datatype_presto(self):
+        self.assertEqual("STRING", PrestoEngineSpec.get_datatype("string"))
+
+    def test_presto_get_view_names_return_empty_list(
+        self
+    ):  # pylint: disable=invalid-name
+        self.assertEqual(
+            [], PrestoEngineSpec.get_view_names(mock.ANY, mock.ANY, mock.ANY)
+        )
+
+    def verify_presto_column(self, column, expected_results):
+        inspector = mock.Mock()
+        inspector.engine.dialect.identifier_preparer.quote_identifier = mock.Mock()
+        keymap = {
+            "Column": (None, None, 0),
+            "Type": (None, None, 1),
+            "Null": (None, None, 2),
+        }
+        row = RowProxy(mock.Mock(), column, [None, None, None, None], keymap)
+        inspector.bind.execute = mock.Mock(return_value=[row])
+        results = PrestoEngineSpec.get_columns(inspector, "", "")
+        self.assertEqual(len(expected_results), len(results))
+        for expected_result, result in zip(expected_results, results):
+            self.assertEqual(expected_result[0], result["name"])
+            self.assertEqual(expected_result[1], str(result["type"]))
+
+    def test_presto_get_column(self):
+        presto_column = ("column_name", "boolean", "")
+        expected_results = [("column_name", "BOOLEAN")]
+        self.verify_presto_column(presto_column, expected_results)
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_get_simple_row_column(self):
+        presto_column = ("column_name", "row(nested_obj double)", "")
+        expected_results = [("column_name", "ROW"), ("column_name.nested_obj", "FLOAT")]
+        self.verify_presto_column(presto_column, expected_results)
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_get_simple_row_column_with_name_containing_whitespace(self):
+        presto_column = ("column name", "row(nested_obj double)", "")
+        expected_results = [("column name", "ROW"), ("column name.nested_obj", "FLOAT")]
+        self.verify_presto_column(presto_column, expected_results)
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_get_simple_row_column_with_tricky_nested_field_name(self):
+        presto_column = ("column_name", 'row("Field Name(Tricky, Name)" double)', "")
+        expected_results = [
+            ("column_name", "ROW"),
+            ('column_name."Field Name(Tricky, Name)"', "FLOAT"),
+        ]
+        self.verify_presto_column(presto_column, expected_results)
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_get_simple_array_column(self):
+        presto_column = ("column_name", "array(double)", "")
+        expected_results = [("column_name", "ARRAY")]
+        self.verify_presto_column(presto_column, expected_results)
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_get_row_within_array_within_row_column(self):
+        presto_column = (
+            "column_name",
+            "row(nested_array array(row(nested_row double)), nested_obj double)",
+            "",
+        )
+        expected_results = [
+            ("column_name", "ROW"),
+            ("column_name.nested_array", "ARRAY"),
+            ("column_name.nested_array.nested_row", "FLOAT"),
+            ("column_name.nested_obj", "FLOAT"),
+        ]
+        self.verify_presto_column(presto_column, expected_results)
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_get_array_within_row_within_array_column(self):
+        presto_column = (
+            "column_name",
+            "array(row(nested_array array(double), nested_obj double))",
+            "",
+        )
+        expected_results = [
+            ("column_name", "ARRAY"),
+            ("column_name.nested_array", "ARRAY"),
+            ("column_name.nested_obj", "FLOAT"),
+        ]
+        self.verify_presto_column(presto_column, expected_results)
+
+    def test_presto_get_fields(self):
+        cols = [
+            {"name": "column"},
+            {"name": "column.nested_obj"},
+            {"name": 'column."quoted.nested obj"'},
+        ]
+        actual_results = PrestoEngineSpec._get_fields(cols)
+        expected_results = [
+            {"name": '"column"', "label": "column"},
+            {"name": '"column"."nested_obj"', "label": "column.nested_obj"},
+            {
+                "name": '"column"."quoted.nested obj"',
+                "label": 'column."quoted.nested obj"',
+            },
+        ]
+        for actual_result, expected_result in zip(actual_results, expected_results):
+            self.assertEqual(actual_result.element.name, expected_result["name"])
+            self.assertEqual(actual_result.name, expected_result["label"])
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_expand_data_with_simple_structural_columns(self):
+        cols = [
+            {"name": "row_column", "type": "ROW(NESTED_OBJ VARCHAR)"},
+            {"name": "array_column", "type": "ARRAY(BIGINT)"},
+        ]
+        data = [
+            {"row_column": ["a"], "array_column": [1, 2, 3]},
+            {"row_column": ["b"], "array_column": [4, 5, 6]},
+        ]
+        actual_cols, actual_data, actual_expanded_cols = PrestoEngineSpec.expand_data(
+            cols, data
+        )
+        expected_cols = [
+            {"name": "row_column", "type": "ROW(NESTED_OBJ VARCHAR)"},
+            {"name": "row_column.nested_obj", "type": "VARCHAR"},
+            {"name": "array_column", "type": "ARRAY(BIGINT)"},
+        ]
+
+        expected_data = [
+            {"array_column": 1, "row_column": ["a"], "row_column.nested_obj": "a"},
+            {"array_column": 2, "row_column": "", "row_column.nested_obj": ""},
+            {"array_column": 3, "row_column": "", "row_column.nested_obj": ""},
+            {"array_column": 4, "row_column": ["b"], "row_column.nested_obj": "b"},
+            {"array_column": 5, "row_column": "", "row_column.nested_obj": ""},
+            {"array_column": 6, "row_column": "", "row_column.nested_obj": ""},
+        ]
+
+        expected_expanded_cols = [{"name": "row_column.nested_obj", "type": "VARCHAR"}]
+        self.assertEqual(actual_cols, expected_cols)
+        self.assertEqual(actual_data, expected_data)
+        self.assertEqual(actual_expanded_cols, expected_expanded_cols)
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_expand_data_with_complex_row_columns(self):
+        cols = [
+            {
+                "name": "row_column",
+                "type": "ROW(NESTED_OBJ1 VARCHAR, NESTED_ROW ROW(NESTED_OBJ2 VARCHAR))",
+            }
+        ]
+        data = [{"row_column": ["a1", ["a2"]]}, {"row_column": ["b1", ["b2"]]}]
+        actual_cols, actual_data, actual_expanded_cols = PrestoEngineSpec.expand_data(
+            cols, data
+        )
+        expected_cols = [
+            {
+                "name": "row_column",
+                "type": "ROW(NESTED_OBJ1 VARCHAR, NESTED_ROW ROW(NESTED_OBJ2 VARCHAR))",
+            },
+            {"name": "row_column.nested_row", "type": "ROW(NESTED_OBJ2 VARCHAR)"},
+            {"name": "row_column.nested_row.nested_obj2", "type": "VARCHAR"},
+            {"name": "row_column.nested_obj1", "type": "VARCHAR"},
+        ]
+        expected_data = [
+            {
+                "row_column": ["a1", ["a2"]],
+                "row_column.nested_obj1": "a1",
+                "row_column.nested_row": ["a2"],
+                "row_column.nested_row.nested_obj2": "a2",
+            },
+            {
+                "row_column": ["b1", ["b2"]],
+                "row_column.nested_obj1": "b1",
+                "row_column.nested_row": ["b2"],
+                "row_column.nested_row.nested_obj2": "b2",
+            },
+        ]
+
+        expected_expanded_cols = [
+            {"name": "row_column.nested_obj1", "type": "VARCHAR"},
+            {"name": "row_column.nested_row", "type": "ROW(NESTED_OBJ2 VARCHAR)"},
+            {"name": "row_column.nested_row.nested_obj2", "type": "VARCHAR"},
+        ]
+        self.assertEqual(actual_cols, expected_cols)
+        self.assertEqual(actual_data, expected_data)
+        self.assertEqual(actual_expanded_cols, expected_expanded_cols)
+
+    @mock.patch.dict(
+        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
+    )
+    def test_presto_expand_data_with_complex_array_columns(self):
+        cols = [
+            {"name": "int_column", "type": "BIGINT"},
+            {
+                "name": "array_column",
+                "type": "ARRAY(ROW(NESTED_ARRAY ARRAY(ROW(NESTED_OBJ VARCHAR))))",
+            },
+        ]
+        data = [
+            {"int_column": 1, "array_column": [[[["a"], ["b"]]], [[["c"], ["d"]]]]},
+            {"int_column": 2, "array_column": [[[["e"], ["f"]]], [[["g"], ["h"]]]]},
+        ]
+        actual_cols, actual_data, actual_expanded_cols = PrestoEngineSpec.expand_data(
+            cols, data
+        )
+        expected_cols = [
+            {"name": "int_column", "type": "BIGINT"},
+            {
+                "name": "array_column",
+                "type": "ARRAY(ROW(NESTED_ARRAY ARRAY(ROW(NESTED_OBJ VARCHAR))))",
+            },
+            {
+                "name": "array_column.nested_array",
+                "type": "ARRAY(ROW(NESTED_OBJ VARCHAR))",
+            },
+            {"name": "array_column.nested_array.nested_obj", "type": "VARCHAR"},
+        ]
+        expected_data = [
+            {
+                "array_column": [[["a"], ["b"]]],
+                "array_column.nested_array": ["a"],
+                "array_column.nested_array.nested_obj": "a",
+                "int_column": 1,
+            },
+            {
+                "array_column": "",
+                "array_column.nested_array": ["b"],
+                "array_column.nested_array.nested_obj": "b",
+                "int_column": "",
+            },
+            {
+                "array_column": [[["c"], ["d"]]],
+                "array_column.nested_array": ["c"],
+                "array_column.nested_array.nested_obj": "c",
+                "int_column": "",
+            },
+            {
+                "array_column": "",
+                "array_column.nested_array": ["d"],
+                "array_column.nested_array.nested_obj": "d",
+                "int_column": "",
+            },
+            {
+                "array_column": [[["e"], ["f"]]],
+                "array_column.nested_array": ["e"],
+                "array_column.nested_array.nested_obj": "e",
+                "int_column": 2,
+            },
+            {
+                "array_column": "",
+                "array_column.nested_array": ["f"],
+                "array_column.nested_array.nested_obj": "f",
+                "int_column": "",
+            },
+            {
+                "array_column": [[["g"], ["h"]]],
+                "array_column.nested_array": ["g"],
+                "array_column.nested_array.nested_obj": "g",
+                "int_column": "",
+            },
+            {
+                "array_column": "",
+                "array_column.nested_array": ["h"],
+                "array_column.nested_array.nested_obj": "h",
+                "int_column": "",
+            },
+        ]
+        expected_expanded_cols = [
+            {
+                "name": "array_column.nested_array",
+                "type": "ARRAY(ROW(NESTED_OBJ VARCHAR))",
+            },
+            {"name": "array_column.nested_array.nested_obj", "type": "VARCHAR"},
+        ]
+        self.assertEqual(actual_cols, expected_cols)
+        self.assertEqual(actual_data, expected_data)
+        self.assertEqual(actual_expanded_cols, expected_expanded_cols)
+
+    def test_presto_extra_table_metadata(self):
+        db = mock.Mock()
+        db.get_indexes = mock.Mock(return_value=[{"column_names": ["ds", "hour"]}])
+        db.get_extra = mock.Mock(return_value={})
+        df = pd.DataFrame({"ds": ["01-01-19"], "hour": [1]})
+        db.get_df = mock.Mock(return_value=df)
+        PrestoEngineSpec.get_create_view = mock.Mock(return_value=None)
+        result = PrestoEngineSpec.extra_table_metadata(db, "test_table", "test_schema")
+        self.assertEqual({"ds": "01-01-19", "hour": 1}, result["partitions"]["latest"])
+
+    def test_presto_where_latest_partition(self):
+        db = mock.Mock()
+        db.get_indexes = mock.Mock(return_value=[{"column_names": ["ds", "hour"]}])
+        db.get_extra = mock.Mock(return_value={})
+        df = pd.DataFrame({"ds": ["01-01-19"], "hour": [1]})
+        db.get_df = mock.Mock(return_value=df)
+        columns = [{"name": "ds"}, {"name": "hour"}]
+        result = PrestoEngineSpec.where_latest_partition(
+            "test_table", "test_schema", db, select(), columns
+        )
+        query_result = str(result.compile(compile_kwargs={"literal_binds": True}))
+        self.assertEqual("SELECT  \nWHERE ds = '01-01-19' AND hour = 1", query_result)
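[editor's note] The `expand_data` tests above all follow the same pattern: a Presto `ROW(...)` column arrives as a positional list, and the spec flattens each field into a dotted child key alongside the original column. A minimal standalone sketch of that flattening, assuming simplified inputs — the real logic lives in `PrestoEngineSpec.expand_data` in `superset/db_engine_specs/presto.py`, and `expand_row_column`/`field_names` here are illustrative names, not Superset API:

```python
def expand_row_column(rows, col_name, field_names):
    """Flatten a Presto ROW(...) column into dotted child keys, in place.

    Each row holds the ROW value as a positional list; pairing it with the
    declared field names yields one "<col>.<field>" entry per nested field.
    """
    for row in rows:
        values = row.get(col_name) or []
        for field, value in zip(field_names, values):
            row[f"{col_name}.{field}"] = value
    return rows


# Mirrors the shape used in test_presto_expand_data_with_simple_structural_columns:
data = [{"row_column": ["a"]}, {"row_column": ["b"]}]
expand_row_column(data, "row_column", ["nested_obj"])
# data[0]["row_column.nested_obj"] == "a"
# data[1]["row_column.nested_obj"] == "b"
```

The original `row_column` entry is kept alongside the expanded key, matching the `expected_data` dictionaries in the tests above, which retain both `"row_column"` and `"row_column.nested_obj"`.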
diff --git a/tests/db_engine_specs_test.py b/tests/db_engine_specs_test.py
deleted file mode 100644
index 619ae4f..0000000
--- a/tests/db_engine_specs_test.py
+++ /dev/null
@@ -1,810 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-import unittest
-from unittest import mock
-
-import pandas as pd
-from sqlalchemy import column, literal_column, table
-from sqlalchemy.dialects import mssql, oracle, postgresql
-from sqlalchemy.engine.result import RowProxy
-from sqlalchemy.sql import select
-from sqlalchemy.types import String, UnicodeText
-
-from superset import app
-from superset.db_engine_specs import engines
-from superset.db_engine_specs.base import BaseEngineSpec, builtin_time_grains
-from superset.db_engine_specs.bigquery import BigQueryEngineSpec
-from superset.db_engine_specs.hive import HiveEngineSpec
-from superset.db_engine_specs.mssql import MssqlEngineSpec
-from superset.db_engine_specs.mysql import MySQLEngineSpec
-from superset.db_engine_specs.oracle import OracleEngineSpec
-from superset.db_engine_specs.pinot import PinotEngineSpec
-from superset.db_engine_specs.postgres import PostgresEngineSpec
-from superset.db_engine_specs.presto import PrestoEngineSpec
-from superset.db_engine_specs.sqlite import SqliteEngineSpec
-from superset.models.core import Database
-from superset.utils.core import get_example_database
-
-from .base_tests import SupersetTestCase
-
-
-class DbEngineSpecsTestCase(SupersetTestCase):
-    def test_0_progress(self):
-        log = """
-            17/02/07 18:26:27 INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
-            17/02/07 18:26:27 INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
-        """.split(
-            "\n"
-        )
-        self.assertEqual(0, HiveEngineSpec.progress(log))
-
-    def test_number_of_jobs_progress(self):
-        log = """
-            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
-        """.split(
-            "\n"
-        )
-        self.assertEqual(0, HiveEngineSpec.progress(log))
-
-    def test_job_1_launched_progress(self):
-        log = """
-            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
-            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
-        """.split(
-            "\n"
-        )
-        self.assertEqual(0, HiveEngineSpec.progress(log))
-
-    def test_job_1_launched_stage_1_0_progress(self):
-        log = """
-            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
-            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
-        """.split(
-            "\n"
-        )
-        self.assertEqual(0, HiveEngineSpec.progress(log))
-
-    def test_job_1_launched_stage_1_map_40_progress(self):
-        log = """
-            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
-            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 40%,  reduce = 0%
-        """.split(
-            "\n"
-        )
-        self.assertEqual(10, HiveEngineSpec.progress(log))
-
-    def test_job_1_launched_stage_1_map_80_reduce_40_progress(self):
-        log = """
-            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
-            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 40%,  reduce = 0%
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 80%,  reduce = 40%
-        """.split(
-            "\n"
-        )
-        self.assertEqual(30, HiveEngineSpec.progress(log))
-
-    def test_job_1_launched_stage_2_stages_progress(self):
-        log = """
-            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
-            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 40%,  reduce = 0%
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 80%,  reduce = 40%
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-2 map = 0%,  reduce = 0%
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 100%,  reduce = 0%
-        """.split(
-            "\n"
-        )
-        self.assertEqual(12, HiveEngineSpec.progress(log))
-
-    def test_job_2_launched_stage_2_stages_progress(self):
-        log = """
-            17/02/07 19:15:55 INFO ql.Driver: Total jobs = 2
-            17/02/07 19:15:55 INFO ql.Driver: Launching Job 1 out of 2
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 100%,  reduce = 0%
-            17/02/07 19:15:55 INFO ql.Driver: Launching Job 2 out of 2
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 0%,  reduce = 0%
-            17/02/07 19:16:09 INFO exec.Task: 2017-02-07 19:16:09,173 Stage-1 map = 40%,  reduce = 0%
-        """.split(
-            "\n"
-        )
-        self.assertEqual(60, HiveEngineSpec.progress(log))
-
-    def test_hive_error_msg(self):
-        msg = (
-            '{...} errorMessage="Error while compiling statement: FAILED: '
-            "SemanticException [Error 10001]: Line 4"
-            ":5 Table not found 'fact_ridesfdslakj'\", statusCode=3, "
-            "sqlState='42S02', errorCode=10001)){...}"
-        )
-        self.assertEqual(
-            (
-                "hive error: Error while compiling statement: FAILED: "
-                "SemanticException [Error 10001]: Line 4:5 "
-                "Table not found 'fact_ridesfdslakj'"
-            ),
-            HiveEngineSpec.extract_error_message(Exception(msg)),
-        )
-
-        e = Exception("Some string that doesn't match the regex")
-        self.assertEqual(f"hive error: {e}", HiveEngineSpec.extract_error_message(e))
-
-        msg = (
-            "errorCode=10001, "
-            'errorMessage="Error while compiling statement"), operationHandle'
-            '=None)"'
-        )
-        self.assertEqual(
-            ("hive error: Error while compiling statement"),
-            HiveEngineSpec.extract_error_message(Exception(msg)),
-        )
-
-    def get_generic_database(self):
-        return Database(database_name="test_database", sqlalchemy_uri="sqlite://")
-
-    def sql_limit_regex(
-        self, sql, expected_sql, engine_spec_class=MySQLEngineSpec, limit=1000
-    ):
-        main = self.get_generic_database()
-        limited = engine_spec_class.apply_limit_to_sql(sql, limit, main)
-        self.assertEqual(expected_sql, limited)
-
-    def test_extract_limit_from_query(self, engine_spec_class=MySQLEngineSpec):
-        q0 = "select * from table"
-        q1 = "select * from mytable limit 10"
-        q2 = "select * from (select * from my_subquery limit 10) where col=1 limit 20"
-        q3 = "select * from (select * from my_subquery limit 10);"
-        q4 = "select * from (select * from my_subquery limit 10) where col=1 limit 20;"
-        q5 = "select * from mytable limit 20, 10"
-        q6 = "select * from mytable limit 10 offset 20"
-        q7 = "select * from mytable limit"
-        q8 = "select * from mytable limit 10.0"
-        q9 = "select * from mytable limit x"
-        q10 = "select * from mytable limit 20, x"
-        q11 = "select * from mytable limit x offset 20"
-
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q0), None)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q1), 10)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q2), 20)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q3), None)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q4), 20)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q5), 10)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q6), 10)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q7), None)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q8), None)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q9), None)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q10), None)
-        self.assertEqual(engine_spec_class.get_limit_from_sql(q11), None)
-
-    def test_wrapped_query(self):
-        self.sql_limit_regex(
-            "SELECT * FROM a",
-            "SELECT * \nFROM (SELECT * FROM a) AS inner_qry\n LIMIT 1000 OFFSET 0",
-            MssqlEngineSpec,
-        )
-
-    @unittest.skipUnless(
-        SupersetTestCase.is_module_installed("MySQLdb"), "mysqlclient not installed"
-    )
-    def test_wrapped_semi_tabs(self):
-        self.sql_limit_regex(
-            "SELECT * FROM a  \t \n   ; \t  \n  ", "SELECT * FROM a\nLIMIT 1000"
-        )
-
-    def test_simple_limit_query(self):
-        self.sql_limit_regex("SELECT * FROM a", "SELECT * FROM a\nLIMIT 1000")
-
-    def test_modify_limit_query(self):
-        self.sql_limit_regex("SELECT * FROM a LIMIT 9999", "SELECT * FROM a LIMIT 1000")
-
-    def test_limit_query_with_limit_subquery(self):
-        self.sql_limit_regex(
-            "SELECT * FROM (SELECT * FROM a LIMIT 10) LIMIT 9999",
-            "SELECT * FROM (SELECT * FROM a LIMIT 10) LIMIT 1000",
-        )
-
-    def test_limit_with_expr(self):
-        self.sql_limit_regex(
-            """
-            SELECT
-                'LIMIT 777' AS a
-                , b
-            FROM
-            table
-            LIMIT 99990""",
-            """SELECT
-                'LIMIT 777' AS a
-                , b
-            FROM
-            table
-            LIMIT 1000""",
-        )
-
-    def test_limit_expr_and_semicolon(self):
-        self.sql_limit_regex(
-            """
-                SELECT
-                    'LIMIT 777' AS a
-                    , b
-                FROM
-                table
-                LIMIT         99990            ;""",
-            """SELECT
-                    'LIMIT 777' AS a
-                    , b
-                FROM
-                table
-                LIMIT         1000""",
-        )
-
-    @unittest.skipUnless(
-        SupersetTestCase.is_module_installed("MySQLdb"), "mysqlclient not installed"
-    )
-    def test_get_datatype_mysql(self):
-        self.assertEqual("TINY", MySQLEngineSpec.get_datatype(1))
-        self.assertEqual("VARCHAR", MySQLEngineSpec.get_datatype(15))
-
-    @unittest.skipUnless(
-        SupersetTestCase.is_module_installed("pyhive"), "pyhive not installed"
-    )
-    def test_get_datatype_presto(self):
-        self.assertEqual("STRING", PrestoEngineSpec.get_datatype("string"))
-
-    def test_get_datatype(self):
-        self.assertEqual("VARCHAR", BaseEngineSpec.get_datatype("VARCHAR"))
-
-    def test_limit_with_implicit_offset(self):
-        self.sql_limit_regex(
-            """
-                SELECT
-                    'LIMIT 777' AS a
-                    , b
-                FROM
-                table
-                LIMIT 99990, 999999""",
-            """SELECT
-                    'LIMIT 777' AS a
-                    , b
-                FROM
-                table
-                LIMIT 99990, 1000""",
-        )
-
-    def test_limit_with_explicit_offset(self):
-        self.sql_limit_regex(
-            """
-                SELECT
-                    'LIMIT 777' AS a
-                    , b
-                FROM
-                table
-                LIMIT 99990
-                OFFSET 999999""",
-            """SELECT
-                    'LIMIT 777' AS a
-                    , b
-                FROM
-                table
-                LIMIT 1000
-                OFFSET 999999""",
-        )
-
-    def test_limit_with_non_token_limit(self):
-        self.sql_limit_regex(
-            """SELECT 'LIMIT 777'""", """SELECT 'LIMIT 777'\nLIMIT 1000"""
-        )
-
-    def test_time_grain_blacklist(self):
-        with app.app_context():
-            app.config["TIME_GRAIN_BLACKLIST"] = ["PT1M"]
-            time_grain_functions = SqliteEngineSpec.get_time_grain_functions()
-            self.assertNotIn("PT1M", time_grain_functions)
-
-    def test_time_grain_addons(self):
-        with app.app_context():
-            app.config["TIME_GRAIN_ADDONS"] = {"PTXM": "x seconds"}
-            app.config["TIME_GRAIN_ADDON_FUNCTIONS"] = {
-                "sqlite": {"PTXM": "ABC({col})"}
-            }
-            time_grains = SqliteEngineSpec.get_time_grains()
-            time_grain_addon = time_grains[-1]
-            self.assertEqual("PTXM", time_grain_addon.duration)
-            self.assertEqual("x seconds", time_grain_addon.label)
-
-    def test_engine_time_grain_validity(self):
-        time_grains = set(builtin_time_grains.keys())
-        # loop over all subclasses of BaseEngineSpec
-        for engine in engines.values():
-            if engine is not BaseEngineSpec:
-                # make sure time grain functions have been defined
-                self.assertGreater(len(engine.get_time_grain_functions()), 0)
-                # make sure all defined time grains are supported
-                defined_grains = {grain.duration for grain in engine.get_time_grains()}
-                intersection = time_grains.intersection(defined_grains)
-                self.assertSetEqual(defined_grains, intersection, engine)
-
-    def test_presto_get_view_names_return_empty_list(self):
-        self.assertEqual(
-            [], PrestoEngineSpec.get_view_names(mock.ANY, mock.ANY, mock.ANY)
-        )
-
-    def verify_presto_column(self, column, expected_results):
-        inspector = mock.Mock()
-        inspector.engine.dialect.identifier_preparer.quote_identifier = mock.Mock()
-        keymap = {
-            "Column": (None, None, 0),
-            "Type": (None, None, 1),
-            "Null": (None, None, 2),
-        }
-        row = RowProxy(mock.Mock(), column, [None, None, None, None], keymap)
-        inspector.bind.execute = mock.Mock(return_value=[row])
-        results = PrestoEngineSpec.get_columns(inspector, "", "")
-        self.assertEqual(len(expected_results), len(results))
-        for expected_result, result in zip(expected_results, results):
-            self.assertEqual(expected_result[0], result["name"])
-            self.assertEqual(expected_result[1], str(result["type"]))
-
-    def test_presto_get_column(self):
-        presto_column = ("column_name", "boolean", "")
-        expected_results = [("column_name", "BOOLEAN")]
-        self.verify_presto_column(presto_column, expected_results)
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_get_simple_row_column(self):
-        presto_column = ("column_name", "row(nested_obj double)", "")
-        expected_results = [("column_name", "ROW"), ("column_name.nested_obj", "FLOAT")]
-        self.verify_presto_column(presto_column, expected_results)
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_get_simple_row_column_with_name_containing_whitespace(self):
-        presto_column = ("column name", "row(nested_obj double)", "")
-        expected_results = [("column name", "ROW"), ("column name.nested_obj", "FLOAT")]
-        self.verify_presto_column(presto_column, expected_results)
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_get_simple_row_column_with_tricky_nested_field_name(self):
-        presto_column = ("column_name", 'row("Field Name(Tricky, Name)" double)', "")
-        expected_results = [
-            ("column_name", "ROW"),
-            ('column_name."Field Name(Tricky, Name)"', "FLOAT"),
-        ]
-        self.verify_presto_column(presto_column, expected_results)
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_get_simple_array_column(self):
-        presto_column = ("column_name", "array(double)", "")
-        expected_results = [("column_name", "ARRAY")]
-        self.verify_presto_column(presto_column, expected_results)
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_get_row_within_array_within_row_column(self):
-        presto_column = (
-            "column_name",
-            "row(nested_array array(row(nested_row double)), nested_obj double)",
-            "",
-        )
-        expected_results = [
-            ("column_name", "ROW"),
-            ("column_name.nested_array", "ARRAY"),
-            ("column_name.nested_array.nested_row", "FLOAT"),
-            ("column_name.nested_obj", "FLOAT"),
-        ]
-        self.verify_presto_column(presto_column, expected_results)
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_get_array_within_row_within_array_column(self):
-        presto_column = (
-            "column_name",
-            "array(row(nested_array array(double), nested_obj double))",
-            "",
-        )
-        expected_results = [
-            ("column_name", "ARRAY"),
-            ("column_name.nested_array", "ARRAY"),
-            ("column_name.nested_obj", "FLOAT"),
-        ]
-        self.verify_presto_column(presto_column, expected_results)
-
-    def test_presto_get_fields(self):
-        cols = [
-            {"name": "column"},
-            {"name": "column.nested_obj"},
-            {"name": 'column."quoted.nested obj"'},
-        ]
-        actual_results = PrestoEngineSpec._get_fields(cols)
-        expected_results = [
-            {"name": '"column"', "label": "column"},
-            {"name": '"column"."nested_obj"', "label": "column.nested_obj"},
-            {
-                "name": '"column"."quoted.nested obj"',
-                "label": 'column."quoted.nested obj"',
-            },
-        ]
-        for actual_result, expected_result in zip(actual_results, expected_results):
-            self.assertEqual(actual_result.element.name, expected_result["name"])
-            self.assertEqual(actual_result.name, expected_result["label"])
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_expand_data_with_simple_structural_columns(self):
-        cols = [
-            {"name": "row_column", "type": "ROW(NESTED_OBJ VARCHAR)"},
-            {"name": "array_column", "type": "ARRAY(BIGINT)"},
-        ]
-        data = [
-            {"row_column": ["a"], "array_column": [1, 2, 3]},
-            {"row_column": ["b"], "array_column": [4, 5, 6]},
-        ]
-        actual_cols, actual_data, actual_expanded_cols = PrestoEngineSpec.expand_data(
-            cols, data
-        )
-        expected_cols = [
-            {"name": "row_column", "type": "ROW(NESTED_OBJ VARCHAR)"},
-            {"name": "row_column.nested_obj", "type": "VARCHAR"},
-            {"name": "array_column", "type": "ARRAY(BIGINT)"},
-        ]
-
-        expected_data = [
-            {"array_column": 1, "row_column": ["a"], "row_column.nested_obj": "a"},
-            {"array_column": 2, "row_column": "", "row_column.nested_obj": ""},
-            {"array_column": 3, "row_column": "", "row_column.nested_obj": ""},
-            {"array_column": 4, "row_column": ["b"], "row_column.nested_obj": "b"},
-            {"array_column": 5, "row_column": "", "row_column.nested_obj": ""},
-            {"array_column": 6, "row_column": "", "row_column.nested_obj": ""},
-        ]
-
-        expected_expanded_cols = [{"name": "row_column.nested_obj", "type": "VARCHAR"}]
-        self.assertEqual(actual_cols, expected_cols)
-        self.assertEqual(actual_data, expected_data)
-        self.assertEqual(actual_expanded_cols, expected_expanded_cols)
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_expand_data_with_complex_row_columns(self):
-        cols = [
-            {
-                "name": "row_column",
-                "type": "ROW(NESTED_OBJ1 VARCHAR, NESTED_ROW ROW(NESTED_OBJ2 VARCHAR))",
-            }
-        ]
-        data = [{"row_column": ["a1", ["a2"]]}, {"row_column": ["b1", ["b2"]]}]
-        actual_cols, actual_data, actual_expanded_cols = PrestoEngineSpec.expand_data(
-            cols, data
-        )
-        expected_cols = [
-            {
-                "name": "row_column",
-                "type": "ROW(NESTED_OBJ1 VARCHAR, NESTED_ROW ROW(NESTED_OBJ2 VARCHAR))",
-            },
-            {"name": "row_column.nested_row", "type": "ROW(NESTED_OBJ2 VARCHAR)"},
-            {"name": "row_column.nested_row.nested_obj2", "type": "VARCHAR"},
-            {"name": "row_column.nested_obj1", "type": "VARCHAR"},
-        ]
-        expected_data = [
-            {
-                "row_column": ["a1", ["a2"]],
-                "row_column.nested_obj1": "a1",
-                "row_column.nested_row": ["a2"],
-                "row_column.nested_row.nested_obj2": "a2",
-            },
-            {
-                "row_column": ["b1", ["b2"]],
-                "row_column.nested_obj1": "b1",
-                "row_column.nested_row": ["b2"],
-                "row_column.nested_row.nested_obj2": "b2",
-            },
-        ]
-
-        expected_expanded_cols = [
-            {"name": "row_column.nested_obj1", "type": "VARCHAR"},
-            {"name": "row_column.nested_row", "type": "ROW(NESTED_OBJ2 VARCHAR)"},
-            {"name": "row_column.nested_row.nested_obj2", "type": "VARCHAR"},
-        ]
-        self.assertEqual(actual_cols, expected_cols)
-        self.assertEqual(actual_data, expected_data)
-        self.assertEqual(actual_expanded_cols, expected_expanded_cols)
-
-    @mock.patch.dict(
-        "superset._feature_flags", {"PRESTO_EXPAND_DATA": True}, clear=True
-    )
-    def test_presto_expand_data_with_complex_array_columns(self):
-        cols = [
-            {"name": "int_column", "type": "BIGINT"},
-            {
-                "name": "array_column",
-                "type": "ARRAY(ROW(NESTED_ARRAY ARRAY(ROW(NESTED_OBJ VARCHAR))))",
-            },
-        ]
-        data = [
-            {"int_column": 1, "array_column": [[[["a"], ["b"]]], [[["c"], ["d"]]]]},
-            {"int_column": 2, "array_column": [[[["e"], ["f"]]], [[["g"], ["h"]]]]},
-        ]
-        actual_cols, actual_data, actual_expanded_cols = PrestoEngineSpec.expand_data(
-            cols, data
-        )
-        expected_cols = [
-            {"name": "int_column", "type": "BIGINT"},
-            {
-                "name": "array_column",
-                "type": "ARRAY(ROW(NESTED_ARRAY ARRAY(ROW(NESTED_OBJ VARCHAR))))",
-            },
-            {
-                "name": "array_column.nested_array",
-                "type": "ARRAY(ROW(NESTED_OBJ VARCHAR))",
-            },
-            {"name": "array_column.nested_array.nested_obj", "type": "VARCHAR"},
-        ]
-        expected_data = [
-            {
-                "array_column": [[["a"], ["b"]]],
-                "array_column.nested_array": ["a"],
-                "array_column.nested_array.nested_obj": "a",
-                "int_column": 1,
-            },
-            {
-                "array_column": "",
-                "array_column.nested_array": ["b"],
-                "array_column.nested_array.nested_obj": "b",
-                "int_column": "",
-            },
-            {
-                "array_column": [[["c"], ["d"]]],
-                "array_column.nested_array": ["c"],
-                "array_column.nested_array.nested_obj": "c",
-                "int_column": "",
-            },
-            {
-                "array_column": "",
-                "array_column.nested_array": ["d"],
-                "array_column.nested_array.nested_obj": "d",
-                "int_column": "",
-            },
-            {
-                "array_column": [[["e"], ["f"]]],
-                "array_column.nested_array": ["e"],
-                "array_column.nested_array.nested_obj": "e",
-                "int_column": 2,
-            },
-            {
-                "array_column": "",
-                "array_column.nested_array": ["f"],
-                "array_column.nested_array.nested_obj": "f",
-                "int_column": "",
-            },
-            {
-                "array_column": [[["g"], ["h"]]],
-                "array_column.nested_array": ["g"],
-                "array_column.nested_array.nested_obj": "g",
-                "int_column": "",
-            },
-            {
-                "array_column": "",
-                "array_column.nested_array": ["h"],
-                "array_column.nested_array.nested_obj": "h",
-                "int_column": "",
-            },
-        ]
-        expected_expanded_cols = [
-            {
-                "name": "array_column.nested_array",
-                "type": "ARRAY(ROW(NESTED_OBJ VARCHAR))",
-            },
-            {"name": "array_column.nested_array.nested_obj", "type": "VARCHAR"},
-        ]
-        self.assertEqual(actual_cols, expected_cols)
-        self.assertEqual(actual_data, expected_data)
-        self.assertEqual(actual_expanded_cols, expected_expanded_cols)
-
-    def test_presto_extra_table_metadata(self):
-        db = mock.Mock()
-        db.get_indexes = mock.Mock(return_value=[{"column_names": ["ds", "hour"]}])
-        db.get_extra = mock.Mock(return_value={})
-        df = pd.DataFrame({"ds": ["01-01-19"], "hour": [1]})
-        db.get_df = mock.Mock(return_value=df)
-        PrestoEngineSpec.get_create_view = mock.Mock(return_value=None)
-        result = PrestoEngineSpec.extra_table_metadata(db, "test_table", "test_schema")
-        self.assertEqual({"ds": "01-01-19", "hour": 1}, result["partitions"]["latest"])
-
-    def test_presto_where_latest_partition(self):
-        db = mock.Mock()
-        db.get_indexes = mock.Mock(return_value=[{"column_names": ["ds", "hour"]}])
-        db.get_extra = mock.Mock(return_value={})
-        df = pd.DataFrame({"ds": ["01-01-19"], "hour": [1]})
-        db.get_df = mock.Mock(return_value=df)
-        columns = [{"name": "ds"}, {"name": "hour"}]
-        result = PrestoEngineSpec.where_latest_partition(
-            "test_table", "test_schema", db, select(), columns
-        )
-        query_result = str(result.compile(compile_kwargs={"literal_binds": True}))
-        self.assertEqual("SELECT  \nWHERE ds = '01-01-19' AND hour = 1", query_result)
-
-    def test_hive_get_view_names_return_empty_list(self):
-        self.assertEqual(
-            [], HiveEngineSpec.get_view_names(mock.ANY, mock.ANY, mock.ANY)
-        )
-
-    def test_bigquery_sqla_column_label(self):
-        label = BigQueryEngineSpec.make_label_compatible(column("Col").name)
-        label_expected = "Col"
-        self.assertEqual(label, label_expected)
-
-        label = BigQueryEngineSpec.make_label_compatible(column("SUM(x)").name)
-        label_expected = "SUM_x__5f110"
-        self.assertEqual(label, label_expected)
-
-        label = BigQueryEngineSpec.make_label_compatible(column("SUM[x]").name)
-        label_expected = "SUM_x__7ebe1"
-        self.assertEqual(label, label_expected)
-
-        label = BigQueryEngineSpec.make_label_compatible(column("12345_col").name)
-        label_expected = "_12345_col_8d390"
-        self.assertEqual(label, label_expected)
-
-    def test_oracle_sqla_column_name_length_exceeded(self):
-        col = column("This_Is_32_Character_Column_Name")
-        label = OracleEngineSpec.make_label_compatible(col.name)
-        self.assertEqual(label.quote, True)
-        label_expected = "3b26974078683be078219674eeb8f5"
-        self.assertEqual(label, label_expected)
-
-    def test_mssql_column_types(self):
-        def assert_type(type_string, type_expected):
-            type_assigned = MssqlEngineSpec.get_sqla_column_type(type_string)
-            if type_expected is None:
-                self.assertIsNone(type_assigned)
-            else:
-                self.assertIsInstance(type_assigned, type_expected)
-
-        assert_type("INT", None)
-        assert_type("STRING", String)
-        assert_type("CHAR(10)", String)
-        assert_type("VARCHAR(10)", String)
-        assert_type("TEXT", String)
-        assert_type("NCHAR(10)", UnicodeText)
-        assert_type("NVARCHAR(10)", UnicodeText)
-        assert_type("NTEXT", UnicodeText)
-
-    def test_mssql_where_clause_n_prefix(self):
-        dialect = mssql.dialect()
-        spec = MssqlEngineSpec
-        str_col = column("col", type_=spec.get_sqla_column_type("VARCHAR(10)"))
-        unicode_col = column("unicode_col", type_=spec.get_sqla_column_type("NTEXT"))
-        tbl = table("tbl")
-        sel = (
-            select([str_col, unicode_col])
-            .select_from(tbl)
-            .where(str_col == "abc")
-            .where(unicode_col == "abc")
-        )
-
-        query = str(
-            sel.compile(dialect=dialect, compile_kwargs={"literal_binds": True})
-        )
-        query_expected = (
-            "SELECT col, unicode_col \n"
-            "FROM tbl \n"
-            "WHERE col = 'abc' AND unicode_col = N'abc'"
-        )
-        self.assertEqual(query, query_expected)
-
-    def test_get_table_names(self):
-        inspector = mock.Mock()
-        inspector.get_table_names = mock.Mock(return_value=["schema.table", "table_2"])
-        inspector.get_foreign_table_names = mock.Mock(return_value=["table_3"])
-
-        """ Make sure base engine spec removes schema name from table name
-        ie. when try_remove_schema_from_table_name == True. """
-        base_result_expected = ["table", "table_2"]
-        base_result = BaseEngineSpec.get_table_names(
-            database=mock.ANY, schema="schema", inspector=inspector
-        )
-        self.assertListEqual(base_result_expected, base_result)
-
-        """ Make sure postgres doesn't try to remove schema name from table name
-        ie. when try_remove_schema_from_table_name == False. """
-        pg_result_expected = ["schema.table", "table_2", "table_3"]
-        pg_result = PostgresEngineSpec.get_table_names(
-            database=mock.ANY, schema="schema", inspector=inspector
-        )
-        self.assertListEqual(pg_result_expected, pg_result)
-
-    def test_pg_time_expression_literal_no_grain(self):
-        col = literal_column("COALESCE(a, b)")
-        expr = PostgresEngineSpec.get_timestamp_expr(col, None, None)
-        result = str(expr.compile(dialect=postgresql.dialect()))
-        self.assertEqual(result, "COALESCE(a, b)")
-
-    def test_pg_time_expression_literal_1y_grain(self):
-        col = literal_column("COALESCE(a, b)")
-        expr = PostgresEngineSpec.get_timestamp_expr(col, None, "P1Y")
-        result = str(expr.compile(dialect=postgresql.dialect()))
-        self.assertEqual(result, "DATE_TRUNC('year', COALESCE(a, b))")
-
-    def test_pg_time_expression_lower_column_no_grain(self):
-        col = column("lower_case")
-        expr = PostgresEngineSpec.get_timestamp_expr(col, None, None)
-        result = str(expr.compile(dialect=postgresql.dialect()))
-        self.assertEqual(result, "lower_case")
-
-    def test_pg_time_expression_lower_case_column_sec_1y_grain(self):
-        col = column("lower_case")
-        expr = PostgresEngineSpec.get_timestamp_expr(col, "epoch_s", "P1Y")
-        result = str(expr.compile(dialect=postgresql.dialect()))
-        self.assertEqual(
-            result,
-            "DATE_TRUNC('year', (timestamp 'epoch' + lower_case * interval '1 second'))",
-        )
-
-    def test_pg_time_expression_mixed_case_column_1y_grain(self):
-        col = column("MixedCase")
-        expr = PostgresEngineSpec.get_timestamp_expr(col, None, "P1Y")
-        result = str(expr.compile(dialect=postgresql.dialect()))
-        self.assertEqual(result, "DATE_TRUNC('year', \"MixedCase\")")
-
-    def test_mssql_time_expression_mixed_case_column_1y_grain(self):
-        col = column("MixedCase")
-        expr = MssqlEngineSpec.get_timestamp_expr(col, None, "P1Y")
-        result = str(expr.compile(dialect=mssql.dialect()))
-        self.assertEqual(result, "DATEADD(year, DATEDIFF(year, 0, [MixedCase]), 0)")
-
-    def test_oracle_time_expression_reserved_keyword_1m_grain(self):
-        col = column("decimal")
-        expr = OracleEngineSpec.get_timestamp_expr(col, None, "P1M")
-        result = str(expr.compile(dialect=oracle.dialect()))
-        self.assertEqual(result, "TRUNC(CAST(\"decimal\" as DATE), 'MONTH')")
-
-    def test_pinot_time_expression_sec_1m_grain(self):
-        col = column("tstamp")
-        expr = PinotEngineSpec.get_timestamp_expr(col, "epoch_s", "P1M")
-        result = str(expr.compile())
-        self.assertEqual(
-            result,
-            'DATETIMECONVERT(tstamp, "1:SECONDS:EPOCH", "1:SECONDS:EPOCH", "1:MONTHS")',
-        )
-
-    def test_column_datatype_to_string(self):
-        example_db = get_example_database()
-        sqla_table = example_db.get_table("energy_usage")
-        dialect = example_db.get_dialect()
-        col_names = [
-            example_db.db_engine_spec.column_datatype_to_string(c.type, dialect)
-            for c in sqla_table.columns
-        ]
-        if example_db.backend == "postgresql":
-            expected = ["VARCHAR(255)", "VARCHAR(255)", "DOUBLE PRECISION"]
-        else:
-            expected = ["VARCHAR(255)", "VARCHAR(255)", "FLOAT"]
-        self.assertEqual(col_names, expected)


[incubator-superset] 09/22: [datasource editor] Only one click target for edit action (#8495)

Posted by vi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 27612a16c1935d55f075375a8dc68ad02b9d9d1f
Author: Grace Guo <gr...@airbnb.com>
AuthorDate: Fri Nov 1 17:22:38 2019 -0700

    [datasource editor] Only one click target for edit action (#8495)
---
 .../components/controls/DatasourceControl.css      | 35 +++++++++++
 .../components/controls/DatasourceControl.jsx      | 71 ++++++++++------------
 2 files changed, 66 insertions(+), 40 deletions(-)

diff --git a/superset/assets/src/explore/components/controls/DatasourceControl.css b/superset/assets/src/explore/components/controls/DatasourceControl.css
new file mode 100644
index 0000000..87ea089
--- /dev/null
+++ b/superset/assets/src/explore/components/controls/DatasourceControl.css
@@ -0,0 +1,35 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+#datasource_menu {
+    border-radius: 2px;
+    padding-left: 8px;
+    padding-right: 8px;
+}
+
+#datasource_menu .caret {
+    position: relative;
+    padding-right: 8px;
+    margin-left: 4px;
+    color: #fff;
+    top: -8px;
+}
+
+#datasource_menu + ul {
+    margin-top: 26px;
+}
diff --git a/superset/assets/src/explore/components/controls/DatasourceControl.jsx b/superset/assets/src/explore/components/controls/DatasourceControl.jsx
index fc04ee9..a2a35c2 100644
--- a/superset/assets/src/explore/components/controls/DatasourceControl.jsx
+++ b/superset/assets/src/explore/components/controls/DatasourceControl.jsx
@@ -36,6 +36,8 @@ import ColumnOption from '../../../components/ColumnOption';
 import MetricOption from '../../../components/MetricOption';
 import DatasourceModal from '../../../datasource/DatasourceModal';
 import ChangeDatasourceModal from '../../../datasource/ChangeDatasourceModal';
+import TooltipWrapper from '../../../components/TooltipWrapper';
+import './DatasourceControl.css';
 
 const propTypes = {
   onChange: PropTypes.func,
@@ -115,56 +117,45 @@ class DatasourceControl extends React.PureComponent {
   }
 
   render() {
-    const { menuExpanded, showChangeDatasourceModal, showEditDatasourceModal } = this.state;
+    const { showChangeDatasourceModal, showEditDatasourceModal } = this.state;
     const { datasource, onChange, onDatasourceSave, value } = this.props;
     return (
       <div>
         <ControlHeader {...this.props} />
         <div className="btn-group label-dropdown">
-          <OverlayTrigger
-            placement="right"
-            overlay={
-              <Tooltip id={'error-tooltip'}>{t('Click to change the datasource')}</Tooltip>
-            }
-          >
-            <div className="btn-group">
-              <Label onClick={this.toggleChangeDatasourceModal} className="label-btn-label">
-                {datasource.name}
-              </Label>
-            </div>
-          </OverlayTrigger>
-          <DropdownButton
-            noCaret
-            title={
-              <span>
-                <i className={`float-right expander fa fa-angle-${menuExpanded ? 'up' : 'down'}`} />
-              </span>}
-            className="label label-btn m-r-5"
-            bsSize="sm"
-            id="datasource_menu"
+          <TooltipWrapper
+            label="change-datasource"
+            tooltip={t('Click to change the datasource')}
           >
-            <MenuItem
-              eventKey="3"
-              onClick={this.toggleChangeDatasourceModal}
+            <DropdownButton
+              title={datasource.name}
+              className="label label-default label-btn m-r-5"
+              bsSize="sm"
+              id="datasource_menu"
             >
-              {t('Change Datasource')}
-            </MenuItem>
-            {datasource.type === 'table' &&
               <MenuItem
                 eventKey="3"
-                href={`/superset/sqllab?datasourceKey=${value}`}
-                target="_blank"
-                rel="noopener noreferrer"
+                onClick={this.toggleChangeDatasourceModal}
               >
-                {t('Explore in SQL Lab')}
-              </MenuItem>}
-            <MenuItem
-              eventKey="3"
-              onClick={this.toggleEditDatasourceModal}
-            >
-              {t('Edit Datasource')}
-            </MenuItem>
-          </DropdownButton>
+                {t('Change Datasource')}
+              </MenuItem>
+              {datasource.type === 'table' &&
+                <MenuItem
+                  eventKey="3"
+                  href={`/superset/sqllab?datasourceKey=${value}`}
+                  target="_blank"
+                  rel="noopener noreferrer"
+                >
+                  {t('Explore in SQL Lab')}
+                </MenuItem>}
+              <MenuItem
+                eventKey="3"
+                onClick={this.toggleEditDatasourceModal}
+              >
+                {t('Edit Datasource')}
+              </MenuItem>
+            </DropdownButton>
+          </TooltipWrapper>
           <OverlayTrigger
             placement="right"
             overlay={


[incubator-superset] 12/22: Math.max(...array) considered harmful (#8575)

Posted by vi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 7db52832f226470b70c820d1d2aaf5722f7b7aff
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Thu Nov 14 17:07:18 2019 -0800

    Math.max(...array) considered harmful (#8575)
    
    * Do not use Math.max
    
    * Small fix
---
 superset/assets/src/components/FilterableTable/FilterableTable.jsx | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/superset/assets/src/components/FilterableTable/FilterableTable.jsx b/superset/assets/src/components/FilterableTable/FilterableTable.jsx
index b8c09ef..3f4b4f4 100644
--- a/superset/assets/src/components/FilterableTable/FilterableTable.jsx
+++ b/superset/assets/src/components/FilterableTable/FilterableTable.jsx
@@ -170,10 +170,13 @@ export default class FilterableTable extends PureComponent {
     ).map(dimension => dimension.width);
 
     this.props.orderedColumnKeys.forEach((key, index) => {
-      widthsByColumnKey[key] = Math.max(...colWidths.slice(
+      // we can't use Math.max(...colWidths.slice(...)) here since the number
+      // of elements might be bigger than the number of allowed arguments in a
+      // Javascript function
+      widthsByColumnKey[key] = colWidths.slice(
         index * (this.list.size + 1),
         (index + 1) * (this.list.size + 1),
-      )) + PADDING;
+      ).reduce((a, b) => Math.max(a, b)) + PADDING;
     });
 
     return widthsByColumnKey;
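
The comment in the patch above points at a real JavaScript limit: `Math.max(...arr)` passes every array element as a separate function argument, and engines cap the number of arguments a call may receive, so very large arrays can throw a `RangeError`. A minimal standalone sketch of the reduce-based alternative the patch adopts (the helper name is illustrative, not from the patch):

```javascript
// Math.max(...arr) spreads one argument per element and can fail for very
// large arrays; reducing pairwise visits elements one at a time instead.
function maxOf(arr) {
  if (arr.length === 0) {
    throw new Error('maxOf requires a non-empty array');
  }
  // with no initial value, reduce seeds the accumulator with arr[0]
  return arr.reduce((a, b) => Math.max(a, b));
}

const widths = [120, 45, 300, 87];
console.log(maxOf(widths)); // 300
```

The empty-array guard matters because `reduce` without an initial value throws on an empty array, just less descriptively.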


[incubator-superset] 06/22: [setup] Fix, download_url (#8439)

Posted by vi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit 3e6a206c1e459d578a399a12f04258a45c1de2bd
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Fri Oct 25 08:59:06 2019 +0100

    [setup] Fix, download_url (#8439)
    
    * [setup] Fix, download_url
---
 setup.py | 4 +---
 1 file changed, 1 insertion(+), 3 deletions(-)

diff --git a/setup.py b/setup.py
index 5f616a2..a07cd7b 100644
--- a/setup.py
+++ b/setup.py
@@ -121,9 +121,7 @@ setup(
     author="Apache Software Foundation",
     author_email="dev@superset.incubator.apache.org",
     url="https://superset.apache.org/",
-    download_url=(
-        "https://dist.apache.org/repos/dist/release/superset/" + version_string
-    ),
+    download_url="https://www.apache.org/dist/incubator/superset/" + version_string,
     classifiers=[
         "Programming Language :: Python :: 3.6",
         "Programming Language :: Python :: 3.7",


[incubator-superset] 03/22: explain the need to enable async queries (#8444)

Posted by vi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

villebro pushed a commit to branch 0.35
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git

commit ba8bb6b4d79fc1b8e28c68cee2ad8b6e29264431
Author: Christoph Lingg <ch...@lingg.eu>
AuthorDate: Thu Oct 24 18:52:16 2019 +0200

    explain the need to enable async queries (#8444)
---
 docs/installation.rst | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/docs/installation.rst b/docs/installation.rst
index 9360437..b454e1d 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -888,6 +888,9 @@ cache store when upgrading an existing environment.
   entire setup. If not, background jobs can get scheduled multiple times
   resulting in weird behaviors like duplicate delivery of reports,
   higher than expected load / traffic etc.
+  
+* SQL Lab will only run your queries asynchronously if you enable 
+  "Asynchronous Query Execution" in your database settings.
 
 
 Email Reports