Posted to commits@superset.apache.org by mi...@apache.org on 2024/02/01 19:53:21 UTC

(superset) branch 3.1 updated (6cdaf479f2 -> 86794cbb5e)

This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a change to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git


    from 6cdaf479f2 fix(sqllab): autosync fail on migrated queryEditor (#26922)
     new 1d9cfdabd1 feat(sqlparse): improve table parsing (#26476)
     new 1be9eb577a fix(deck.gl Multiple Layer Chart): Add Contour and Heatmap Layer as options (#25923)
     new ba9032e8d0 fix: prevent guest user from modifying metrics (#26749)
     new 81353e280b fix: Bar charts horizontal margin adjustment error (#26817)
     new 61b4017b2f fix(pinot): typo in the name for epoch_ms_to_dttm (#26906)
     new de61591a6a fix: handle CRLF endings causing sqlglot failure (#26911)
     new 322812b1c8 fix: dashboard import validation (#26887)
     new e8a3c5d5f1 fix: Allow exporting saved queries without schema information (#26889)
     new 86794cbb5e fix(cache): remove unused webserver config & handle trailing slashes (#22849)

The 9 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 requirements/base.txt                              |  15 +-
 requirements/testing.txt                           |   6 +-
 setup.py                                           |   1 +
 .../legacy-preset-chart-deckgl/src/index.ts        |   2 +
 .../src/layers/Arc/Arc.tsx                         |   8 +-
 .../src/layers/Geojson/Geojson.tsx                 |   4 +-
 .../src/layers/Heatmap/Heatmap.tsx                 |   2 +-
 .../src/layers/Path/Path.tsx                       |   2 +-
 .../src/layers/Polygon/Polygon.tsx                 |  10 +-
 .../src/layers/Scatter/Scatter.tsx                 |  10 +-
 .../src/layers/Screengrid/Screengrid.tsx           |   4 +-
 .../legacy-preset-chart-deckgl/src/layers/index.ts |   4 +
 .../src/Timeseries/transformProps.ts               |   1 +
 .../src/Timeseries/transformers.ts                 |  42 ++--
 .../plugin-chart-echarts/src/utils/series.ts       |  14 ++
 .../plugin-chart-echarts/test/utils/series.test.ts |  32 +++
 superset/commands/dashboard/importers/v1/utils.py  |   8 +-
 superset/commands/dataset/duplicate.py             |   5 +-
 superset/commands/query/export.py                  |  11 +-
 superset/commands/sql_lab/export.py                |   5 +-
 superset/common/query_object.py                    |   2 +-
 superset/config.py                                 |   4 -
 superset/connectors/sqla/models.py                 |   2 +-
 superset/connectors/sqla/utils.py                  |   2 +-
 superset/db_engine_specs/base.py                   |  12 +-
 superset/db_engine_specs/bigquery.py               |   2 +-
 superset/db_engine_specs/pinot.py                  |   2 +-
 superset/models/helpers.py                         |   2 +-
 superset/models/sql_lab.py                         |   6 +-
 superset/security/manager.py                       |  31 ++-
 superset/sql_lab.py                                |  11 +-
 superset/sql_parse.py                              | 248 ++++++++++++++-------
 superset/sql_validators/presto_db.py               |   4 +-
 superset/sqllab/query_render.py                    |   6 +-
 superset/tasks/cache.py                            |   4 +-
 .../db_engine_specs/pinot_tests.py                 |  14 ++
 .../security/guest_token_security_tests.py         |  15 ++
 tests/integration_tests/superset_test_config.py    |   1 -
 .../superset_test_config_thumbnails.py             |   1 -
 tests/integration_tests/tasks/test_cache.py        |  58 +++++
 .../commands/importers/v1/import_test.py           | 110 ++++++++-
 tests/unit_tests/security/manager_test.py          |  76 +++++++
 tests/unit_tests/sql_parse_tests.py                |  55 ++++-
 43 files changed, 675 insertions(+), 179 deletions(-)
 create mode 100644 tests/integration_tests/tasks/test_cache.py


(superset) 01/09: feat(sqlparse): improve table parsing (#26476)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 1d9cfdabd1816c71f716c8d7e213558d5b7ff05e
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Mon Jan 22 11:16:50 2024 -0500

    feat(sqlparse): improve table parsing (#26476)
    
    (cherry picked from commit c0b57bd1c3d487a661315a1944aca9f9ce728d51)
---
 requirements/base.txt                  |  15 +-
 requirements/testing.txt               |   6 +-
 setup.py                               |   1 +
 superset/commands/dataset/duplicate.py |   5 +-
 superset/commands/sql_lab/export.py    |   5 +-
 superset/connectors/sqla/models.py     |   2 +-
 superset/connectors/sqla/utils.py      |   2 +-
 superset/db_engine_specs/base.py       |  12 +-
 superset/db_engine_specs/bigquery.py   |   2 +-
 superset/models/helpers.py             |   2 +-
 superset/models/sql_lab.py             |   6 +-
 superset/security/manager.py           |   5 +-
 superset/sql_lab.py                    |  11 +-
 superset/sql_parse.py                  | 246 ++++++++++++++++++++++-----------
 superset/sql_validators/presto_db.py   |   4 +-
 superset/sqllab/query_render.py        |   6 +-
 tests/unit_tests/sql_parse_tests.py    |  55 ++++++--
 17 files changed, 265 insertions(+), 120 deletions(-)
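
For context, a minimal sketch (not part of the diff) of the engine-aware API
this commit introduces. The optional ``engine`` argument selects a sqlglot
dialect via the new SQLGLOT_DIALECTS mapping; the table name below is
illustrative:

    from superset.sql_parse import ParsedQuery, Table

    parsed = ParsedQuery(
        "SELECT * FROM schema1.my_table LIMIT 10",
        engine="postgresql",  # mapped to Dialects.POSTGRES internally
    )
    assert parsed.tables == {Table("my_table", "schema1")}
    assert parsed.limit == 10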

diff --git a/requirements/base.txt b/requirements/base.txt
index 2c20094f72..da95b8c571 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -141,7 +141,9 @@ geographiclib==1.52
 geopy==2.2.0
     # via apache-superset
 greenlet==2.0.2
-    # via shillelagh
+    # via
+    #   shillelagh
+    #   sqlalchemy
 gunicorn==21.2.0
     # via apache-superset
 hashids==1.3.1
@@ -155,7 +157,10 @@ idna==3.2
     #   email-validator
     #   requests
 importlib-metadata==6.6.0
-    # via apache-superset
+    # via
+    #   apache-superset
+    #   flask
+    #   shillelagh
 importlib-resources==5.12.0
     # via limits
 isodate==0.6.0
@@ -327,6 +332,8 @@ sqlalchemy-utils==0.38.3
     # via
     #   apache-superset
     #   flask-appbuilder
+sqlglot==20.8.0
+    # via apache-superset
 sqlparse==0.4.4
     # via apache-superset
 sshtunnel==0.4.0
@@ -376,7 +383,9 @@ wtforms-json==0.3.5
 xlsxwriter==3.0.7
     # via apache-superset
 zipp==3.15.0
-    # via importlib-metadata
+    # via
+    #   importlib-metadata
+    #   importlib-resources
 
 # The following packages are considered to be unsafe in a requirements file:
 # setuptools
diff --git a/requirements/testing.txt b/requirements/testing.txt
index 3bf3c78d03..b40497c8fc 100644
--- a/requirements/testing.txt
+++ b/requirements/testing.txt
@@ -24,10 +24,6 @@ db-dtypes==1.1.1
     # via pandas-gbq
 docker==6.1.1
     # via -r requirements/testing.in
-exceptiongroup==1.1.1
-    # via pytest
-ephem==4.1.4
-    # via lunarcalendar
 flask-testing==0.8.1
     # via -r requirements/testing.in
 fonttools==4.39.4
@@ -121,6 +117,8 @@ pyee==9.0.4
     # via playwright
 pyfakefs==5.2.2
     # via -r requirements/testing.in
+pyhive[presto]==0.7.0
+    # via apache-superset
 pytest==7.3.1
     # via
     #   -r requirements/testing.in
diff --git a/setup.py b/setup.py
index 913cfc5c35..996d6dfe47 100644
--- a/setup.py
+++ b/setup.py
@@ -125,6 +125,7 @@ setup(
         "slack_sdk>=3.19.0, <4",
         "sqlalchemy>=1.4, <2",
         "sqlalchemy-utils>=0.38.3, <0.39",
+        "sqlglot>=20,<21",
         "sqlparse>=0.4.4, <0.5",
         "tabulate>=0.8.9, <0.9",
         "typing-extensions>=4, <5",
diff --git a/superset/commands/dataset/duplicate.py b/superset/commands/dataset/duplicate.py
index 0ae47c35bc..850290422e 100644
--- a/superset/commands/dataset/duplicate.py
+++ b/superset/commands/dataset/duplicate.py
@@ -70,7 +70,10 @@ class DuplicateDatasetCommand(CreateMixin, BaseCommand):
             table.normalize_columns = self._base_model.normalize_columns
             table.always_filter_main_dttm = self._base_model.always_filter_main_dttm
             table.is_sqllab_view = True
-            table.sql = ParsedQuery(self._base_model.sql).stripped()
+            table.sql = ParsedQuery(
+                self._base_model.sql,
+                engine=database.db_engine_spec.engine,
+            ).stripped()
             db.session.add(table)
             cols = []
             for config_ in self._base_model.columns:
diff --git a/superset/commands/sql_lab/export.py b/superset/commands/sql_lab/export.py
index 1b9b0e0344..aa6050f27f 100644
--- a/superset/commands/sql_lab/export.py
+++ b/superset/commands/sql_lab/export.py
@@ -115,7 +115,10 @@ class SqlResultExportCommand(BaseCommand):
                 limit = None
             else:
                 sql = self._query.executed_sql
-                limit = ParsedQuery(sql).limit
+                limit = ParsedQuery(
+                    sql,
+                    engine=self._query.database.db_engine_spec.engine,
+                ).limit
             if limit is not None and self._query.limiting_factor in {
                 LimitingFactor.QUERY,
                 LimitingFactor.DROPDOWN,
diff --git a/superset/connectors/sqla/models.py b/superset/connectors/sqla/models.py
index 598bc6741b..bd54032d5d 100644
--- a/superset/connectors/sqla/models.py
+++ b/superset/connectors/sqla/models.py
@@ -1458,7 +1458,7 @@ class SqlaTable(
             return self.get_sqla_table(), None
 
         from_sql = self.get_rendered_sql(template_processor)
-        parsed_query = ParsedQuery(from_sql)
+        parsed_query = ParsedQuery(from_sql, engine=self.db_engine_spec.engine)
         if not (
             parsed_query.is_unknown()
             or self.db_engine_spec.is_readonly_query(parsed_query)
diff --git a/superset/connectors/sqla/utils.py b/superset/connectors/sqla/utils.py
index 66594084c8..688be53515 100644
--- a/superset/connectors/sqla/utils.py
+++ b/superset/connectors/sqla/utils.py
@@ -111,7 +111,7 @@ def get_virtual_table_metadata(dataset: SqlaTable) -> list[ResultSetColumnType]:
     sql = dataset.get_template_processor().process_template(
         dataset.sql, **dataset.template_params_dict
     )
-    parsed_query = ParsedQuery(sql)
+    parsed_query = ParsedQuery(sql, engine=db_engine_spec.engine)
     if not db_engine_spec.is_readonly_query(parsed_query):
         raise SupersetSecurityException(
             SupersetError(
diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py
index 6bc8c444d6..5edd4b7c06 100644
--- a/superset/db_engine_specs/base.py
+++ b/superset/db_engine_specs/base.py
@@ -900,7 +900,7 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
             return database.compile_sqla_query(qry)
 
         if cls.limit_method == LimitMethod.FORCE_LIMIT:
-            parsed_query = sql_parse.ParsedQuery(sql)
+            parsed_query = sql_parse.ParsedQuery(sql, engine=cls.engine)
             sql = parsed_query.set_or_update_query_limit(limit, force=force)
 
         return sql
@@ -981,7 +981,7 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         :param sql: SQL query
         :return: Value of limit clause in query
         """
-        parsed_query = sql_parse.ParsedQuery(sql)
+        parsed_query = sql_parse.ParsedQuery(sql, engine=cls.engine)
         return parsed_query.limit
 
     @classmethod
@@ -993,7 +993,7 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         :param limit: New limit to insert/replace into query
         :return: Query with new limit
         """
-        parsed_query = sql_parse.ParsedQuery(sql)
+        parsed_query = sql_parse.ParsedQuery(sql, engine=cls.engine)
         return parsed_query.set_or_update_query_limit(limit)
 
     @classmethod
@@ -1490,7 +1490,7 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         :param database: Database instance
         :return: Dictionary with different costs
         """
-        parsed_query = ParsedQuery(statement)
+        parsed_query = ParsedQuery(statement, engine=cls.engine)
         sql = parsed_query.stripped()
         sql_query_mutator = current_app.config["SQL_QUERY_MUTATOR"]
         mutate_after_split = current_app.config["MUTATE_AFTER_SPLIT"]
@@ -1525,7 +1525,7 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
                 "Database does not support cost estimation"
             )
 
-        parsed_query = sql_parse.ParsedQuery(sql)
+        parsed_query = sql_parse.ParsedQuery(sql, engine=cls.engine)
         statements = parsed_query.get_statements()
 
         costs = []
@@ -1586,7 +1586,7 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         :return:
         """
         if not cls.allows_sql_comments:
-            query = sql_parse.strip_comments_from_sql(query)
+            query = sql_parse.strip_comments_from_sql(query, engine=cls.engine)
 
         if cls.arraysize:
             cursor.arraysize = cls.arraysize
diff --git a/superset/db_engine_specs/bigquery.py b/superset/db_engine_specs/bigquery.py
index 8e7ed0bf7d..a8d834276e 100644
--- a/superset/db_engine_specs/bigquery.py
+++ b/superset/db_engine_specs/bigquery.py
@@ -435,7 +435,7 @@ class BigQueryEngineSpec(BaseEngineSpec):  # pylint: disable=too-many-public-met
         if not cls.get_allow_cost_estimate(extra):
             raise SupersetException("Database does not support cost estimation")
 
-        parsed_query = sql_parse.ParsedQuery(sql)
+        parsed_query = sql_parse.ParsedQuery(sql, engine=cls.engine)
         statements = parsed_query.get_statements()
         costs = []
         for statement in statements:
diff --git a/superset/models/helpers.py b/superset/models/helpers.py
index 7bfbbddfd3..65d0b53728 100644
--- a/superset/models/helpers.py
+++ b/superset/models/helpers.py
@@ -1094,7 +1094,7 @@ class ExploreMixin:  # pylint: disable=too-many-public-methods
         """
 
         from_sql = self.get_rendered_sql(template_processor)
-        parsed_query = ParsedQuery(from_sql)
+        parsed_query = ParsedQuery(from_sql, engine=self.db_engine_spec.engine)
         if not (
             parsed_query.is_unknown()
             or self.db_engine_spec.is_readonly_query(parsed_query)
diff --git a/superset/models/sql_lab.py b/superset/models/sql_lab.py
index 7e63e984df..aff8c1ce3d 100644
--- a/superset/models/sql_lab.py
+++ b/superset/models/sql_lab.py
@@ -183,7 +183,7 @@ class Query(
 
     @property
     def sql_tables(self) -> list[Table]:
-        return list(ParsedQuery(self.sql).tables)
+        return list(ParsedQuery(self.sql, engine=self.db_engine_spec.engine).tables)
 
     @property
     def columns(self) -> list["TableColumn"]:
@@ -427,7 +427,9 @@ class SavedQuery(Model, AuditMixinNullable, ExtraJSONMixin, ImportExportMixin):
 
     @property
     def sql_tables(self) -> list[Table]:
-        return list(ParsedQuery(self.sql).tables)
+        return list(
+            ParsedQuery(self.sql, engine=self.database.db_engine_spec.engine).tables
+        )
 
     @property
     def last_run_humanized(self) -> str:
diff --git a/superset/security/manager.py b/superset/security/manager.py
index b8978104d9..ef0a5f9a49 100644
--- a/superset/security/manager.py
+++ b/superset/security/manager.py
@@ -1877,7 +1877,10 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
                 default_schema = database.get_default_schema_for_query(query)
                 tables = {
                     Table(table_.table, table_.schema or default_schema)
-                    for table_ in sql_parse.ParsedQuery(query.sql).tables
+                    for table_ in sql_parse.ParsedQuery(
+                        query.sql,
+                        engine=database.db_engine_spec.engine,
+                    ).tables
                 }
             elif table:
                 tables = {table}
diff --git a/superset/sql_lab.py b/superset/sql_lab.py
index efbef6560a..e9b4d406f8 100644
--- a/superset/sql_lab.py
+++ b/superset/sql_lab.py
@@ -208,7 +208,7 @@ def execute_sql_statement(  # pylint: disable=too-many-arguments, too-many-local
     database: Database = query.database
     db_engine_spec = database.db_engine_spec
 
-    parsed_query = ParsedQuery(sql_statement)
+    parsed_query = ParsedQuery(sql_statement, engine=db_engine_spec.engine)
     if is_feature_enabled("RLS_IN_SQLLAB"):
         # There are two ways to insert RLS: either replacing the table with a subquery
         # that has the RLS, or appending the RLS to the ``WHERE`` clause. The former is
@@ -228,7 +228,8 @@ def execute_sql_statement(  # pylint: disable=too-many-arguments, too-many-local
                     database.id,
                     query.schema,
                 )
-            )
+            ),
+            engine=db_engine_spec.engine,
         )
 
     sql = parsed_query.stripped()
@@ -419,7 +420,11 @@ def execute_sql_statements(
         )
 
     # Breaking down into multiple statements
-    parsed_query = ParsedQuery(rendered_query, strip_comments=True)
+    parsed_query = ParsedQuery(
+        rendered_query,
+        strip_comments=True,
+        engine=db_engine_spec.engine,
+    )
     if not db_engine_spec.run_multiple_statements_as_one:
         statements = parsed_query.get_statements()
         logger.info(
diff --git a/superset/sql_parse.py b/superset/sql_parse.py
index cecd673276..07704171de 100644
--- a/superset/sql_parse.py
+++ b/superset/sql_parse.py
@@ -14,15 +14,22 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+
+# pylint: disable=too-many-lines
+
 import logging
 import re
-from collections.abc import Iterator
+import urllib.parse
+from collections.abc import Iterable, Iterator
 from dataclasses import dataclass
 from typing import Any, cast, Optional
-from urllib import parse
 
 import sqlparse
 from sqlalchemy import and_
+from sqlglot import exp, parse, parse_one
+from sqlglot.dialects import Dialects
+from sqlglot.errors import ParseError
+from sqlglot.optimizer.scope import Scope, ScopeType, traverse_scope
 from sqlparse import keywords
 from sqlparse.lexer import Lexer
 from sqlparse.sql import (
@@ -53,7 +60,7 @@ from superset.utils.backports import StrEnum
 
 try:
     from sqloxide import parse_sql as sqloxide_parse
-except:  # pylint: disable=bare-except
+except (ImportError, ModuleNotFoundError):
     sqloxide_parse = None
 
 RESULT_OPERATIONS = {"UNION", "INTERSECT", "EXCEPT", "SELECT"}
@@ -72,6 +79,59 @@ sqlparser_sql_regex.insert(25, (r"'(''|\\\\|\\|[^'])*'", sqlparse.tokens.String.
 lex.set_SQL_REGEX(sqlparser_sql_regex)
 
 
+# mapping between DB engine specs and sqlglot dialects
+SQLGLOT_DIALECTS = {
+    "ascend": Dialects.HIVE,
+    "awsathena": Dialects.PRESTO,
+    "bigquery": Dialects.BIGQUERY,
+    "clickhouse": Dialects.CLICKHOUSE,
+    "clickhousedb": Dialects.CLICKHOUSE,
+    "cockroachdb": Dialects.POSTGRES,
+    # "crate": ???
+    # "databend": ???
+    "databricks": Dialects.DATABRICKS,
+    # "db2": ???
+    # "dremio": ???
+    "drill": Dialects.DRILL,
+    # "druid": ???
+    "duckdb": Dialects.DUCKDB,
+    # "dynamodb": ???
+    # "elasticsearch": ???
+    # "exa": ???
+    # "firebird": ???
+    # "firebolt": ???
+    "gsheets": Dialects.SQLITE,
+    "hana": Dialects.POSTGRES,
+    "hive": Dialects.HIVE,
+    # "ibmi": ???
+    # "impala": ???
+    # "kustokql": ???
+    # "kylin": ???
+    # "mssql": ???
+    "mysql": Dialects.MYSQL,
+    "netezza": Dialects.POSTGRES,
+    # "ocient": ???
+    # "odelasticsearch": ???
+    "oracle": Dialects.ORACLE,
+    # "pinot": ???
+    "postgresql": Dialects.POSTGRES,
+    "presto": Dialects.PRESTO,
+    "pydoris": Dialects.DORIS,
+    "redshift": Dialects.REDSHIFT,
+    # "risingwave": ???
+    # "rockset": ???
+    "shillelagh": Dialects.SQLITE,
+    "snowflake": Dialects.SNOWFLAKE,
+    # "solr": ???
+    "sqlite": Dialects.SQLITE,
+    "starrocks": Dialects.STARROCKS,
+    "superset": Dialects.SQLITE,
+    "teradatasql": Dialects.TERADATA,
+    "trino": Dialects.TRINO,
+    "vertica": Dialects.POSTGRES,
+}
+
+
 class CtasMethod(StrEnum):
     TABLE = "TABLE"
     VIEW = "VIEW"
@@ -150,7 +210,7 @@ def get_cte_remainder_query(sql: str) -> tuple[Optional[str], str]:
     return cte, remainder
 
 
-def strip_comments_from_sql(statement: str) -> str:
+def strip_comments_from_sql(statement: str, engine: Optional[str] = None) -> str:
     """
     Strips comments from a SQL statement, does a simple test first
     to avoid always instantiating the expensive ParsedQuery constructor
@@ -160,7 +220,11 @@ def strip_comments_from_sql(statement: str) -> str:
     :param statement: A string with the SQL statement
     :return: SQL statement without comments
     """
-    return ParsedQuery(statement).strip_comments() if "--" in statement else statement
+    return (
+        ParsedQuery(statement, engine=engine).strip_comments()
+        if "--" in statement
+        else statement
+    )
 
 
 @dataclass(eq=True, frozen=True)
@@ -179,7 +243,7 @@ class Table:
         """
 
         return ".".join(
-            parse.quote(part, safe="").replace(".", "%2E")
+            urllib.parse.quote(part, safe="").replace(".", "%2E")
             for part in [self.catalog, self.schema, self.table]
             if part
         )
@@ -189,11 +253,17 @@ class Table:
 
 
 class ParsedQuery:
-    def __init__(self, sql_statement: str, strip_comments: bool = False):
+    def __init__(
+        self,
+        sql_statement: str,
+        strip_comments: bool = False,
+        engine: Optional[str] = None,
+    ):
         if strip_comments:
             sql_statement = sqlparse.format(sql_statement, strip_comments=True)
 
         self.sql: str = sql_statement
+        self._dialect = SQLGLOT_DIALECTS.get(engine) if engine else None
         self._tables: set[Table] = set()
         self._alias_names: set[str] = set()
         self._limit: Optional[int] = None
@@ -206,14 +276,94 @@ class ParsedQuery:
     @property
     def tables(self) -> set[Table]:
         if not self._tables:
-            for statement in self._parsed:
-                self._extract_from_token(statement)
-
-            self._tables = {
-                table for table in self._tables if str(table) not in self._alias_names
-            }
+            self._tables = self._extract_tables_from_sql()
         return self._tables
 
+    def _extract_tables_from_sql(self) -> set[Table]:
+        """
+        Extract all table references in a query.
+
+        Note: this uses sqlglot, since it catches more edge cases.
+        """
+        try:
+            statements = parse(self.sql, dialect=self._dialect)
+        except ParseError:
+            logger.warning("Unable to parse SQL (%s): %s", self._dialect, self.sql)
+            return set()
+
+        return {
+            table
+            for statement in statements
+            for table in self._extract_tables_from_statement(statement)
+            if statement
+        }
+
+    def _extract_tables_from_statement(self, statement: exp.Expression) -> set[Table]:
+        """
+        Extract all table references in a single statement.
+
+        Please note that this is not trivial; consider the following queries:
+
+            DESCRIBE some_table;
+            SHOW PARTITIONS FROM some_table;
+            WITH masked_name AS (SELECT * FROM some_table) SELECT * FROM masked_name;
+
+        See the unit tests for other tricky cases.
+        """
+        sources: Iterable[exp.Table]
+
+        if isinstance(statement, exp.Describe):
+            # A `DESCRIBE` query has no sources in sqlglot, so we need to explicitly
+            # query for all tables.
+            sources = statement.find_all(exp.Table)
+        elif isinstance(statement, exp.Command):
+            # Commands, like `SHOW COLUMNS FROM foo`, have to be converted into a
+        # `SELECT` statement in order to extract tables.
+            literal = statement.find(exp.Literal)
+            if not literal:
+                return set()
+
+            pseudo_query = parse_one(f"SELECT {literal.this}", dialect=self._dialect)
+            sources = pseudo_query.find_all(exp.Table)
+        else:
+            sources = [
+                source
+                for scope in traverse_scope(statement)
+                for source in scope.sources.values()
+                if isinstance(source, exp.Table) and not self._is_cte(source, scope)
+            ]
+
+        return {
+            Table(
+                source.name,
+                source.db if source.db != "" else None,
+                source.catalog if source.catalog != "" else None,
+            )
+            for source in sources
+        }
+
+    def _is_cte(self, source: exp.Table, scope: Scope) -> bool:
+        """
+        Is the source a CTE?
+
+        CTEs in the parent scope look like tables (and are represented by
+        exp.Table objects), but should not be considered as such;
+        otherwise a user with access to table `foo` could access any table
+        with a query like this:
+
+            WITH foo AS (SELECT * FROM target_table) SELECT * FROM foo
+
+        """
+        parent_sources = scope.parent.sources if scope.parent else {}
+        ctes_in_scope = {
+            name
+            for name, parent_scope in parent_sources.items()
+            if isinstance(parent_scope, Scope)
+            and parent_scope.scope_type == ScopeType.CTE
+        }
+
+        return source.name in ctes_in_scope
+
     @property
     def limit(self) -> Optional[int]:
         return self._limit
@@ -393,28 +543,6 @@ class ParsedQuery:
     def _is_identifier(token: Token) -> bool:
         return isinstance(token, (IdentifierList, Identifier))
 
-    def _process_tokenlist(self, token_list: TokenList) -> None:
-        """
-        Add table names to table set
-
-        :param token_list: TokenList to be processed
-        """
-        # exclude subselects
-        if "(" not in str(token_list):
-            table = self.get_table(token_list)
-            if table and not table.table.startswith(CTE_PREFIX):
-                self._tables.add(table)
-            return
-
-        # store aliases
-        if token_list.has_alias():
-            self._alias_names.add(token_list.get_alias())
-
-        # some aliases are not parsed properly
-        if token_list.tokens[0].ttype == Name:
-            self._alias_names.add(token_list.tokens[0].value)
-        self._extract_from_token(token_list)
-
     def as_create_table(
         self,
         table_name: str,
@@ -441,50 +569,6 @@ class ParsedQuery:
         exec_sql += f"CREATE {method} {full_table_name} AS \n{sql}"
         return exec_sql
 
-    def _extract_from_token(self, token: Token) -> None:
-        """
-        <Identifier> store a list of subtokens and <IdentifierList> store lists of
-        subtoken list.
-
-        It extracts <IdentifierList> and <Identifier> from :param token: and loops
-        through all subtokens recursively. It finds table_name_preceding_token and
-        passes <IdentifierList> and <Identifier> to self._process_tokenlist to populate
-        self._tables.
-
-        :param token: instance of Token or child class, e.g. TokenList, to be processed
-        """
-        if not hasattr(token, "tokens"):
-            return
-
-        table_name_preceding_token = False
-
-        for item in token.tokens:
-            if item.is_group and (
-                not self._is_identifier(item) or isinstance(item.tokens[0], Parenthesis)
-            ):
-                self._extract_from_token(item)
-
-            if item.ttype in Keyword and (
-                item.normalized in PRECEDES_TABLE_NAME
-                or item.normalized.endswith(" JOIN")
-            ):
-                table_name_preceding_token = True
-                continue
-
-            if item.ttype in Keyword:
-                table_name_preceding_token = False
-                continue
-            if table_name_preceding_token:
-                if isinstance(item, Identifier):
-                    self._process_tokenlist(item)
-                elif isinstance(item, IdentifierList):
-                    for token2 in item.get_identifiers():
-                        if isinstance(token2, TokenList):
-                            self._process_tokenlist(token2)
-            elif isinstance(item, IdentifierList):
-                if any(not self._is_identifier(token2) for token2 in item.tokens):
-                    self._extract_from_token(item)
-
     def set_or_update_query_limit(self, new_limit: int, force: bool = False) -> str:
         """Returns the query with the specified limit.
 
@@ -881,7 +965,7 @@ def insert_rls_in_predicate(
 
 
 # mapping between sqloxide and SQLAlchemy dialects
-SQLOXITE_DIALECTS = {
+SQLOXIDE_DIALECTS = {
     "ansi": {"trino", "trinonative", "presto"},
     "hive": {"hive", "databricks"},
     "ms": {"mssql"},
@@ -914,7 +998,7 @@ def extract_table_references(
     tree = None
 
     if sqloxide_parse:
-        for dialect, sqla_dialects in SQLOXITE_DIALECTS.items():
+        for dialect, sqla_dialects in SQLOXIDE_DIALECTS.items():
             if sqla_dialect in sqla_dialects:
                 break
         sql_text = RE_JINJA_BLOCK.sub(" ", sql_text)
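
A hedged illustration (not part of the diff) of the CTE handling above, based
on the ``_is_cte`` docstring and the unit tests in this commit:

    from superset.sql_parse import ParsedQuery, Table

    # A CTE named ``foo`` must not mask the real table it reads from: the
    # permission check sees ``target_table``, while the outer reference to
    # ``foo`` resolves to the CTE scope and is excluded from the result.
    sql = "WITH foo AS (SELECT * FROM target_table) SELECT * FROM foo"
    assert ParsedQuery(sql).tables == {Table("target_table")}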
diff --git a/superset/sql_validators/presto_db.py b/superset/sql_validators/presto_db.py
index c01b938671..fed1ff3bfa 100644
--- a/superset/sql_validators/presto_db.py
+++ b/superset/sql_validators/presto_db.py
@@ -50,7 +50,7 @@ class PrestoDBSQLValidator(BaseSQLValidator):
     ) -> Optional[SQLValidationAnnotation]:
         # pylint: disable=too-many-locals
         db_engine_spec = database.db_engine_spec
-        parsed_query = ParsedQuery(statement)
+        parsed_query = ParsedQuery(statement, engine=db_engine_spec.engine)
         sql = parsed_query.stripped()
 
         # Hook to allow environment-specific mutation (usually comments) to the SQL
@@ -154,7 +154,7 @@ class PrestoDBSQLValidator(BaseSQLValidator):
         For example, "SELECT 1 FROM default.mytable" becomes "EXPLAIN (TYPE
     VALIDATE) SELECT 1 FROM default.mytable".
         """
-        parsed_query = ParsedQuery(sql)
+        parsed_query = ParsedQuery(sql, engine=database.db_engine_spec.engine)
         statements = parsed_query.get_statements()
 
         logger.info("Validating %i statement(s)", len(statements))
diff --git a/superset/sqllab/query_render.py b/superset/sqllab/query_render.py
index f4c1c26c6e..5597bcb086 100644
--- a/superset/sqllab/query_render.py
+++ b/superset/sqllab/query_render.py
@@ -58,7 +58,11 @@ class SqlQueryRenderImpl(SqlQueryRender):
                 database=query_model.database, query=query_model
             )
 
-            parsed_query = ParsedQuery(query_model.sql, strip_comments=True)
+            parsed_query = ParsedQuery(
+                query_model.sql,
+                strip_comments=True,
+                engine=query_model.database.db_engine_spec.engine,
+            )
             rendered_query = sql_template_processor.process_template(
                 parsed_query.stripped(), **execution_context.template_params
             )
diff --git a/tests/unit_tests/sql_parse_tests.py b/tests/unit_tests/sql_parse_tests.py
index efd8838101..f650b77734 100644
--- a/tests/unit_tests/sql_parse_tests.py
+++ b/tests/unit_tests/sql_parse_tests.py
@@ -40,11 +40,11 @@ from superset.sql_parse import (
 )
 
 
-def extract_tables(query: str) -> set[Table]:
+def extract_tables(query: str, engine: Optional[str] = None) -> set[Table]:
     """
     Helper function to extract tables referenced in a query.
     """
-    return ParsedQuery(query).tables
+    return ParsedQuery(query, engine=engine).tables
 
 
 def test_table() -> None:
@@ -96,8 +96,13 @@ def test_extract_tables() -> None:
         Table("left_table")
     }
 
-    # reverse select
-    assert extract_tables("FROM t1 SELECT field") == {Table("t1")}
+    assert extract_tables(
+        "SELECT FROM (SELECT FROM forbidden_table) AS forbidden_table;"
+    ) == {Table("forbidden_table")}
+
+    assert extract_tables(
+        "select * from (select * from forbidden_table) forbidden_table"
+    ) == {Table("forbidden_table")}
 
 
 def test_extract_tables_subselect() -> None:
@@ -263,14 +268,16 @@ def test_extract_tables_illdefined() -> None:
     assert extract_tables("SELECT * FROM schemaname.") == set()
     assert extract_tables("SELECT * FROM catalogname.schemaname.") == set()
     assert extract_tables("SELECT * FROM catalogname..") == set()
-    assert extract_tables("SELECT * FROM catalogname..tbname") == set()
+    assert extract_tables("SELECT * FROM catalogname..tbname") == {
+        Table(table="tbname", schema=None, catalog="catalogname")
+    }
 
 
 def test_extract_tables_show_tables_from() -> None:
     """
     Test ``SHOW TABLES FROM``.
     """
-    assert extract_tables("SHOW TABLES FROM s1 like '%order%'") == set()
+    assert extract_tables("SHOW TABLES FROM s1 like '%order%'", "mysql") == set()
 
 
 def test_extract_tables_show_columns_from() -> None:
@@ -311,7 +318,7 @@ WHERE regionkey IN (SELECT regionkey FROM t2)
             """
 SELECT name
 FROM t1
-WHERE regionkey EXISTS (SELECT regionkey FROM t2)
+WHERE EXISTS (SELECT 1 FROM t2 WHERE t1.regionkey = t2.regionkey);
 """
         )
         == {Table("t1"), Table("t2")}
@@ -526,6 +533,18 @@ select * from (select key from q1) a
         == {Table("src")}
     )
 
+    # weird query with circular dependency
+    assert (
+        extract_tables(
+            """
+with src as ( select key from q2 where key = '5'),
+q2 as ( select key from src where key = '5')
+select * from (select key from src) a
+"""
+        )
+        == set()
+    )
+
 
 def test_extract_tables_multistatement() -> None:
     """
@@ -665,7 +684,8 @@ def test_extract_tables_nested_select() -> None:
 select (extractvalue(1,concat(0x7e,(select GROUP_CONCAT(TABLE_NAME)
 from INFORMATION_SCHEMA.COLUMNS
 WHERE TABLE_SCHEMA like "%bi%"),0x7e)));
-"""
+""",
+            "mysql",
         )
         == {Table("COLUMNS", "INFORMATION_SCHEMA")}
     )
@@ -676,7 +696,8 @@ WHERE TABLE_SCHEMA like "%bi%"),0x7e)));
 select (extractvalue(1,concat(0x7e,(select GROUP_CONCAT(COLUMN_NAME)
 from INFORMATION_SCHEMA.COLUMNS
 WHERE TABLE_NAME="bi_achievement_daily"),0x7e)));
-"""
+""",
+            "mysql",
         )
         == {Table("COLUMNS", "INFORMATION_SCHEMA")}
     )
@@ -1306,6 +1327,14 @@ def test_sqlparse_issue_652():
             "(SELECT table_name FROM /**/ information_schema.tables WHERE table_name LIKE '%user%' LIMIT 1)",
             True,
         ),
+        (
+            "SELECT FROM (SELECT FROM forbidden_table) AS forbidden_table;",
+            True,
+        ),
+        (
+            "SELECT * FROM (SELECT * FROM forbidden_table) forbidden_table",
+            True,
+        ),
     ],
 )
 def test_has_table_query(sql: str, expected: bool) -> None:
@@ -1790,13 +1819,17 @@ def test_extract_table_references(mocker: MockerFixture) -> None:
     assert extract_table_references(
         sql,
         "trino",
-    ) == {Table(table="other_table", schema=None, catalog=None)}
+    ) == {
+        Table(table="table", schema=None, catalog=None),
+        Table(table="other_table", schema=None, catalog=None),
+    }
     logger.warning.assert_called_once()
 
     logger = mocker.patch("superset.migrations.shared.utils.logger")
     sql = "SELECT * FROM table UNION ALL SELECT * FROM other_table"
     assert extract_table_references(sql, "trino", show_warning=False) == {
-        Table(table="other_table", schema=None, catalog=None)
+        Table(table="table", schema=None, catalog=None),
+        Table(table="other_table", schema=None, catalog=None),
     }
     logger.warning.assert_not_called()
 


(superset) 03/09: fix: prevent guest user from modifying metrics (#26749)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit ba9032e8d0f2789de86502c8604ab7de88ed932a
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Mon Jan 29 12:59:33 2024 -0500

    fix: prevent guest user from modifying metrics (#26749)
    
    (cherry picked from commit fade4806ceebde32a775c04d86a46c7e93bc371f)
---
 superset/common/query_object.py                    |  2 +-
 superset/security/manager.py                       | 26 +++++++-
 .../security/guest_token_security_tests.py         | 15 +++++
 tests/unit_tests/security/manager_test.py          | 76 ++++++++++++++++++++++
 4 files changed, 117 insertions(+), 2 deletions(-)

diff --git a/superset/common/query_object.py b/superset/common/query_object.py
index 989df5775b..5109c465e0 100644
--- a/superset/common/query_object.py
+++ b/superset/common/query_object.py
@@ -125,7 +125,7 @@ class QueryObject:  # pylint: disable=too-many-instance-attributes
         order_desc: bool = True,
         orderby: list[OrderBy] | None = None,
         post_processing: list[dict[str, Any] | None] | None = None,
-        row_limit: int | None,
+        row_limit: int | None = None,
         row_offset: int | None = None,
         series_columns: list[Column] | None = None,
         series_limit: int = 0,
diff --git a/superset/security/manager.py b/superset/security/manager.py
index ef0a5f9a49..28046d0fef 100644
--- a/superset/security/manager.py
+++ b/superset/security/manager.py
@@ -1819,7 +1819,7 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
         return []
 
     def raise_for_access(
-        # pylint: disable=too-many-arguments,too-many-branches,too-many-locals
+        # pylint: disable=too-many-arguments,too-many-branches,too-many-locals,too-many-statements
         self,
         dashboard: Optional["Dashboard"] = None,
         database: Optional["Database"] = None,
@@ -1909,6 +1909,30 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
                     self.get_table_access_error_object(denied)
                 )
 
+        if self.is_guest_user() and query_context:
+            # Guest users MUST NOT modify the payload to request a different
+            # chart or different ad-hoc metrics from what's saved.
+            form_data = query_context.form_data
+            stored_chart = query_context.slice_
+
+            if (
+                form_data is None
+                or stored_chart is None
+                or form_data.get("slice_id") != stored_chart.id
+                or form_data.get("metrics", []) != stored_chart.params_dict["metrics"]
+                or any(
+                    query.metrics != stored_chart.params_dict["metrics"]
+                    for query in query_context.queries
+                )
+            ):
+                raise SupersetSecurityException(
+                    SupersetError(
+                        error_type=SupersetErrorType.DASHBOARD_SECURITY_ACCESS_ERROR,
+                        message=_("Guest user cannot modify chart payload"),
+                        level=ErrorLevel.ERROR,
+                    )
+                )
+
         if datasource or query_context or viz:
             form_data = None
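
A condensed, hedged sketch of the guard added above (the helper name is
illustrative; the logic mirrors the diff):

    def guest_payload_tampered(query_context) -> bool:
        """Return True when a guest request diverges from the saved chart."""
        form_data = query_context.form_data
        stored_chart = query_context.slice_
        if form_data is None or stored_chart is None:
            return True
        stored_metrics = stored_chart.params_dict["metrics"]
        return (
            form_data.get("slice_id") != stored_chart.id
            or form_data.get("metrics", []) != stored_metrics
            or any(q.metrics != stored_metrics for q in query_context.queries)
        )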
 
diff --git a/tests/integration_tests/security/guest_token_security_tests.py b/tests/integration_tests/security/guest_token_security_tests.py
index 4cd8a77f76..b812929433 100644
--- a/tests/integration_tests/security/guest_token_security_tests.py
+++ b/tests/integration_tests/security/guest_token_security_tests.py
@@ -288,7 +288,10 @@ class TestGuestUserDatasourceAccess(SupersetTestCase):
                         form_data={
                             "dashboardId": self.dash.id,
                             "slice_id": self.chart.id,
+                            "metrics": self.chart.params_dict["metrics"],
                         },
+                        slice_=self.chart,
+                        queries=[],
                     )
                 }
             )
@@ -304,7 +307,11 @@ class TestGuestUserDatasourceAccess(SupersetTestCase):
                             "dashboardId": self.dash.id,
                             "native_filter_id": "NATIVE_FILTER-ABCDEFGH",
                             "type": "NATIVE_FILTER",
+                            "slice_id": self.chart.id,
+                            "metrics": self.chart.params_dict["metrics"],
                         },
+                        slice_=self.chart,
+                        queries=[],
                     )
                 }
             )
@@ -382,7 +389,11 @@ class TestGuestUserDatasourceAccess(SupersetTestCase):
                             form_data={
                                 "dashboardId": self.dash.id,
                                 "type": "NATIVE_FILTER",
+                                "slice_id": self.chart.id,
+                                "metrics": self.chart.params_dict["metrics"],
                             },
+                            slice_=self.chart,
+                            queries=[],
                         )
                     }
                 )
@@ -399,7 +410,11 @@ class TestGuestUserDatasourceAccess(SupersetTestCase):
                                 "dashboardId": self.dash.id,
                                 "native_filter_id": "NATIVE_FILTER-ABCDEFGH",
                                 "type": "NATIVE_FILTER",
+                                "slice_id": self.chart.id,
+                                "metrics": self.chart.params_dict["metrics"],
                             },
+                            slice_=self.chart,
+                            queries=[],
                         )
                     }
                 )
diff --git a/tests/unit_tests/security/manager_test.py b/tests/unit_tests/security/manager_test.py
index 1843e7261c..ad6e53e993 100644
--- a/tests/unit_tests/security/manager_test.py
+++ b/tests/unit_tests/security/manager_test.py
@@ -18,6 +18,7 @@
 import pytest
 from pytest_mock import MockFixture
 
+from superset.common.query_object import QueryObject
 from superset.exceptions import SupersetSecurityException
 from superset.extensions import appbuilder
 from superset.security.manager import SupersetSecurityManager
@@ -31,6 +32,81 @@ def test_security_manager(app_context: None) -> None:
     assert sm
 
 
+def test_raise_for_access_guest_user(
+    mocker: MockFixture,
+    app_context: None,
+) -> None:
+    """
+    Test that guest user can't modify chart payload.
+    """
+    sm = SupersetSecurityManager(appbuilder)
+    mocker.patch.object(sm, "is_guest_user", return_value=True)
+    mocker.patch.object(sm, "can_access", return_value=True)
+
+    query_context = mocker.MagicMock()
+    query_context.slice_.id = 42
+    stored_metrics = [
+        {
+            "aggregate": None,
+            "column": None,
+            "datasourceWarning": False,
+            "expressionType": "SQL",
+            "hasCustomLabel": False,
+            "label": "COUNT(*) + 1",
+            "optionName": "metric_ssa1gwimio_cxpyjc7vj3s",
+            "sqlExpression": "COUNT(*) + 1",
+        }
+    ]
+    query_context.slice_.params_dict = {
+        "metrics": stored_metrics,
+    }
+
+    # normal request
+    query_context.form_data = {
+        "slice_id": 42,
+        "metrics": stored_metrics,
+    }
+    query_context.queries = [QueryObject(metrics=stored_metrics)]  # type: ignore
+    sm.raise_for_access(query_context=query_context)
+
+    # tampered requests
+    query_context.form_data = {
+        "slice_id": 43,
+        "metrics": stored_metrics,
+    }
+    query_context.queries = [QueryObject(metrics=stored_metrics)]  # type: ignore
+    with pytest.raises(SupersetSecurityException):
+        sm.raise_for_access(query_context=query_context)
+
+    tampered_metrics = [
+        {
+            "aggregate": None,
+            "column": None,
+            "datasourceWarning": False,
+            "expressionType": "SQL",
+            "hasCustomLabel": False,
+            "label": "COUNT(*) + 2",
+            "optionName": "metric_ssa1gwimio_cxpyjc7vj3s",
+            "sqlExpression": "COUNT(*) + 2",
+        }
+    ]
+
+    query_context.form_data = {
+        "slice_id": 42,
+        "metrics": tampered_metrics,
+    }
+    with pytest.raises(SupersetSecurityException):
+        sm.raise_for_access(query_context=query_context)
+
+    query_context.form_data = {
+        "slice_id": 42,
+        "metrics": stored_metrics,
+    }
+    query_context.queries = [QueryObject(metrics=tampered_metrics)]  # type: ignore
+    with pytest.raises(SupersetSecurityException):
+        sm.raise_for_access(query_context=query_context)
+
+
 def test_raise_for_access_query_default_schema(
     mocker: MockFixture,
     app_context: None,


(superset) 02/09: fix(deck.gl Multiple Layer Chart): Add Contour and Heatmap Layer as options (#25923)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 1be9eb577ae31ae952011b20a2d4fa6824799ab9
Author: Matthew Chiang <36...@users.noreply.github.com>
AuthorDate: Mon Jan 29 05:10:11 2024 -0800

    fix(deck.gl Multiple Layer Chart): Add Contour and Heatmap Layer as options (#25923)
    
    (cherry picked from commit 64ba5797df92d0f8067ccd2b30ba6ff58e0bd791)
---
 .../plugins/legacy-preset-chart-deckgl/src/index.ts            |  2 ++
 .../plugins/legacy-preset-chart-deckgl/src/layers/Arc/Arc.tsx  |  8 ++++----
 .../legacy-preset-chart-deckgl/src/layers/Geojson/Geojson.tsx  |  4 ++--
 .../legacy-preset-chart-deckgl/src/layers/Heatmap/Heatmap.tsx  |  2 +-
 .../legacy-preset-chart-deckgl/src/layers/Path/Path.tsx        |  2 +-
 .../legacy-preset-chart-deckgl/src/layers/Polygon/Polygon.tsx  | 10 +++++-----
 .../legacy-preset-chart-deckgl/src/layers/Scatter/Scatter.tsx  | 10 +++++-----
 .../src/layers/Screengrid/Screengrid.tsx                       |  4 ++--
 .../plugins/legacy-preset-chart-deckgl/src/layers/index.ts     |  4 ++++
 9 files changed, 26 insertions(+), 20 deletions(-)

diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/index.ts
index 819964173e..fc4aa7fca0 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/index.ts
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/index.ts
@@ -26,3 +26,5 @@ export { default as PathChartPlugin } from './layers/Path';
 export { default as PolygonChartPlugin } from './layers/Polygon';
 export { default as ScatterChartPlugin } from './layers/Scatter';
 export { default as ScreengridChartPlugin } from './layers/Screengrid';
+export { default as ContourChartPlugin } from './layers/Contour';
+export { default as HeatmapChartPlugin } from './layers/Heatmap';
diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/Arc.tsx b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/Arc.tsx
index 1bc19618b6..7ad1870170 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/Arc.tsx
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Arc/Arc.tsx
@@ -45,16 +45,16 @@ function setTooltipContent(formData: QueryFormData) {
     <div className="deckgl-tooltip">
       <TooltipRow
         label={t('Start (Longitude, Latitude): ')}
-        value={`${o.object.sourcePosition[0]}, ${o.object.sourcePosition[1]}`}
+        value={`${o.object?.sourcePosition?.[0]}, ${o.object?.sourcePosition?.[1]}`}
       />
       <TooltipRow
         label={t('End (Longitude, Latitude): ')}
-        value={`${o.object.targetPosition[0]}, ${o.object.targetPosition[1]}`}
+        value={`${o.object?.targetPosition?.[0]}, ${o.object?.targetPosition?.[1]}`}
       />
       {formData.dimension && (
         <TooltipRow
-          label={`${formData.dimension}: `}
-          value={`${o.object.cat_color}`}
+          label={`${formData?.dimension}: `}
+          value={`${o.object?.cat_color}`}
         />
       )}
     </div>
diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/Geojson.tsx b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/Geojson.tsx
index c8c9d4863c..a5dc2a14a0 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/Geojson.tsx
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Geojson/Geojson.tsx
@@ -93,13 +93,13 @@ const recurseGeoJson = (
 
 function setTooltipContent(o: JsonObject) {
   return (
-    o.object.extraProps && (
+    o.object?.extraProps && (
       <div className="deckgl-tooltip">
         {Object.keys(o.object.extraProps).map((prop, index) => (
           <TooltipRow
             key={`prop-${index}`}
             label={`${prop}: `}
-            value={`${o.object.extraProps[prop]}`}
+            value={`${o.object.extraProps?.[prop]}`}
           />
         ))}
       </div>
diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/Heatmap.tsx b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/Heatmap.tsx
index b491d6dba1..5a1453459d 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/Heatmap.tsx
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Heatmap/Heatmap.tsx
@@ -30,7 +30,7 @@ function setTooltipContent(o: JsonObject) {
     <div className="deckgl-tooltip">
       <TooltipRow
         label={t('Centroid (Longitude and Latitude): ')}
-        value={`(${o.coordinate[0]}, ${o.coordinate[1]})`}
+        value={`(${o?.coordinate[0]}, ${o?.coordinate[1]})`}
       />
     </div>
   );
diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/Path.tsx b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/Path.tsx
index c4f13f0e57..f2f5c35e3d 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/Path.tsx
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Path/Path.tsx
@@ -29,7 +29,7 @@ import { Point } from '../../types';
 
 function setTooltipContent(o: JsonObject) {
   return (
-    o.object.extraProps && (
+    o.object?.extraProps && (
       <div className="deckgl-tooltip">
         {Object.keys(o.object.extraProps).map((prop, index) => (
           <TooltipRow
diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/Polygon.tsx b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/Polygon.tsx
index 460c4a3b51..08f2e45140 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/Polygon.tsx
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Polygon/Polygon.tsx
@@ -62,27 +62,27 @@ function getElevation(
 
 function setTooltipContent(formData: PolygonFormData) {
   return (o: JsonObject) => {
-    const metricLabel = formData.metric.label || formData.metric;
+    const metricLabel = formData?.metric?.label || formData?.metric;
 
     return (
       <div className="deckgl-tooltip">
-        {o.object.name && (
+        {o.object?.name && (
           <TooltipRow
             // eslint-disable-next-line prefer-template
             label={t('name') + ': '}
             value={`${o.object.name}`}
           />
         )}
-        {o.object[formData.line_column] && (
+        {o.object?.[formData?.line_column] && (
           <TooltipRow
             label={`${formData.line_column}: `}
             value={`${o.object[formData.line_column]}`}
           />
         )}
-        {formData.metric && (
+        {formData?.metric && (
           <TooltipRow
             label={`${metricLabel}: `}
-            value={`${o.object[metricLabel]}`}
+            value={`${o.object?.[metricLabel]}`}
           />
         )}
       </div>
diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/Scatter.tsx b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/Scatter.tsx
index c529e5c1d9..3ce1dcf25e 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/Scatter.tsx
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Scatter/Scatter.tsx
@@ -48,17 +48,17 @@ function setTooltipContent(
         <TooltipRow
           // eslint-disable-next-line prefer-template
           label={t('Longitude and Latitude') + ': '}
-          value={`${o.object.position[0]}, ${o.object.position[1]}`}
+          value={`${o.object?.position?.[0]}, ${o.object?.position?.[1]}`}
         />
-        {o.object.cat_color && (
+        {o.object?.cat_color && (
           <TooltipRow
             // eslint-disable-next-line prefer-template
             label={t('Category') + ': '}
-            value={`${o.object.cat_color}`}
+            value={`${o.object?.cat_color}`}
           />
         )}
-        {o.object.metric && (
-          <TooltipRow label={`${label}: `} value={`${o.object.metric}`} />
+        {o.object?.metric && (
+          <TooltipRow label={`${label}: `} value={`${o.object?.metric}`} />
         )}
       </div>
     );
diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/Screengrid.tsx b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/Screengrid.tsx
index 7e47cc9530..9584c8f53e 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/Screengrid.tsx
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/Screengrid/Screengrid.tsx
@@ -45,12 +45,12 @@ function setTooltipContent(o: JsonObject) {
       <TooltipRow
         // eslint-disable-next-line prefer-template
         label={t('Longitude and Latitude') + ': '}
-        value={`${o.coordinate[0]}, ${o.coordinate[1]}`}
+        value={`${o?.coordinate?.[0]}, ${o?.coordinate?.[1]}`}
       />
       <TooltipRow
         // eslint-disable-next-line prefer-template
         label={t('Weight') + ': '}
-        value={`${o.object.cellWeight}`}
+        value={`${o.object?.cellWeight}`}
       />
     </div>
   );
diff --git a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/index.ts b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/index.ts
index b77d5bd12c..9747a50b1e 100644
--- a/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/index.ts
+++ b/superset-frontend/plugins/legacy-preset-chart-deckgl/src/layers/index.ts
@@ -25,6 +25,8 @@ import { getLayer as deck_scatter } from './Scatter/Scatter';
 import { getLayer as deck_geojson } from './Geojson/Geojson';
 import { getLayer as deck_arc } from './Arc/Arc';
 import { getLayer as deck_polygon } from './Polygon/Polygon';
+import { getLayer as deck_heatmap } from './Heatmap/Heatmap';
+import { getLayer as deck_contour } from './Contour/Contour';
 
 const layerGenerators = {
   deck_grid,
@@ -35,6 +37,8 @@ const layerGenerators = {
   deck_geojson,
   deck_arc,
   deck_polygon,
+  deck_heatmap,
+  deck_contour,
 };
 
 export default layerGenerators;


(superset) 07/09: fix: dashboard import validation (#26887)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 322812b1c8938d480a5f5862562ff1d3e459d642
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Thu Feb 1 10:09:59 2024 +0000

    fix: dashboard import validation (#26887)
    
    (cherry picked from commit 36ce9e26f0da7893946d787488a30722bdb4d51b)
---
 superset/commands/dashboard/importers/v1/utils.py  |   8 +-
 .../commands/importers/v1/import_test.py           | 110 +++++++++++++++++++--
 2 files changed, 111 insertions(+), 7 deletions(-)

diff --git a/superset/commands/dashboard/importers/v1/utils.py b/superset/commands/dashboard/importers/v1/utils.py
index 1deb44949a..712161bc16 100644
--- a/superset/commands/dashboard/importers/v1/utils.py
+++ b/superset/commands/dashboard/importers/v1/utils.py
@@ -157,7 +157,13 @@ def import_dashboard(
     )
     existing = session.query(Dashboard).filter_by(uuid=config["uuid"]).first()
     if existing:
-        if not overwrite or not can_write:
+        if overwrite and can_write and hasattr(g, "user") and g.user:
+            if not security_manager.can_access_dashboard(existing):
+                raise ImportFailedError(
+                    "A dashboard already exists and user doesn't "
+                    "have permissions to overwrite it"
+                )
+        elif not overwrite or not can_write:
             return existing
         config["id"] = existing.id
     elif not can_write:
diff --git a/tests/unit_tests/dashboards/commands/importers/v1/import_test.py b/tests/unit_tests/dashboards/commands/importers/v1/import_test.py
index 67e0897755..190f0660f5 100644
--- a/tests/unit_tests/dashboards/commands/importers/v1/import_test.py
+++ b/tests/unit_tests/dashboards/commands/importers/v1/import_test.py
@@ -23,6 +23,7 @@ from pytest_mock import MockFixture
 from sqlalchemy.orm.session import Session
 
 from superset.commands.exceptions import ImportFailedError
+from superset.utils.core import override_user
 
 
 def test_import_dashboard(mocker: MockFixture, session: Session) -> None:
@@ -31,8 +32,6 @@ def test_import_dashboard(mocker: MockFixture, session: Session) -> None:
     """
     from superset import security_manager
     from superset.commands.dashboard.importers.v1.utils import import_dashboard
-    from superset.connectors.sqla.models import SqlaTable
-    from superset.models.core import Database
     from superset.models.slice import Slice
     from tests.integration_tests.fixtures.importexport import dashboard_config
 
@@ -48,6 +47,8 @@ def test_import_dashboard(mocker: MockFixture, session: Session) -> None:
     assert dashboard.description is None
     assert dashboard.is_managed_externally is False
     assert dashboard.external_url is None
+    # Assert that the can_write permission on Dashboard was checked
+    security_manager.can_access.assert_called_once_with("can_write", "Dashboard")
 
 
 def test_import_dashboard_managed_externally(
@@ -59,8 +60,6 @@ def test_import_dashboard_managed_externally(
     """
     from superset import security_manager
     from superset.commands.dashboard.importers.v1.utils import import_dashboard
-    from superset.connectors.sqla.models import SqlaTable
-    from superset.models.core import Database
     from superset.models.slice import Slice
     from tests.integration_tests.fixtures.importexport import dashboard_config
 
@@ -77,6 +76,9 @@ def test_import_dashboard_managed_externally(
     assert dashboard.is_managed_externally is True
     assert dashboard.external_url == "https://example.org/my_dashboard"
 
+    # Assert that the can_write permission on Dashboard was checked
+    security_manager.can_access.assert_called_once_with("can_write", "Dashboard")
+
 
 def test_import_dashboard_without_permission(
     mocker: MockFixture,
@@ -87,8 +89,6 @@ def test_import_dashboard_without_permission(
     """
     from superset import security_manager
     from superset.commands.dashboard.importers.v1.utils import import_dashboard
-    from superset.connectors.sqla.models import SqlaTable
-    from superset.models.core import Database
     from superset.models.slice import Slice
     from tests.integration_tests.fixtures.importexport import dashboard_config
 
@@ -105,3 +105,101 @@ def test_import_dashboard_without_permission(
         str(excinfo.value)
         == "Dashboard doesn't exist and user doesn't have permission to create dashboards"
     )
+
+    # Assert that the can_write permission on Dashboard was checked
+    security_manager.can_access.assert_called_once_with("can_write", "Dashboard")
+
+
+def test_import_existing_dashboard_without_permission(
+    mocker: MockFixture,
+    session: Session,
+) -> None:
+    """
+    Test importing an existing dashboard when the user lacks permission to overwrite it.
+    """
+    from superset import security_manager
+    from superset.commands.dashboard.importers.v1.utils import g, import_dashboard
+    from superset.models.dashboard import Dashboard
+    from superset.models.slice import Slice
+    from tests.integration_tests.fixtures.importexport import dashboard_config
+
+    mocker.patch.object(security_manager, "can_access", return_value=True)
+    mocker.patch.object(security_manager, "can_access_dashboard", return_value=False)
+    mock_g = mocker.patch(
+        "superset.commands.dashboard.importers.v1.utils.g"
+    )  # patch g where the import utility reads the current user
+    mock_g.user = mocker.MagicMock(return_value=True)
+
+    engine = session.get_bind()
+    Slice.metadata.create_all(engine)  # pylint: disable=no-member
+    Dashboard.metadata.create_all(engine)  # pylint: disable=no-member
+
+    dashboard_obj = Dashboard(
+        id=100,
+        dashboard_title="Test dash",
+        slug=None,
+        slices=[],
+        published=True,
+        uuid="c4b28c4e-a1fe-4cf8-a5ac-d6f11d6fdd51",
+    )
+    session.add(dashboard_obj)
+    session.flush()
+    config = copy.deepcopy(dashboard_config)
+
+    with pytest.raises(ImportFailedError) as excinfo:
+        import_dashboard(session, config, overwrite=True)
+    assert (
+        str(excinfo.value)
+        == "A dashboard already exists and user doesn't have permissions to overwrite it"
+    )
+
+    # Assert that the can_write permission on Dashboard was checked
+    security_manager.can_access.assert_called_once_with("can_write", "Dashboard")
+    security_manager.can_access_dashboard.assert_called_once_with(dashboard_obj)
+
+
+def test_import_existing_dashboard_with_permission(
+    mocker: MockFixture,
+    session: Session,
+) -> None:
+    """
+    Test importing an existing dashboard when the user has permission to overwrite it.
+    """
+    from flask_appbuilder.security.sqla.models import Role, User
+
+    from superset import security_manager
+    from superset.commands.dashboard.importers.v1.utils import import_dashboard
+    from superset.models.dashboard import Dashboard
+    from tests.integration_tests.fixtures.importexport import dashboard_config
+
+    mocker.patch.object(security_manager, "can_access", return_value=True)
+    mocker.patch.object(security_manager, "can_access_dashboard", return_value=True)
+
+    engine = session.get_bind()
+    Dashboard.metadata.create_all(engine)  # pylint: disable=no-member
+
+    admin = User(
+        first_name="Alice",
+        last_name="Doe",
+        email="adoe@example.org",
+        username="admin",
+        roles=[Role(name="Admin")],
+    )
+
+    dashboard_obj = Dashboard(
+        id=100,
+        dashboard_title="Test dash",
+        slug=None,
+        slices=[],
+        published=True,
+        uuid="c4b28c4e-a1fe-4cf8-a5ac-d6f11d6fdd51",
+    )
+    session.add(dashboard_obj)
+    session.flush()
+    config = copy.deepcopy(dashboard_config)
+
+    with override_user(admin):
+        import_dashboard(session, config, overwrite=True)
+    # Assert that the can_write permission on Dashboard was checked
+    security_manager.can_access.assert_called_once_with("can_write", "Dashboard")
+    security_manager.can_access_dashboard.assert_called_once_with(dashboard_obj)
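
The behavioral change in import_dashboard: holding can_write on Dashboard is no longer sufficient to overwrite an existing dashboard; the importing user must also be able to access that specific dashboard. A condensed sketch of the guard, with simplified stand-ins for g and security_manager (not the actual Superset signatures):

    class ImportFailedError(Exception):
        pass

    def resolve_existing(existing, overwrite, can_write, user, can_access_dashboard):
        """Return the dashboard to keep unchanged, or None if the import may proceed."""
        if existing is None:
            return None
        if overwrite and can_write and user is not None:
            # New check: object-level access on the specific dashboard.
            if not can_access_dashboard(existing):
                raise ImportFailedError(
                    "A dashboard already exists and user doesn't "
                    "have permissions to overwrite it"
                )
            return None  # overwrite allowed
        # No overwrite requested, or no write permission: keep the existing one.
        return existing
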


(superset) 04/09: fix: Bar charts horizontal margin adjustment error (#26817)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 81353e280ba8d432bcef62762e5b786d28e6f161
Author: Michael S. Molina <70...@users.noreply.github.com>
AuthorDate: Mon Jan 29 15:25:16 2024 -0500

    fix: Bar charts horizontal margin adjustment error (#26817)
    
    (cherry picked from commit 84c48d11d8b3bef244823643804f5fd3d6e3ca86)
---
 .../src/Timeseries/transformProps.ts               |  1 +
 .../src/Timeseries/transformers.ts                 | 42 +++++++++++++---------
 .../plugin-chart-echarts/src/utils/series.ts       | 14 ++++++++
 .../plugin-chart-echarts/test/utils/series.test.ts | 32 +++++++++++++++++
 4 files changed, 72 insertions(+), 17 deletions(-)

diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
index 3bbe285aec..a5c380c5ff 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
@@ -437,6 +437,7 @@ export default function transformProps(
     yAxisTitlePosition,
     convertInteger(yAxisTitleMargin),
     convertInteger(xAxisTitleMargin),
+    isHorizontal,
   );
 
   const legendData = rawSeries
diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformers.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformers.ts
index 3b62417f16..5b9fa43047 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformers.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformers.ts
@@ -550,6 +550,7 @@ export function getPadding(
   yAxisTitlePosition?: string,
   yAxisTitleMargin?: number,
   xAxisTitleMargin?: number,
+  isHorizontal?: boolean,
 ): {
   bottom: number;
   left: number;
@@ -560,21 +561,28 @@ export function getPadding(
     ? TIMESERIES_CONSTANTS.yAxisLabelTopOffset
     : 0;
   const xAxisOffset = addXAxisTitleOffset ? Number(xAxisTitleMargin) || 0 : 0;
-  return getChartPadding(showLegend, legendOrientation, margin, {
-    top:
-      yAxisTitlePosition && yAxisTitlePosition === 'Top'
-        ? TIMESERIES_CONSTANTS.gridOffsetTop + (Number(yAxisTitleMargin) || 0)
-        : TIMESERIES_CONSTANTS.gridOffsetTop + yAxisOffset,
-    bottom: zoomable
-      ? TIMESERIES_CONSTANTS.gridOffsetBottomZoomable + xAxisOffset
-      : TIMESERIES_CONSTANTS.gridOffsetBottom + xAxisOffset,
-    left:
-      yAxisTitlePosition === 'Left'
-        ? TIMESERIES_CONSTANTS.gridOffsetLeft + (Number(yAxisTitleMargin) || 0)
-        : TIMESERIES_CONSTANTS.gridOffsetLeft,
-    right:
-      showLegend && legendOrientation === LegendOrientation.Right
-        ? 0
-        : TIMESERIES_CONSTANTS.gridOffsetRight,
-  });
+  return getChartPadding(
+    showLegend,
+    legendOrientation,
+    margin,
+    {
+      top:
+        yAxisTitlePosition && yAxisTitlePosition === 'Top'
+          ? TIMESERIES_CONSTANTS.gridOffsetTop + (Number(yAxisTitleMargin) || 0)
+          : TIMESERIES_CONSTANTS.gridOffsetTop + yAxisOffset,
+      bottom: zoomable
+        ? TIMESERIES_CONSTANTS.gridOffsetBottomZoomable + xAxisOffset
+        : TIMESERIES_CONSTANTS.gridOffsetBottom + xAxisOffset,
+      left:
+        yAxisTitlePosition === 'Left'
+          ? TIMESERIES_CONSTANTS.gridOffsetLeft +
+            (Number(yAxisTitleMargin) || 0)
+          : TIMESERIES_CONSTANTS.gridOffsetLeft,
+      right:
+        showLegend && legendOrientation === LegendOrientation.Right
+          ? 0
+          : TIMESERIES_CONSTANTS.gridOffsetRight,
+    },
+    isHorizontal,
+  );
 }
diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts b/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts
index 6a51e7cbf7..8c87bf4333 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts
@@ -466,6 +466,7 @@ export function getChartPadding(
   orientation: LegendOrientation,
   margin?: string | number | null,
   padding?: { top?: number; bottom?: number; left?: number; right?: number },
+  isHorizontal?: boolean,
 ): {
   bottom: number;
   left: number;
@@ -486,6 +487,19 @@ export function getChartPadding(
   }
 
   const { bottom = 0, left = 0, right = 0, top = 0 } = padding || {};
+
+  if (isHorizontal) {
+    return {
+      left:
+        left + (orientation === LegendOrientation.Bottom ? legendMargin : 0),
+      right:
+        right + (orientation === LegendOrientation.Right ? legendMargin : 0),
+      top: top + (orientation === LegendOrientation.Top ? legendMargin : 0),
+      bottom:
+        bottom + (orientation === LegendOrientation.Left ? legendMargin : 0),
+    };
+  }
+
   return {
     left: left + (orientation === LegendOrientation.Left ? legendMargin : 0),
     right: right + (orientation === LegendOrientation.Right ? legendMargin : 0),
diff --git a/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts b/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts
index 3d7d21c8d0..10768a47f8 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts
@@ -795,6 +795,14 @@ describe('getChartPadding', () => {
       right: 0,
       top: 0,
     });
+    expect(
+      getChartPadding(true, LegendOrientation.Left, 100, undefined, true),
+    ).toEqual({
+      bottom: 100,
+      left: 0,
+      right: 0,
+      top: 0,
+    });
   });
 
   it('should return the correct padding for right orientation', () => {
@@ -804,6 +812,14 @@ describe('getChartPadding', () => {
       right: 50,
       top: 0,
     });
+    expect(
+      getChartPadding(true, LegendOrientation.Right, 50, undefined, true),
+    ).toEqual({
+      bottom: 0,
+      left: 0,
+      right: 50,
+      top: 0,
+    });
   });
 
   it('should return the correct padding for top orientation', () => {
@@ -813,6 +829,14 @@ describe('getChartPadding', () => {
       right: 0,
       top: 20,
     });
+    expect(
+      getChartPadding(true, LegendOrientation.Top, 20, undefined, true),
+    ).toEqual({
+      bottom: 0,
+      left: 0,
+      right: 0,
+      top: 20,
+    });
   });
 
   it('should return the correct padding for bottom orientation', () => {
@@ -822,6 +846,14 @@ describe('getChartPadding', () => {
       right: 0,
       top: 0,
     });
+    expect(
+      getChartPadding(true, LegendOrientation.Bottom, 10, undefined, true),
+    ).toEqual({
+      bottom: 0,
+      left: 10,
+      right: 0,
+      top: 0,
+    });
   });
 });
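
The heart of this fix is the new isHorizontal branch in getChartPadding: horizontal bar charts rotate the axes, so the legend margin has to land on the rotated side (a Left legend pads the bottom, a Bottom legend pads the left, while Top and Right stay put), which is exactly what the new test cases pin down. A compact Python sketch of that mapping (illustrative, not the plugin's API):

    def chart_padding(orientation: str, legend_margin: int, is_horizontal: bool) -> dict:
        padding = {"top": 0, "bottom": 0, "left": 0, "right": 0}
        if is_horizontal:
            # Axes are swapped, so the margin follows the rotation.
            side = {"left": "bottom", "bottom": "left",
                    "top": "top", "right": "right"}[orientation]
        else:
            side = orientation
        padding[side] += legend_margin
        return padding

    # Matches the new test expectation: a Left legend on a horizontal chart
    # pushes the plot area up from the bottom.
    assert chart_padding("left", 100, is_horizontal=True)["bottom"] == 100
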
 


(superset) 05/09: fix(pinot): typo in the name for epoch_ms_to_dttm (#26906)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 61b4017b2f50ddddadc6477c6e8bd3ddde65a161
Author: Erich <13...@users.noreply.github.com>
AuthorDate: Tue Jan 30 23:49:55 2024 -0500

    fix(pinot): typo in the name for epoch_ms_to_dttm (#26906)
    
    (cherry picked from commit 484901f4832b64845931f728db3e367f7f7c562c)
---
 superset/db_engine_specs/pinot.py                      |  2 +-
 tests/integration_tests/db_engine_specs/pinot_tests.py | 14 ++++++++++++++
 2 files changed, 15 insertions(+), 1 deletion(-)

diff --git a/superset/db_engine_specs/pinot.py b/superset/db_engine_specs/pinot.py
index 2cafd5ecb0..0f53cfa77b 100644
--- a/superset/db_engine_specs/pinot.py
+++ b/superset/db_engine_specs/pinot.py
@@ -62,7 +62,7 @@ class PinotEngineSpec(BaseEngineSpec):
         )
 
     @classmethod
-    def epoch_ms_to_dttm_(cls) -> str:
+    def epoch_ms_to_dttm(cls) -> str:
         return (
             "DATETIMECONVERT({col}, '1:MILLISECONDS:EPOCH', "
             + "'1:MILLISECONDS:EPOCH', '1:MILLISECONDS')"
diff --git a/tests/integration_tests/db_engine_specs/pinot_tests.py b/tests/integration_tests/db_engine_specs/pinot_tests.py
index 3998d20940..c8deef6fc4 100755
--- a/tests/integration_tests/db_engine_specs/pinot_tests.py
+++ b/tests/integration_tests/db_engine_specs/pinot_tests.py
@@ -84,6 +84,20 @@ class TestPinotDbEngineSpec(TestDbEngineSpec):
             expected,
         )
 
+    def test_pinot_time_expression_millisec_one_1m_grain(self):
+        col = column("tstamp")
+        expr = PinotEngineSpec.get_timestamp_expr(col, "epoch_ms", "P1M")
+        result = str(expr.compile())
+        expected = (
+            "CAST(DATE_TRUNC('month', CAST("
+            + "DATETIMECONVERT(tstamp, '1:MILLISECONDS:EPOCH', "
+            + "'1:MILLISECONDS:EPOCH', '1:MILLISECONDS') AS TIMESTAMP)) AS TIMESTAMP)"
+        )
+        self.assertEqual(
+            result,
+            expected,
+        )
+
     def test_invalid_get_time_expression_arguments(self):
         with self.assertRaises(NotImplementedError):
             PinotEngineSpec.get_timestamp_expr(column("tstamp"), None, "P0.25Y")
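
The stray underscore matters because the conversion method is invoked by exact name: epoch_ms_to_dttm_ never overrode anything, so epoch_ms columns silently fell back to the base spec's default millisecond conversion. A small self-contained illustration of how a misnamed classmethod hides an override (simplified, not the real BaseEngineSpec):

    class BaseSpec:
        @classmethod
        def epoch_ms_to_dttm(cls) -> str:
            return "generic_ms_conversion({col})"  # hypothetical fallback SQL

        @classmethod
        def timestamp_expr(cls, col: str) -> str:
            return cls.epoch_ms_to_dttm().replace("{col}", col)

    class TypoSpec(BaseSpec):
        @classmethod
        def epoch_ms_to_dttm_(cls) -> str:  # trailing underscore: never called
            return "DATETIMECONVERT({col}, ...)"

    class FixedSpec(BaseSpec):
        @classmethod
        def epoch_ms_to_dttm(cls) -> str:  # same name as the base: overrides
            return "DATETIMECONVERT({col}, ...)"

    assert "generic" in TypoSpec.timestamp_expr("tstamp")
    assert "DATETIMECONVERT" in FixedSpec.timestamp_expr("tstamp")
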


(superset) 06/09: fix: handle CRLF endings causing sqlglot failure (#26911)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit de61591a6a224233f170fdfedcc104918ca7edd2
Author: mapledan <ma...@gmail.com>
AuthorDate: Thu Feb 1 10:07:43 2024 +0800

    fix: handle CRLF endings causing sqlglot failure (#26911)
    
    (cherry picked from commit f2bf9f72e4f17604f5db80f25815525236a7269a)
---
 superset/sql_parse.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/superset/sql_parse.py b/superset/sql_parse.py
index 07704171de..7b89ab8f0e 100644
--- a/superset/sql_parse.py
+++ b/superset/sql_parse.py
@@ -286,7 +286,7 @@ class ParsedQuery:
         Note: this uses sqlglot, since it's better at catching more edge cases.
         """
         try:
-            statements = parse(self.sql, dialect=self._dialect)
+            statements = parse(self.stripped(), dialect=self._dialect)
         except ParseError:
             logger.warning("Unable to parse SQL (%s): %s", self._dialect, self.sql)
             return set()
@@ -494,7 +494,7 @@ class ParsedQuery:
         return self._parsed[0].get_type() == "UNKNOWN"
 
     def stripped(self) -> str:
-        return self.sql.strip(" \t\n;")
+        return self.sql.strip(" \t\r\n;")
 
     def strip_comments(self) -> str:
         return sqlparse.format(self.stripped(), strip_comments=True)
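
Two small changes work together here: tables() now parses the stripped SQL rather than the raw string, and stripped() also trims carriage returns, so queries saved with Windows CRLF line endings no longer feed a trailing \r into sqlglot. The strip-set difference in isolation (standard library only):

    sql = "SELECT 1\r\nFROM tbl;\r\n"

    # Old strip set leaves a stray trailing carriage return behind:
    assert sql.strip(" \t\n;") == "SELECT 1\r\nFROM tbl;\r"

    # New strip set removes it, so the parser sees a clean statement:
    assert sql.strip(" \t\r\n;") == "SELECT 1\r\nFROM tbl"
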


(superset) 09/09: fix(cache): remove unused webserver config & handle trailing slashes (#22849)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 86794cbb5eef340b7fa51f139561b1029c5b491c
Author: Usiel Riedl <us...@automattic.com>
AuthorDate: Thu Feb 1 07:34:07 2024 -0800

    fix(cache): remove unused webserver config & handle trailing slashes (#22849)
    
    (cherry picked from commit 56069b05f9cf4d0c725d1b4b0ad6038b50837cd4)
---
 superset/config.py                                 |  4 --
 superset/tasks/cache.py                            |  4 +-
 tests/integration_tests/superset_test_config.py    |  1 -
 .../superset_test_config_thumbnails.py             |  1 -
 tests/integration_tests/tasks/test_cache.py        | 58 ++++++++++++++++++++++
 5 files changed, 60 insertions(+), 8 deletions(-)

diff --git a/superset/config.py b/superset/config.py
index ca801442d9..6ef25fb3aa 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -154,10 +154,6 @@ FILTER_SELECT_ROW_LIMIT = 10000
 # values may be "Last day", "Last week", "<ISO date> : now", etc.
 DEFAULT_TIME_FILTER = NO_TIME_RANGE
 
-SUPERSET_WEBSERVER_PROTOCOL = "http"
-SUPERSET_WEBSERVER_ADDRESS = "0.0.0.0"
-SUPERSET_WEBSERVER_PORT = 8088
-
 # This is an important setting, and should be lower than your
 # [load balancer / proxy / envoy / kong / ...] timeout settings.
 # You should also make sure to configure your WSGI server
diff --git a/superset/tasks/cache.py b/superset/tasks/cache.py
index 569797ba27..1f60a5cd84 100644
--- a/superset/tasks/cache.py
+++ b/superset/tasks/cache.py
@@ -32,6 +32,7 @@ from superset.models.slice import Slice
 from superset.tags.models import Tag, TaggedObject
 from superset.utils.date_parser import parse_human_datetime
 from superset.utils.machine_auth import MachineAuthProvider
+from superset.utils.urls import get_url_path
 
 logger = get_task_logger(__name__)
 logger.setLevel(logging.INFO)
@@ -233,8 +234,7 @@ def fetch_url(data: str, headers: dict[str, str]) -> dict[str, str]:
     """
     result = {}
     try:
-        baseurl = app.config["WEBDRIVER_BASEURL"]
-        url = f"{baseurl}api/v1/chart/warm_up_cache"
+        url = get_url_path("Superset.warm_up_cache")
         logger.info("Fetching %s with payload %s", url, data)
         req = request.Request(
             url, data=bytes(data, "utf-8"), headers=headers, method="PUT"
diff --git a/tests/integration_tests/superset_test_config.py b/tests/integration_tests/superset_test_config.py
index 60b2d1107e..580ce14d35 100644
--- a/tests/integration_tests/superset_test_config.py
+++ b/tests/integration_tests/superset_test_config.py
@@ -36,7 +36,6 @@ SQLALCHEMY_DATABASE_URI = "sqlite:///" + os.path.join(
     DATA_DIR, "unittests.integration_tests.db"
 )
 DEBUG = False
-SUPERSET_WEBSERVER_PORT = 8081
 SILENCE_FAB = False
 # Allowing SQLALCHEMY_DATABASE_URI and SQLALCHEMY_EXAMPLES_URI to be defined as an env vars for
 # continuous integration
diff --git a/tests/integration_tests/superset_test_config_thumbnails.py b/tests/integration_tests/superset_test_config_thumbnails.py
index a761ef6611..0b5710d342 100644
--- a/tests/integration_tests/superset_test_config_thumbnails.py
+++ b/tests/integration_tests/superset_test_config_thumbnails.py
@@ -24,7 +24,6 @@ SQLALCHEMY_DATABASE_URI = "sqlite:///" + os.path.join(
     DATA_DIR, "unittests.integration_tests.db"
 )
 DEBUG = True
-SUPERSET_WEBSERVER_PORT = 8081
 
 # Allowing SQLALCHEMY_DATABASE_URI to be defined as an env var for
 # continuous integration
diff --git a/tests/integration_tests/tasks/test_cache.py b/tests/integration_tests/tasks/test_cache.py
new file mode 100644
index 0000000000..943b444f76
--- /dev/null
+++ b/tests/integration_tests/tasks/test_cache.py
@@ -0,0 +1,58 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from unittest import mock
+
+import pytest
+
+from tests.integration_tests.test_app import app
+
+
+@pytest.mark.parametrize(
+    "base_url",
+    [
+        "http://base-url",
+        "http://base-url/",
+    ],
+    ids=["Without trailing slash", "With trailing slash"],
+)
+@mock.patch("superset.tasks.cache.request.Request")
+@mock.patch("superset.tasks.cache.request.urlopen")
+def test_fetch_url(mock_urlopen, mock_request_cls, base_url):
+    from superset.tasks.cache import fetch_url
+
+    mock_request = mock.MagicMock()
+    mock_request_cls.return_value = mock_request
+
+    mock_urlopen.return_value = mock.MagicMock()
+    mock_urlopen.return_value.code = 200
+
+    app.config["WEBDRIVER_BASEURL"] = base_url
+    headers = {"key": "value"}
+    data = "data"
+    data_encoded = b"data"
+
+    result = fetch_url(data, headers)
+
+    assert data == result["success"]
+    mock_request_cls.assert_called_once_with(
+        "http://base-url/superset/warm_up_cache/",
+        data=data_encoded,
+        headers=headers,
+        method="PUT",
+    )
+    # assert the same Request object is used
+    mock_urlopen.assert_called_once_with(mock_request, timeout=mock.ANY)
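
The two parametrized base URLs capture the original bug: the old f"{baseurl}api/v1/chart/warm_up_cache" concatenation only produced a valid URL when WEBDRIVER_BASEURL ended in a slash, whereas get_url_path resolves the registered endpoint through Flask's routing, making the trailing slash irrelevant. A standard-library sketch of the failure mode and one slash-safe alternative (illustrative, not what Superset does internally):

    from urllib.parse import urljoin

    base = "http://base-url"
    path = "api/v1/chart/warm_up_cache"

    # Naive concatenation silently corrupts the URL without a trailing slash:
    assert f"{base}{path}" == "http://base-urlapi/v1/chart/warm_up_cache"

    # Normalizing the base before joining makes both configurations equivalent:
    def join_base(base_url: str, suffix: str) -> str:
        return urljoin(base_url.rstrip("/") + "/", suffix)

    assert join_base("http://base-url", path) == join_base("http://base-url/", path)
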


(superset) 08/09: fix: Allow exporting saved queries without schema information (#26889)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit e8a3c5d5f12ff7df2bf9e3324ce45674f5d850ed
Author: Sebastian Bernauer <se...@stackable.de>
AuthorDate: Thu Feb 1 16:12:29 2024 +0100

    fix: Allow exporting saved queries without schema information (#26889)
    
    (cherry picked from commit 4c5176eea82e3b168c5d11f130387d5913b33efa)
---
 superset/commands/query/export.py | 11 ++++++++---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/superset/commands/query/export.py b/superset/commands/query/export.py
index a8fa8acbf0..43a110c3b9 100644
--- a/superset/commands/query/export.py
+++ b/superset/commands/query/export.py
@@ -40,11 +40,16 @@ class ExportSavedQueriesCommand(ExportModelsCommand):
     def _export(
         model: SavedQuery, export_related: bool = True
     ) -> Iterator[tuple[str, str]]:
-        # build filename based on database, optional schema, and label
+        # build filename based on database, optional schema, and label.
+        # we call secure_filename() multiple times and join the directories afterwards,
+        # as secure_filename() replaces "/" with "_".
         database_slug = secure_filename(model.database.database_name)
-        schema_slug = secure_filename(model.schema)
         query_slug = secure_filename(model.label) or str(model.uuid)
-        file_name = f"queries/{database_slug}/{schema_slug}/{query_slug}.yaml"
+        if model.schema is None:
+            file_name = f"queries/{database_slug}/{query_slug}.yaml"
+        else:
+            schema_slug = secure_filename(model.schema)
+            file_name = f"queries/{database_slug}/{schema_slug}/{query_slug}.yaml"
 
         payload = model.export_to_dict(
             recursive=False,