Posted to commits@superset.apache.org by el...@apache.org on 2023/06/14 15:51:50 UTC

[superset] tag 2.1.1rc1 updated (43cbf8ac99 -> 7b6907fe0f)

This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a change to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git


*** WARNING: tag 2.1.1rc1 was modified! ***

    from 43cbf8ac99 (commit)
      to 7b6907fe0f (commit)
    omit 43cbf8ac99 remove blocking test from release
    omit 2b3aa098ae fix: allow db driver distinction on enforced URI params (#23769)
    omit dc5bed4ec4 lint
    omit 78d043309c add license to package and plugin readme files
    omit 48ea6c0866 feat: add enforce URI query params with a specific for MySQL (#23723)
    omit 70bdf408a6 fix: permission checks on import (#23200)
    omit 7213aa994f fix: check sqlalchemy_uri (#23901)
     add 502b8b81e0 test #1
     add 831978f0f7 fix: check sqlalchemy_uri (#23901)
     add cfc2ca672e fix: permission checks on import (#23200)
     add 0a9f47e4ac fix: load examples as anon user (#23600)
     add 8821174921 feat: add enforce URI query params with a specific for MySQL (#23723)
     add 4345a14841 add license to package and plugin readme files
     add 2f3471a87e lint
     new b26901cb05 fix: allow db driver distinction on enforced URI params (#23769)
     new 483195ad70 remove blocking test from release
     new f244c24bb9 lint
     new f478038281 add changelog
     new b53325e576 remove tests that don't apply
     new 831cd9b030 chore: Remove unnecessary information from response (#24056)
     new 8d32525f97 chore: update UPDATING for 2.1.0 (#24294)
     new e804489a89 fix: db validate parameters permission (#24185)
     new 4af81bf70a fix: disable SHOW_STACKTRACE by default (#24137)
     new a25347c113 fix: enable strong session protection by default (#24256)
     new 9e6c9d2aa3 fix: handle comments in `has_table_query` (#23882)
     new cfb4d27d8c lint
     new 5f21e7385f fix: handle temporal columns in presto partitions (#24054)
     new 7247b9bb07 merge in fix with migration (#24314)
     new 801389f2f1 update changelog
     new 3f20b9ebab update package version
     new aad2c75551 fix: update order of build for testing a release (#24317)
     new 7b6907fe0f update changelog

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
tag are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (43cbf8ac99)
            \
             N -- N -- N   refs/tags/2.1.1rc1 (7b6907fe0f)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 18 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/docker.yml                       |  18 ++
 CHANGELOG.md                                       |  34 +++-
 RELEASING/from_tarball_entrypoint.sh               |   6 +-
 UPDATING.md                                        |  10 +-
 docs/docs/contributing/testing-locally.mdx         |   2 +-
 docs/docs/security.mdx                             |  31 +++-
 superset-frontend/package-lock.json                |   4 +-
 superset-frontend/package.json                     |   2 +-
 superset/charts/api.py                             |   2 -
 superset/charts/commands/importers/v1/utils.py     |   7 +-
 superset/commands/importers/v1/examples.py         |  41 +++--
 superset/config.py                                 |   9 +-
 superset/connectors/sqla/models.py                 |   7 +-
 superset/constants.py                              |   4 +-
 superset/dashboards/api.py                         |   3 -
 superset/dashboards/commands/importers/v1/utils.py |  10 +-
 superset/dashboards/permalink/commands/base.py     |   3 +-
 superset/dashboards/permalink/commands/create.py   |   1 +
 superset/dashboards/permalink/commands/get.py      |   6 +-
 superset/dashboards/schemas.py                     |   4 +-
 superset/databases/commands/importers/v1/utils.py  |   6 +-
 superset/datasets/api.py                           |   4 +-
 superset/datasets/commands/importers/v1/utils.py   |   6 +-
 superset/db_engine_specs/base.py                   |   2 +-
 superset/db_engine_specs/hive.py                   |   2 +-
 superset/db_engine_specs/presto.py                 |  18 +-
 superset/examples/utils.py                         |   2 +-
 superset/explore/permalink/commands/base.py        |   3 +-
 superset/explore/permalink/commands/create.py      |   3 +-
 superset/explore/permalink/commands/get.py         |   1 +
 superset/extensions/metastore_cache.py             |  11 +-
 superset/key_value/commands/create.py              |  23 ++-
 superset/key_value/commands/get.py                 |  15 +-
 superset/key_value/commands/update.py              |  11 +-
 superset/key_value/commands/upsert.py              |  13 +-
 superset/key_value/shared_entries.py               |  12 +-
 superset/key_value/types.py                        |  33 +++-
 ...a5681ddfd_convert_key_value_entries_to_json.py} |  66 ++++---
 superset/models/dashboard.py                       |   6 +-
 superset/models/filter_set.py                      |   6 +-
 superset/models/slice.py                           |   8 +-
 superset/queries/api.py                            |   1 -
 superset/queries/schemas.py                        |   2 +-
 superset/sql_parse.py                              |   4 +-
 superset/tags/schemas.py                           |  59 ++++++
 superset/temporary_cache/api.py                    |  13 +-
 superset/temporary_cache/commands/parameters.py    |   3 +
 tests/integration_tests/charts/api_tests.py        | 109 ++++++++++-
 tests/integration_tests/csv_upload_tests.py        | 146 +++++++--------
 tests/integration_tests/dashboards/api_tests.py    | 106 ++++++++++-
 .../integration_tests/databases/commands_tests.py  | 200 ---------------------
 tests/integration_tests/datasets/api_tests.py      | 102 +++++++++++
 .../explore/permalink/api_tests.py                 |   5 +-
 .../key_value/commands/create_test.py              |  55 +++++-
 .../key_value/commands/delete_test.py              |  13 +-
 .../key_value/commands/fixtures.py                 |  15 +-
 .../key_value/commands/get_test.py                 |  25 +--
 .../key_value/commands/update_test.py              |  11 +-
 .../key_value/commands/upsert_test.py              |  11 +-
 tests/integration_tests/queries/api_tests.py       |   1 -
 tests/integration_tests/sqllab_tests.py            |  26 +--
 tests/unit_tests/db_engine_specs/test_presto.py    |  43 ++++-
 tests/unit_tests/sql_parse_tests.py                |   8 +
 63 files changed, 960 insertions(+), 453 deletions(-)
 copy superset/migrations/versions/{2022-06-27_14-59_7fb8bca906d2_permalink_rename_filterstate.py => 2023-05-01_12-03_9c2a5681ddfd_convert_key_value_entries_to_json.py} (55%)
 create mode 100644 superset/tags/schemas.py


[superset] 04/18: add changelog

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit f4780382810296bb6b90feec8cc98eea0f393291
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Mon Jun 5 16:48:39 2023 -0700

    add changelog
---
 CHANGELOG.md | 30 +++++++++++++++++++++++++++++-
 1 file changed, 29 insertions(+), 1 deletion(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 6b5f698175..d25ea7c3c2 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -19,6 +19,7 @@ under the License.
 
 ## Change Log
 
+- [2.1.1](#211-sun-apr-23-154421-2023--0100)
 - [2.1.0](#210-thu-mar-16-211305-2023--0700)
 - [2.0.1](#201-fri-nov-4-103402-2022--0400)
 - [2.0.0](#200-tue-jun-28-085302-2022--0400)
@@ -29,7 +30,34 @@ under the License.
 - [1.4.2](#142-sat-mar-19-000806-2022-0200)
 - [1.4.1](#141)
 
-### 2.1 (Thu Mar 16 21:13:05 2023 -0700)
+
+### 2.1.1 (Sun Apr 23 15:44:21 2023 +0100)
+
+**Fixes**
+- [#23723](https://github.com/apache/superset/pull/23723) add enforce URI query params with a specific for MySQL (@dpgaspar)
+- [#24054](https://github.com/apache/superset/pull/24054) fix: handle temporal columns in presto partitions (@giftig)
+- [#23882](https://github.com/apache/superset/pull/23882) fix: handle comments in `has_table_query` (@betodealmeida)
+- [#24256](https://github.com/apache/superset/pull/24256) fix: enable strong session protection by default (@dpgaspar)
+- [#24137](https://github.com/apache/superset/pull/24137) fix: disable SHOW_STACKTRACE by default (@dpgaspar)
+- [#24185](https://github.com/apache/superset/pull/24185) fix: db validate parameters permission (@dpgaspar)
+- [#23769](https://github.com/apache/superset/pull/23769) fix: allow db driver distinction on enforced URI params (@dpgaspar)
+- [#23200](https://github.com/apache/superset/pull/23200) fix: permission checks on import (@betodealmeida)
+- [#23901](https://github.com/apache/superset/pull/23901) fix: check sqlalchemy_uri (@dpgaspar)
+- [#23751](https://github.com/apache/superset/pull/23751) fix(mssql): apply top after distinct (@villebro)
+- [#23586](https://github.com/apache/superset/pull/23586) fix(dashboard-rbac): use normal rbac when no roles chosen (@villebro)
+- [#23582](https://github.com/apache/superset/pull/23582) fix(dash import): Ensure old datasource ids are not referenced in imported charts (@jfrag1)
+- [#23506](https://github.com/apache/superset/pull/23506) fix(generic-x-axis): skip initial time filter for legacy charts (@villebro)
+- [#23507](https://github.com/apache/superset/pull/23507) fix(legacy-plugin-chart-heatmap): fix adhoc column tooltip (@villebro)
+- [#23441](https://github.com/apache/superset/pull/23441) fix(chart): non existent time grain no longer breaks the application (@rdubois)
+- [#23393](https://github.com/apache/superset/pull/23393) fix(Pivot Table v2): resolved full width issue (@AkashBoora)
+- [#22851](https://github.com/apache/superset/pull/22851) fix: Validate jinja rendered query (@geido)
+
+**Others**
+- [#24294](https://github.com/apache/superset/pull/24294) chore: update UPDATING for 2.1.0 (@eschutho)
+- [#24056](https://github.com/apache/superset/pull/24056) chore: Remove unnecessary information from response (@geido)
+
+
+### 2.1.0 (Thu Mar 16 21:13:05 2023 -0700)
 **Database Migrations**
 - [#23139](https://github.com/apache/superset/pull/23139) fix: memoized decorator memory leak (@dpgaspar)
 - [#19676](https://github.com/apache/superset/pull/19676) chore(frontend): Spelling (@jsoref)


[superset] 11/18: fix: handle comments in `has_table_query` (#23882)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 9e6c9d2aa3e66a157c41e270d66da3b84165630e
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Mon May 1 11:06:54 2023 -0700

    fix: handle comments in `has_table_query` (#23882)
---
 superset/sql_parse.py               | 4 +++-
 tests/unit_tests/sql_parse_tests.py | 8 ++++++++
 2 files changed, 11 insertions(+), 1 deletion(-)

diff --git a/superset/sql_parse.py b/superset/sql_parse.py
index ab2f044172..a3c1af87b0 100644
--- a/superset/sql_parse.py
+++ b/superset/sql_parse.py
@@ -509,6 +509,9 @@ def has_table_query(token_list: TokenList) -> bool:
     """
     state = InsertRLSState.SCANNING
     for token in token_list.tokens:
+        # Ignore comments
+        if isinstance(token, sqlparse.sql.Comment):
+            continue
 
         # Recurse into child token list
         if isinstance(token, TokenList) and has_table_query(token):
@@ -607,7 +610,6 @@ def insert_rls(
     rls: Optional[TokenList] = None
     state = InsertRLSState.SCANNING
     for token in token_list.tokens:
-
         # Recurse into child token list
         if isinstance(token, TokenList):
             i = token_list.tokens.index(token)
diff --git a/tests/unit_tests/sql_parse_tests.py b/tests/unit_tests/sql_parse_tests.py
index ba3da69aae..d6939fa080 100644
--- a/tests/unit_tests/sql_parse_tests.py
+++ b/tests/unit_tests/sql_parse_tests.py
@@ -1195,6 +1195,14 @@ def test_sqlparse_issue_652():
         ("extract(HOUR from from_unixtime(hour_ts)", False),
         ("(SELECT * FROM table)", True),
         ("(SELECT COUNT(DISTINCT name) from birth_names)", True),
+        (
+            "(SELECT table_name FROM information_schema.tables WHERE table_name LIKE '%user%' LIMIT 1)",
+            True,
+        ),
+        (
+            "(SELECT table_name FROM /**/ information_schema.tables WHERE table_name LIKE '%user%' LIMIT 1)",
+            True,
+        ),
     ],
 )
 def test_has_table_query(sql: str, expected: bool) -> None:
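
For context: the fix works because sqlparse wraps inline comments such as `/**/` in `sqlparse.sql.Comment` token groups, which the new `isinstance` check skips instead of mistaking for a sub-statement. A minimal standalone sketch of that traversal (illustrative only, not Superset code):

```python
import sqlparse
from sqlparse.sql import Comment, TokenList

def find_comments(token_list: TokenList):
    # Recursively yield Comment groups, mirroring the recursion
    # used by has_table_query above (sketch only).
    for token in token_list.tokens:
        if isinstance(token, Comment):
            yield token
        elif isinstance(token, TokenList):
            yield from find_comments(token)

stmt = sqlparse.parse(
    "(SELECT table_name FROM /**/ information_schema.tables)"
)[0]
for comment in find_comments(stmt):
    print("skipped comment token:", repr(str(comment)))
```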


[superset] 10/18: fix: enable strong session protection by default (#24256)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit a25347c113df07ca186d6a6364ade105b071bf86
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Thu Jun 1 14:01:25 2023 +0100

    fix: enable strong session protection by default (#24256)
---
 UPDATING.md            |  1 +
 docs/docs/security.mdx | 31 ++++++++++++++++++++++++++++++-
 superset/config.py     |  5 ++++-
 3 files changed, 35 insertions(+), 2 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index f71d884091..1b30ec75ae 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -24,6 +24,7 @@ assists people when migrating to a new version.
 
 ## 2.1.1
 - [24185](https://github.com/apache/superset/pull/24185): `/api/v1/database/test_connection` and `/api/v1/database/validate_parameters` permissions changed from `can_read` to `can_write`. Only Admin users have access.
+- [24256](https://github.com/apache/superset/pull/24256): `Flask-Login` session validation is now set to `strong` by default. Previous setting was `basic`.
 
 ### Other
 
diff --git a/docs/docs/security.mdx b/docs/docs/security.mdx
index e868de6a99..eabe17da58 100644
--- a/docs/docs/security.mdx
+++ b/docs/docs/security.mdx
@@ -131,7 +131,36 @@ For example, the filters `client_id=4` and `client_id=5`, applied to a role,
 will result in users of that role having `client_id=4` AND `client_id=5`
 added to their query, which can never be true.
 
-### Content Security Policiy (CSP)
+### User Sessions
+
+Superset uses [Flask](https://pypi.org/project/Flask/)
+and [Flask-Login](https://pypi.org/project/Flask-Login/) for user session management.
+
+Session cookies are used to maintain session info and user state between requests;
+although they do not contain personal user information, they serve to identify
+a user session on the server side.
+The session cookie is cryptographically signed with the application `SECRET_KEY`, so it cannot be forged or tampered with by the client.
+It is therefore very important to keep the `SECRET_KEY` secret and set it to a secure, unique, complex random value.
+
+Flask and Flask-Login offer a number of configuration options to control session behavior.
+
+- Relevant Flask settings:
+
+`SESSION_COOKIE_HTTPONLY` (default: `False`): Controls whether cookies are set with the `HttpOnly` flag.
+
+`SESSION_COOKIE_SECURE`: (default: `False`) Browsers will only send cookies with requests over
+HTTPS if the cookie is marked “secure”. The application must be served over HTTPS for this to make sense.
+
+`SESSION_COOKIE_SAMESITE`: (default: "Lax") Prevents the browser from sending this cookie along with cross-site requests.
+
+`PERMANENT_SESSION_LIFETIME`: (default: "31 days") The lifetime of a permanent session as a `datetime.timedelta` object.
+
+- Relevant Flask-Login settings:
+
+`SESSION_PROTECTION`: The method used to protect the session from being stolen. [Documentation](https://flask-login.readthedocs.io/en/latest/#session-protection)
+Default: "strong"
+
+### Content Security Policy (CSP)
 
 [Content Security Policy (CSP)](https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP) is an added
 layer of security that helps to detect and mitigate certain types of attacks, including
diff --git a/superset/config.py b/superset/config.py
index f24b040e0f..a48fa191fe 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -1205,6 +1205,7 @@ def SQL_QUERY_MUTATOR(  # pylint: disable=invalid-name,unused-argument
 # functionality for both the SQL_Lab and Charts.
 MUTATE_AFTER_SPLIT = False
 
+
 # This allows for a user to add header data to any outgoing emails. For example,
 # if you need to include metadata in the header or you want to change the specifications
 # of the email title, header, or sender.
@@ -1387,6 +1388,8 @@ RLS_FORM_QUERY_REL_FIELDS: Optional[Dict[str, List[List[Any]]]] = None
 SESSION_COOKIE_HTTPONLY = True  # Prevent cookie from being read by frontend JS?
 SESSION_COOKIE_SECURE = False  # Prevent cookie from being transmitted over non-tls?
 SESSION_COOKIE_SAMESITE: Optional[Literal["None", "Lax", "Strict"]] = "Lax"
+# Accepts None, "basic" and "strong", more details on: https://flask-login.readthedocs.io/en/latest/#session-protection
+SESSION_PROTECTION = "strong"
 
 # Cache static resources.
 SEND_FILE_MAX_AGE_DEFAULT = int(timedelta(days=365).total_seconds())
@@ -1584,7 +1587,7 @@ elif importlib.util.find_spec("superset_config") and not is_test():
     try:
         # pylint: disable=import-error,wildcard-import,unused-wildcard-import
         import superset_config
-        from superset_config import *  # type: ignore
+        from superset_config import *
 
         print(f"Loaded your LOCAL configuration at [{superset_config.__file__}]")
     except Exception:
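
Taken together, the settings above can be mirrored or tightened in a deployment's `superset_config.py`. A minimal sketch, assuming that override file; the values are illustrative, not taken from the commit:

```python
# superset_config.py -- illustrative session-hardening overrides.
from datetime import timedelta

SECRET_KEY = "replace-with-a-long-random-secret"  # never ship a default key

SESSION_COOKIE_HTTPONLY = True   # keep the cookie away from frontend JS
SESSION_COOKIE_SECURE = True     # only meaningful when served over HTTPS
SESSION_COOKIE_SAMESITE = "Lax"  # limit cookies on cross-site requests
PERMANENT_SESSION_LIFETIME = timedelta(days=7)

# Flask-Login session protection: None, "basic", or "strong" (new default).
SESSION_PROTECTION = "strong"
```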


[superset] 09/18: fix: disable SHOW_STACKTRACE by default (#24137)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 4af81bf70ad33908479ea36ebb67213533b6a649
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Mon May 22 10:00:07 2023 +0100

    fix: disable SHOW_STACKTRACE by default (#24137)
---
 superset/config.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/superset/config.py b/superset/config.py
index c365babfc3..f24b040e0f 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -265,9 +265,9 @@ FLASK_USE_RELOAD = True
 PROFILING = False
 
 # Superset allows server-side python stacktraces to be surfaced to the
-# user when this feature is on. This may has security implications
+# user when this feature is on. This may have security implications
 # and it's more secure to turn it off in production settings.
-SHOW_STACKTRACE = True
+SHOW_STACKTRACE = False
 
 # Use all X-Forwarded headers when ENABLE_PROXY_FIX is True.
 # When proxying to a different port, set "x_port" to 0 to avoid downstream issues.
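
Deployments that depended on the old behavior can still opt back in, but the safer production posture is to pin the new default explicitly. A one-line sketch for a `superset_config.py` override (assumed file):

```python
# Keep server-side Python stacktraces out of user-facing error responses.
SHOW_STACKTRACE = False
```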


[superset] 17/18: fix: update order of build for testing a release (#24317)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit aad2c75551cad56afe54027364344bc173649352
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Thu Jun 8 16:30:39 2023 -0700

    fix: update order of build for testing a release (#24317)
---
 RELEASING/from_tarball_entrypoint.sh       | 6 +++---
 docs/docs/contributing/testing-locally.mdx | 2 +-
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/RELEASING/from_tarball_entrypoint.sh b/RELEASING/from_tarball_entrypoint.sh
index fb06a7f12c..9f1b0ef754 100755
--- a/RELEASING/from_tarball_entrypoint.sh
+++ b/RELEASING/from_tarball_entrypoint.sh
@@ -35,11 +35,11 @@ superset fab create-admin \
 # Initialize the database
 superset db upgrade
 
-# Loading examples
-superset load_examples
-
 # Create default roles and permissions
 superset init
 
+# Loading examples
+superset load-examples
+
 FLASK_ENV=development FLASK_APP="superset.app:create_app()" \
 flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0
diff --git a/docs/docs/contributing/testing-locally.mdx b/docs/docs/contributing/testing-locally.mdx
index ae08b1878a..780df3311a 100644
--- a/docs/docs/contributing/testing-locally.mdx
+++ b/docs/docs/contributing/testing-locally.mdx
@@ -93,8 +93,8 @@ export SUPERSET_TESTENV=true
 export CYPRESS_BASE_URL="http://localhost:8081"
 superset db upgrade
 superset load_test_users
-superset load-examples --load-test-data
 superset init
+superset load-examples --load-test-data
 superset run --port 8081
 ```
 


[superset] 16/18: update package version

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 3f20b9ebabc6d16651b43b6334590c841030c2c8
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Wed Jun 7 15:20:43 2023 -0700

    update package version
---
 superset-frontend/package-lock.json | 4 ++--
 superset-frontend/package.json      | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/superset-frontend/package-lock.json b/superset-frontend/package-lock.json
index f0ffd39160..adb7f27ae6 100644
--- a/superset-frontend/package-lock.json
+++ b/superset-frontend/package-lock.json
@@ -1,12 +1,12 @@
 {
   "name": "superset",
-  "version": "2.1.0",
+  "version": "2.1.1",
   "lockfileVersion": 2,
   "requires": true,
   "packages": {
     "": {
       "name": "superset",
-      "version": "2.1.0",
+      "version": "2.1.1",
       "license": "Apache-2.0",
       "workspaces": [
         "packages/*",
diff --git a/superset-frontend/package.json b/superset-frontend/package.json
index 774ea6106f..5cc84f7fe8 100644
--- a/superset-frontend/package.json
+++ b/superset-frontend/package.json
@@ -1,6 +1,6 @@
 {
   "name": "superset",
-  "version": "2.1.0",
+  "version": "2.1.1",
   "description": "Superset is a data exploration platform designed to be visual, intuitive, and interactive.",
   "keywords": [
     "big",


[superset] 15/18: update changelog

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 801389f2f12c8384b69e1681783dcae54db04d11
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Wed Jun 7 14:10:52 2023 -0700

    update changelog
---
 CHANGELOG.md | 5 +++++
 UPDATING.md  | 1 +
 2 files changed, 6 insertions(+)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index d25ea7c3c2..71955dbc79 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -33,6 +33,10 @@ under the License.
 
 ### 2.1.1 (Sun Apr 23 15:44:21 2023 +0100)
 
+**Database Migrations**
+- [#23980](https://github.com/apache/superset/pull/23980) fix(migration): handle permalink edge cases correctly (@villebro)
+- [#23888](https://github.com/apache/superset/pull/23888) chore(key-value): use json serialization for main resources (@villebro)
+
 **Fixes**
 - [#23723](https://github.com/apache/superset/pull/23723) add enforce URI query params with a specific for MySQL (@dpgaspar)
 - [#24054](https://github.com/apache/superset/pull/24054) fix: handle temporal columns in presto partitions (@giftig)
@@ -57,6 +61,7 @@ under the License.
 - [#24056](https://github.com/apache/superset/pull/24056) chore: Remove unnecessary information from response (@geido)
 
 
+
 ### 2.1.0 (Thu Mar 16 21:13:05 2023 -0700)
 **Database Migrations**
 - [#23139](https://github.com/apache/superset/pull/23139) fix: memoized decorator memory leak (@dpgaspar)
diff --git a/UPDATING.md b/UPDATING.md
index 1b30ec75ae..addcbc6bb9 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -27,6 +27,7 @@ assists people when migrating to a new version.
 - [24256](https://github.com/apache/superset/pull/24256): `Flask-Login` session validation is now set to `strong` by default. Previous setting was `basic`.
 
 ### Other
+- [23888](https://github.com/apache/superset/pull/23888): The database migration switching key-value entries from pickle to JSON serialization should upgrade/downgrade correctly when bumping to/from this patch version.
 
 ## 2.1.0
 


[superset] 03/18: lint

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit f244c24bb90c8a2b71ac1b2adf2116a246839f44
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Fri Jun 2 17:24:34 2023 -0700

    lint
---
 superset/examples/utils.py                  |   2 +-
 tests/integration_tests/csv_upload_tests.py | 146 ++++++++++++++--------------
 2 files changed, 74 insertions(+), 74 deletions(-)

diff --git a/superset/examples/utils.py b/superset/examples/utils.py
index aea1f0f93d..d49fbe0085 100644
--- a/superset/examples/utils.py
+++ b/superset/examples/utils.py
@@ -86,7 +86,7 @@ def load_configs_from_directory(
 
     # removing "type" from the metadata allows us to import any exported model
     # from the unzipped directory directly
-    metadata = yaml.load(contents.get(METADATA_FILE_NAME, "{}"), Loader=None)
+    metadata = yaml.safe_load(contents.get(METADATA_FILE_NAME, "{}"))
     if "type" in metadata:
         del metadata["type"]
     contents[METADATA_FILE_NAME] = yaml.dump(metadata)
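
The switch matters because `yaml.load` with a permissive loader can instantiate arbitrary Python objects from tagged YAML, while `yaml.safe_load` only produces plain types. A standalone sketch of the same metadata-stripping step (illustrative, not Superset code):

```python
import yaml

# safe_load parses untrusted YAML into plain dicts/lists/scalars only.
metadata = yaml.safe_load("version: 1.0.0\ntype: Database\n") or {}

# Dropping "type" lets any exported model be imported, as in the diff above.
metadata.pop("type", None)
print(yaml.dump(metadata))  # -> "version: 1.0.0"
```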
diff --git a/tests/integration_tests/csv_upload_tests.py b/tests/integration_tests/csv_upload_tests.py
index 3e0200d18a..850d8b0c26 100644
--- a/tests/integration_tests/csv_upload_tests.py
+++ b/tests/integration_tests/csv_upload_tests.py
@@ -441,76 +441,76 @@ def test_import_excel(mock_event_logger):
         assert data == [(0, "john", 1), (1, "paul", 2)]
 
 
-@pytest.mark.usefixtures("setup_csv_upload_with_context")
-@pytest.mark.usefixtures("create_columnar_files")
-@mock.patch("superset.db_engine_specs.hive.upload_to_s3", mock_upload_to_s3)
-@mock.patch("superset.views.database.views.event_logger.log_with_context")
-def test_import_parquet(mock_event_logger):
-    if utils.backend() == "hive":
-        pytest.skip("Hive doesn't allow parquet upload.")
-
-    schema = utils.get_example_default_schema()
-    full_table_name = (
-        f"{schema}.{PARQUET_UPLOAD_TABLE}" if schema else PARQUET_UPLOAD_TABLE
-    )
-    test_db = get_upload_db()
-
-    success_msg_f1 = f"Columnar file {escaped_parquet(PARQUET_FILENAME1)} uploaded to table {escaped_double_quotes(full_table_name)}"
-
-    # initial upload with fail mode
-    resp = upload_columnar(PARQUET_FILENAME1, PARQUET_UPLOAD_TABLE)
-    assert success_msg_f1 in resp
-
-    # upload again with fail mode; should fail
-    fail_msg = f"Unable to upload Columnar file {escaped_parquet(PARQUET_FILENAME1)} to table {escaped_double_quotes(PARQUET_UPLOAD_TABLE)}"
-    resp = upload_columnar(PARQUET_FILENAME1, PARQUET_UPLOAD_TABLE)
-    assert fail_msg in resp
-
-    if utils.backend() != "hive":
-        # upload again with append mode
-        resp = upload_columnar(
-            PARQUET_FILENAME1, PARQUET_UPLOAD_TABLE, extra={"if_exists": "append"}
-        )
-        assert success_msg_f1 in resp
-        mock_event_logger.assert_called_with(
-            action="successful_columnar_upload",
-            database=test_db.name,
-            schema=schema,
-            table=PARQUET_UPLOAD_TABLE,
-        )
-
-    # upload again with replace mode and specific columns
-    resp = upload_columnar(
-        PARQUET_FILENAME1,
-        PARQUET_UPLOAD_TABLE,
-        extra={"if_exists": "replace", "usecols": '["a"]'},
-    )
-    assert success_msg_f1 in resp
-
-    table = SupersetTestCase.get_table(name=PARQUET_UPLOAD_TABLE, schema=None)
-    # make sure only specified column name was read
-    assert "b" not in table.column_names
-
-    # ensure user is assigned as an owner
-    assert security_manager.find_user("admin") in table.owners
-
-    # upload again with replace mode
-    resp = upload_columnar(
-        PARQUET_FILENAME1, PARQUET_UPLOAD_TABLE, extra={"if_exists": "replace"}
-    )
-    assert success_msg_f1 in resp
-
-    with test_db.get_sqla_engine_with_context() as engine:
-        data = engine.execute(f"SELECT * from {PARQUET_UPLOAD_TABLE}").fetchall()
-        assert data == [("john", 1), ("paul", 2)]
-
-    # replace table with zip file
-    resp = upload_columnar(
-        ZIP_FILENAME, PARQUET_UPLOAD_TABLE, extra={"if_exists": "replace"}
-    )
-    success_msg_f2 = f"Columnar file {escaped_parquet(ZIP_FILENAME)} uploaded to table {escaped_double_quotes(full_table_name)}"
-    assert success_msg_f2 in resp
-
-    with test_db.get_sqla_engine_with_context() as engine:
-        data = engine.execute(f"SELECT * from {PARQUET_UPLOAD_TABLE}").fetchall()
-        assert data == [("john", 1), ("paul", 2), ("max", 3), ("bob", 4)]
+# @pytest.mark.usefixtures("setup_csv_upload_with_context")
+# @pytest.mark.usefixtures("create_columnar_files")
+# @mock.patch("superset.db_engine_specs.hive.upload_to_s3", mock_upload_to_s3)
+# @mock.patch("superset.views.database.views.event_logger.log_with_context")
+# def test_import_parquet(mock_event_logger):
+#     if utils.backend() == "hive":
+#         pytest.skip("Hive doesn't allow parquet upload.")
+
+#     schema = utils.get_example_default_schema()
+#     full_table_name = (
+#         f"{schema}.{PARQUET_UPLOAD_TABLE}" if schema else PARQUET_UPLOAD_TABLE
+#     )
+#     test_db = get_upload_db()
+
+#     success_msg_f1 = f"Columnar file {escaped_parquet(PARQUET_FILENAME1)} uploaded to table {escaped_double_quotes(full_table_name)}"
+
+#     # initial upload with fail mode
+#     resp = upload_columnar(PARQUET_FILENAME1, PARQUET_UPLOAD_TABLE)
+#     assert success_msg_f1 in resp
+
+#     # upload again with fail mode; should fail
+#     fail_msg = f"Unable to upload Columnar file {escaped_parquet(PARQUET_FILENAME1)} to table {escaped_double_quotes(PARQUET_UPLOAD_TABLE)}"
+#     resp = upload_columnar(PARQUET_FILENAME1, PARQUET_UPLOAD_TABLE)
+#     assert fail_msg in resp
+
+#     if utils.backend() != "hive":
+#         # upload again with append mode
+#         resp = upload_columnar(
+#             PARQUET_FILENAME1, PARQUET_UPLOAD_TABLE, extra={"if_exists": "append"}
+#         )
+#         assert success_msg_f1 in resp
+#         mock_event_logger.assert_called_with(
+#             action="successful_columnar_upload",
+#             database=test_db.name,
+#             schema=schema,
+#             table=PARQUET_UPLOAD_TABLE,
+#         )
+
+#     # upload again with replace mode and specific columns
+#     resp = upload_columnar(
+#         PARQUET_FILENAME1,
+#         PARQUET_UPLOAD_TABLE,
+#         extra={"if_exists": "replace", "usecols": '["a"]'},
+#     )
+#     assert success_msg_f1 in resp
+
+#     table = SupersetTestCase.get_table(name=PARQUET_UPLOAD_TABLE, schema=None)
+#     # make sure only specified column name was read
+#     assert "b" not in table.column_names
+
+#     # ensure user is assigned as an owner
+#     assert security_manager.find_user("admin") in table.owners
+
+#     # upload again with replace mode
+#     resp = upload_columnar(
+#         PARQUET_FILENAME1, PARQUET_UPLOAD_TABLE, extra={"if_exists": "replace"}
+#     )
+#     assert success_msg_f1 in resp
+
+#     with test_db.get_sqla_engine_with_context() as engine:
+#         data = engine.execute(f"SELECT * from {PARQUET_UPLOAD_TABLE}").fetchall()
+#         assert data == [("john", 1), ("paul", 2)]
+
+#     # replace table with zip file
+#     resp = upload_columnar(
+#         ZIP_FILENAME, PARQUET_UPLOAD_TABLE, extra={"if_exists": "replace"}
+#     )
+#     success_msg_f2 = f"Columnar file {escaped_parquet(ZIP_FILENAME)} uploaded to table {escaped_double_quotes(full_table_name)}"
+#     assert success_msg_f2 in resp
+
+#     with test_db.get_sqla_engine_with_context() as engine:
+#         data = engine.execute(f"SELECT * from {PARQUET_UPLOAD_TABLE}").fetchall()
+#         assert data == [("john", 1), ("paul", 2), ("max", 3), ("bob", 4)]


[superset] 05/18: remove tests that don't apply

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit b53325e576bd987cbd3ddd8d9379d18abd97c3b7
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Mon Jun 5 17:06:04 2023 -0700

    remove tests that don't apply
---
 .../integration_tests/databases/commands_tests.py  | 200 ---------------------
 tests/integration_tests/sqllab_tests.py            |  18 +-
 2 files changed, 9 insertions(+), 209 deletions(-)

diff --git a/tests/integration_tests/databases/commands_tests.py b/tests/integration_tests/databases/commands_tests.py
index 7b2d4fdd27..22b5be492d 100644
--- a/tests/integration_tests/databases/commands_tests.py
+++ b/tests/integration_tests/databases/commands_tests.py
@@ -40,7 +40,6 @@ from superset.databases.commands.importers.v1 import ImportDatabasesCommand
 from superset.databases.commands.tables import TablesDatabaseCommand
 from superset.databases.commands.test_connection import TestConnectionDatabaseCommand
 from superset.databases.commands.validate import ValidateDatabaseParametersCommand
-from superset.databases.ssh_tunnel.models import SSHTunnel
 from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
 from superset.exceptions import (
     SupersetErrorsException,
@@ -63,11 +62,6 @@ from tests.integration_tests.fixtures.energy_dashboard import (
 from tests.integration_tests.fixtures.importexport import (
     database_config,
     database_metadata_config,
-    database_with_ssh_tunnel_config_mix_credentials,
-    database_with_ssh_tunnel_config_no_credentials,
-    database_with_ssh_tunnel_config_password,
-    database_with_ssh_tunnel_config_private_key,
-    database_with_ssh_tunnel_config_private_pass_only,
     dataset_config,
     dataset_metadata_config,
 )
@@ -639,200 +633,6 @@ class TestImportDatabasesCommand(SupersetTestCase):
             }
         }
 
-    @patch("superset.databases.schemas.is_feature_enabled")
-    def test_import_v1_database_masked_ssh_tunnel_password(
-        self, mock_schema_is_feature_enabled
-    ):
-        """Test that database imports with masked ssh_tunnel passwords are rejected"""
-        mock_schema_is_feature_enabled.return_value = True
-        masked_database_config = database_with_ssh_tunnel_config_password.copy()
-        contents = {
-            "metadata.yaml": yaml.safe_dump(database_metadata_config),
-            "databases/imported_database.yaml": yaml.safe_dump(masked_database_config),
-        }
-        command = ImportDatabasesCommand(contents)
-        with pytest.raises(CommandInvalidError) as excinfo:
-            command.run()
-        assert str(excinfo.value) == "Error importing database"
-        assert excinfo.value.normalized_messages() == {
-            "databases/imported_database.yaml": {
-                "_schema": ["Must provide a password for the ssh tunnel"]
-            }
-        }
-
-    @patch("superset.databases.schemas.is_feature_enabled")
-    def test_import_v1_database_masked_ssh_tunnel_private_key_and_password(
-        self, mock_schema_is_feature_enabled
-    ):
-        """Test that database imports with masked ssh_tunnel private_key and private_key_password are rejected"""
-        mock_schema_is_feature_enabled.return_value = True
-        masked_database_config = database_with_ssh_tunnel_config_private_key.copy()
-        contents = {
-            "metadata.yaml": yaml.safe_dump(database_metadata_config),
-            "databases/imported_database.yaml": yaml.safe_dump(masked_database_config),
-        }
-        command = ImportDatabasesCommand(contents)
-        with pytest.raises(CommandInvalidError) as excinfo:
-            command.run()
-        assert str(excinfo.value) == "Error importing database"
-        assert excinfo.value.normalized_messages() == {
-            "databases/imported_database.yaml": {
-                "_schema": [
-                    "Must provide a private key for the ssh tunnel",
-                    "Must provide a private key password for the ssh tunnel",
-                ]
-            }
-        }
-
-    @patch("superset.databases.schemas.is_feature_enabled")
-    @patch("superset.security.manager.g")
-    def test_import_v1_database_with_ssh_tunnel_password(
-        self,
-        mock_g,
-        mock_schema_is_feature_enabled,
-    ):
-        """Test that a database with ssh_tunnel password can be imported"""
-        mock_g.user = security_manager.find_user("admin")
-        mock_schema_is_feature_enabled.return_value = True
-        masked_database_config = database_with_ssh_tunnel_config_password.copy()
-        masked_database_config["ssh_tunnel"]["password"] = "TEST"
-        contents = {
-            "metadata.yaml": yaml.safe_dump(database_metadata_config),
-            "databases/imported_database.yaml": yaml.safe_dump(masked_database_config),
-        }
-        command = ImportDatabasesCommand(contents)
-        command.run()
-
-        database = (
-            db.session.query(Database).filter_by(uuid=database_config["uuid"]).one()
-        )
-        assert database.allow_file_upload
-        assert database.allow_ctas
-        assert database.allow_cvas
-        assert database.allow_dml
-        assert not database.allow_run_async
-        assert database.cache_timeout is None
-        assert database.database_name == "imported_database"
-        assert database.expose_in_sqllab
-        assert database.extra == "{}"
-        assert database.sqlalchemy_uri == "sqlite:///test.db"
-
-        model_ssh_tunnel = (
-            db.session.query(SSHTunnel)
-            .filter(SSHTunnel.database_id == database.id)
-            .one()
-        )
-        self.assertEqual(model_ssh_tunnel.password, "TEST")
-
-        db.session.delete(database)
-        db.session.commit()
-
-    @patch("superset.databases.schemas.is_feature_enabled")
-    @patch("superset.security.manager.g")
-    def test_import_v1_database_with_ssh_tunnel_private_key_and_password(
-        self,
-        mock_g,
-        mock_schema_is_feature_enabled,
-    ):
-        """Test that a database with ssh_tunnel private_key and private_key_password can be imported"""
-        mock_g.user = security_manager.find_user("admin")
-
-        mock_schema_is_feature_enabled.return_value = True
-        masked_database_config = database_with_ssh_tunnel_config_private_key.copy()
-        masked_database_config["ssh_tunnel"]["private_key"] = "TestPrivateKey"
-        masked_database_config["ssh_tunnel"]["private_key_password"] = "TEST"
-        contents = {
-            "metadata.yaml": yaml.safe_dump(database_metadata_config),
-            "databases/imported_database.yaml": yaml.safe_dump(masked_database_config),
-        }
-        command = ImportDatabasesCommand(contents)
-        command.run()
-
-        database = (
-            db.session.query(Database).filter_by(uuid=database_config["uuid"]).one()
-        )
-        assert database.allow_file_upload
-        assert database.allow_ctas
-        assert database.allow_cvas
-        assert database.allow_dml
-        assert not database.allow_run_async
-        assert database.cache_timeout is None
-        assert database.database_name == "imported_database"
-        assert database.expose_in_sqllab
-        assert database.extra == "{}"
-        assert database.sqlalchemy_uri == "sqlite:///test.db"
-
-        model_ssh_tunnel = (
-            db.session.query(SSHTunnel)
-            .filter(SSHTunnel.database_id == database.id)
-            .one()
-        )
-        self.assertEqual(model_ssh_tunnel.private_key, "TestPrivateKey")
-        self.assertEqual(model_ssh_tunnel.private_key_password, "TEST")
-
-        db.session.delete(database)
-        db.session.commit()
-
-    @patch("superset.databases.schemas.is_feature_enabled")
-    def test_import_v1_database_masked_ssh_tunnel_no_credentials(
-        self, mock_schema_is_feature_enabled
-    ):
-        """Test that databases with ssh_tunnels that have no credentials are rejected"""
-        mock_schema_is_feature_enabled.return_value = True
-        masked_database_config = database_with_ssh_tunnel_config_no_credentials.copy()
-        contents = {
-            "metadata.yaml": yaml.safe_dump(database_metadata_config),
-            "databases/imported_database.yaml": yaml.safe_dump(masked_database_config),
-        }
-        command = ImportDatabasesCommand(contents)
-        with pytest.raises(CommandInvalidError) as excinfo:
-            command.run()
-        assert str(excinfo.value) == "Must provide credentials for the SSH Tunnel"
-
-    @patch("superset.databases.schemas.is_feature_enabled")
-    def test_import_v1_database_masked_ssh_tunnel_multiple_credentials(
-        self, mock_schema_is_feature_enabled
-    ):
-        """Test that databases with ssh_tunnels that have multiple credentials are rejected"""
-        mock_schema_is_feature_enabled.return_value = True
-        masked_database_config = database_with_ssh_tunnel_config_mix_credentials.copy()
-        contents = {
-            "metadata.yaml": yaml.safe_dump(database_metadata_config),
-            "databases/imported_database.yaml": yaml.safe_dump(masked_database_config),
-        }
-        command = ImportDatabasesCommand(contents)
-        with pytest.raises(CommandInvalidError) as excinfo:
-            command.run()
-        assert (
-            str(excinfo.value) == "Cannot have multiple credentials for the SSH Tunnel"
-        )
-
-    @patch("superset.databases.schemas.is_feature_enabled")
-    def test_import_v1_database_masked_ssh_tunnel_only_priv_key_psswd(
-        self, mock_schema_is_feature_enabled
-    ):
-        """Test that databases with ssh_tunnels that have multiple credentials are rejected"""
-        mock_schema_is_feature_enabled.return_value = True
-        masked_database_config = (
-            database_with_ssh_tunnel_config_private_pass_only.copy()
-        )
-        contents = {
-            "metadata.yaml": yaml.safe_dump(database_metadata_config),
-            "databases/imported_database.yaml": yaml.safe_dump(masked_database_config),
-        }
-        command = ImportDatabasesCommand(contents)
-        with pytest.raises(CommandInvalidError) as excinfo:
-            command.run()
-        assert str(excinfo.value) == "Error importing database"
-        assert excinfo.value.normalized_messages() == {
-            "databases/imported_database.yaml": {
-                "_schema": [
-                    "Must provide a private key for the ssh tunnel",
-                    "Must provide a private key password for the ssh tunnel",
-                ]
-            }
-        }
-
     @patch("superset.databases.commands.importers.v1.import_dataset")
     def test_import_v1_rollback(self, mock_import_dataset):
         """Test than on an exception everything is rolled back"""
diff --git a/tests/integration_tests/sqllab_tests.py b/tests/integration_tests/sqllab_tests.py
index 843057bb69..57b46f7bbc 100644
--- a/tests/integration_tests/sqllab_tests.py
+++ b/tests/integration_tests/sqllab_tests.py
@@ -758,15 +758,15 @@ class TestSqlLab(SupersetTestCase):
         {"ENABLE_TEMPLATE_PROCESSING": True},
         clear=True,
     )
-    # def test_sql_json_parameter_forbidden(self):
-    #     self.login("gamma")
-
-    #     data = self.run_sql(
-    #         "SELECT name FROM {{ table }} LIMIT 10",
-    #         "4",
-    #         template_params=json.dumps({"table": "birth_names"}),
-    #     )
-    #     assert data["errors"][0]["error_type"] == "GENERIC_BACKEND_ERROR"
+    def test_sql_json_parameter_forbidden(self):
+        self.login("gamma")
+
+        data = self.run_sql(
+            "SELECT name FROM {{ table }} LIMIT 10",
+            "4",
+            template_params=json.dumps({"table": "birth_names"}),
+        )
+        assert data["message"] == "Forbidden"
 
     @mock.patch("superset.sql_lab.get_query")
     @mock.patch("superset.sql_lab.execute_sql_statement")


[superset] 08/18: fix: db validate parameters permission (#24185)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit e804489a89b928e387af279b09033e5fc4f2458d
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Mon Jun 5 13:06:00 2023 +0100

    fix: db validate parameters permission (#24185)
---
 UPDATING.md           | 1 +
 superset/constants.py | 4 ++--
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 1810a22306..f71d884091 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -23,6 +23,7 @@ This file documents any backwards-incompatible changes in Superset and
 assists people when migrating to a new version.
 
 ## 2.1.1
+- [24185](https://github.com/apache/superset/pull/24185): `/api/v1/database/test_connection` and `/api/v1/database/validate_parameters` permissions changed from `can_read` to `can_write`. Only Admin users have access.
 
 ### Other
 
diff --git a/superset/constants.py b/superset/constants.py
index cdbce050d3..c3190ce1bf 100644
--- a/superset/constants.py
+++ b/superset/constants.py
@@ -125,8 +125,8 @@ MODEL_API_RW_METHOD_PERMISSION_MAP = {
     "select_star": "read",
     "table_metadata": "read",
     "table_extra_metadata": "read",
-    "test_connection": "read",
-    "validate_parameters": "read",
+    "test_connection": "write",
+    "validate_parameters": "write",
     "favorite_status": "read",
     "thumbnail": "read",
     "import_": "write",


[superset] 07/18: chore: update UPDATING for 2.1.0 (#24294)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 8d32525f9722814c1c9ba654eb7855d4e1fd4fde
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Mon Jun 5 12:56:19 2023 -0700

    chore: update UPDATING for 2.1.0 (#24294)
---
 UPDATING.md | 7 ++++++-
 1 file changed, 6 insertions(+), 1 deletion(-)

diff --git a/UPDATING.md b/UPDATING.md
index 15669d0e25..1810a22306 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -22,6 +22,10 @@ under the License.
 This file documents any backwards-incompatible changes in Superset and
 assists people when migrating to a new version.
 
+## 2.1.1
+
+### Other
+
 ## 2.1.0
 
 - [22809](https://github.com/apache/superset/pull/22809): Migrated endpoint `/superset/sql_json` and `/superset/results/` to `/api/v1/sqllab/execute/` and `/api/v1/sqllab/results/` respectively. Corresponding permissions are `can sql_json on Superset` to `can execute on SQLLab`, `can results on Superset` to `can results on SQLLab`. Make sure you add/replace the necessary permissions on any custom roles you may have.
@@ -30,7 +34,6 @@ assists people when migrating to a new version.
 - [22789](https://github.com/apache/superset/pull/22789): Migrated endpoint `/superset/recent_activity/<user_id>/` to `/api/v1/log/recent_activity/<user_id>/`. Corresponding permissions are `can recent activity on Superset` to `can recent activity on Log`. Make sure you add/replace the necessary permissions on any custom roles you may have.
 - [22913](https://github.com/apache/superset/pull/22913): Migrated endpoint `/superset/csv` to `/api/v1/sqllab/export/`. Corresponding permissions are `can csv on Superset` to `can export csv on SQLLab`. Make sure you add/replace the necessary permissions on any custom roles you may have.
 - [22496](https://github.com/apache/superset/pull/22496): Migrated endpoint `/superset/slice_json/<int:layer_id>` to `/api/v1/chart/<int:id>/data/`. Corresponding permissions are `can slice json on Superset` to `can read on Chart`. Make sure you add/replace the necessary permissions on any custom roles you may have.
-- [22496](https://github.com/apache/superset/pull/22496): Migrated endpoint `/superset/annotation_json/<int:layer_id>` to `/api/v1/chart/<int:id>/data/`. Corresponding permissions are `can annotation json on Superset` to `can read on Chart`. Make sure you add/replace the necessary permissions on any custom roles you may have.
 - [22624](https://github.com/apache/superset/pull/22624): Migrated endpoint `/superset/stop_query/` to `/api/v1/query/stop`. Corresponding permissions are `can stop query on Superset` to `can read on Query`. Make sure you add/replace the necessary permissions on any custom roles you may have.
 - [22579](https://github.com/apache/superset/pull/22579): Migrated endpoint `/superset/search_queries/` to `/api/v1/query/`. Corresponding permissions are `can search queries on Superset` to `can read on Query`. Make sure you add/replace the necessary permissions on any custom roles you may have.
 - [22501](https://github.com/apache/superset/pull/22501): Migrated endpoint `/superset/tables/<int:db_id>/<schema>/` to `/api/v1/database/<int:id>/tables/`. Corresponding permissions are `can tables on Superset` to `can read on Database`. Make sure you add/replace the necessary permissions on any custom roles you may have.
@@ -57,6 +60,8 @@ assists people when migrating to a new version.
 
 - [23118](https://github.com/apache/superset/pull/23118): Previously the "database access on <database>" permission granted access to all datasets on the underlying database, but they didn't show up on the list views. Now all dashboards, charts and datasets that are accessible via this permission will also show up on their respective list views.
 
+
+
 ## 2.0.1
 
 - [21895](https://github.com/apache/superset/pull/21895): Markdown components had their security increased by adhering to the same sanitization process enforced by Github. This means that some HTML elements found in markdowns are not allowed anymore due to the security risks they impose. If you're deploying Superset in a trusted environment and wish to use some of the blocked elements, then you can use the HTML_SANITIZATION_SCHEMA_EXTENSIONS configuration to extend the default sanitizati [...]


[superset] 12/18: lint

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit cfb4d27d8c2a9d5a074173fbd4b9b646f84988f8
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Tue Jun 6 15:58:54 2023 -0700

    lint
---
 superset/config.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/superset/config.py b/superset/config.py
index a48fa191fe..6f1d9634e2 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -1587,7 +1587,7 @@ elif importlib.util.find_spec("superset_config") and not is_test():
     try:
         # pylint: disable=import-error,wildcard-import,unused-wildcard-import
         import superset_config
-        from superset_config import *
+        from superset_config import *  # noqa
 
         print(f"Loaded your LOCAL configuration at [{superset_config.__file__}]")
     except Exception:


[superset] 06/18: chore: Remove unnecessary information from response (#24056)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 831cd9b0304438c21d7f6529cebc37087cb90656
Author: Geido <60...@users.noreply.github.com>
AuthorDate: Wed May 17 20:07:47 2023 +0300

    chore: Remove unnecessary information from response (#24056)
---
 superset/charts/api.py                          |   2 -
 superset/connectors/sqla/models.py              |   7 +-
 superset/dashboards/api.py                      |   3 -
 superset/dashboards/schemas.py                  |   4 +-
 superset/datasets/api.py                        |   4 +-
 superset/models/dashboard.py                    |   6 +-
 superset/models/filter_set.py                   |   6 +-
 superset/models/slice.py                        |   8 +-
 superset/queries/api.py                         |   1 -
 superset/queries/schemas.py                     |   2 +-
 superset/tags/schemas.py                        |  59 +++++++++++++
 tests/integration_tests/charts/api_tests.py     | 109 +++++++++++++++++++++++-
 tests/integration_tests/dashboards/api_tests.py | 106 ++++++++++++++++++++++-
 tests/integration_tests/datasets/api_tests.py   | 102 ++++++++++++++++++++++
 tests/integration_tests/queries/api_tests.py    |   1 -
 tests/integration_tests/sqllab_tests.py         |   8 +-
 16 files changed, 404 insertions(+), 24 deletions(-)

diff --git a/superset/charts/api.py b/superset/charts/api.py
index 7dc6d5e1e8..88d74f875e 100644
--- a/superset/charts/api.py
+++ b/superset/charts/api.py
@@ -129,7 +129,6 @@ class ChartRestApi(BaseSupersetModelRestApi):
         "owners.first_name",
         "owners.id",
         "owners.last_name",
-        "owners.username",
         "dashboards.id",
         "dashboards.dashboard_title",
         "params",
@@ -171,7 +170,6 @@ class ChartRestApi(BaseSupersetModelRestApi):
         "owners.first_name",
         "owners.id",
         "owners.last_name",
-        "owners.username",
         "dashboards.id",
         "dashboards.dashboard_title",
         "params",
diff --git a/superset/connectors/sqla/models.py b/superset/connectors/sqla/models.py
index 95f9121102..7b2f1999f9 100644
--- a/superset/connectors/sqla/models.py
+++ b/superset/connectors/sqla/models.py
@@ -44,7 +44,7 @@ import numpy as np
 import pandas as pd
 import sqlalchemy as sa
 import sqlparse
-from flask import escape, Markup
+from flask import current_app, escape, Markup
 from flask_appbuilder import Model
 from flask_babel import lazy_gettext as _
 from jinja2.exceptions import TemplateError
@@ -655,7 +655,10 @@ class SqlaTable(Model, BaseDatasource):  # pylint: disable=too-many-public-metho
 
     @property
     def changed_by_url(self) -> str:
-        if not self.changed_by:
+        if (
+            not self.changed_by
+            or not current_app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        ):
             return ""
         return f"/superset/profile/{self.changed_by.username}"
 
diff --git a/superset/dashboards/api.py b/superset/dashboards/api.py
index 64ea637c66..1a476a0a97 100644
--- a/superset/dashboards/api.py
+++ b/superset/dashboards/api.py
@@ -167,7 +167,6 @@ class DashboardRestApi(BaseSupersetModelRestApi):
         "certification_details",
         "changed_by.first_name",
         "changed_by.last_name",
-        "changed_by.username",
         "changed_by.id",
         "changed_by_name",
         "changed_by_url",
@@ -179,10 +178,8 @@ class DashboardRestApi(BaseSupersetModelRestApi):
         "created_by.last_name",
         "dashboard_title",
         "owners.id",
-        "owners.username",
         "owners.first_name",
         "owners.last_name",
-        "owners.email",
         "roles.id",
         "roles.name",
         "is_managed_externally",
diff --git a/superset/dashboards/schemas.py b/superset/dashboards/schemas.py
index f0d05445aa..b527947ead 100644
--- a/superset/dashboards/schemas.py
+++ b/superset/dashboards/schemas.py
@@ -163,10 +163,10 @@ class DashboardGetResponseSchema(Schema):
     certification_details = fields.String(description=certification_details_description)
     changed_by_name = fields.String()
     changed_by_url = fields.String()
-    changed_by = fields.Nested(UserSchema)
+    changed_by = fields.Nested(UserSchema(exclude=(["username"])))
     changed_on = fields.DateTime()
     charts = fields.List(fields.String(description=charts_description))
-    owners = fields.List(fields.Nested(UserSchema))
+    owners = fields.List(fields.Nested(UserSchema(exclude=(["username"]))))
     roles = fields.List(fields.Nested(RolesSchema))
     changed_on_humanized = fields.String(data_key="changed_on_delta_humanized")
     is_managed_externally = fields.Boolean(allow_none=True, default=False)
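
The `UserSchema(exclude=(...))` pattern above is plain marshmallow: passing `exclude` to a nested schema instance drops fields from serialized output without touching the model. A self-contained sketch with assumed schema fields (illustrative, not Superset's actual schemas):

```python
from marshmallow import Schema, fields

class UserSchema(Schema):
    id = fields.Int()
    first_name = fields.String()
    last_name = fields.String()
    username = fields.String()

class DashboardGetResponseSchema(Schema):
    changed_by = fields.Nested(UserSchema(exclude=("username",)))

payload = DashboardGetResponseSchema().dump(
    {"changed_by": {"id": 1, "first_name": "Ada",
                    "last_name": "Lovelace", "username": "ada"}}
)
print(payload)  # username is gone from the nested user
```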
diff --git a/superset/datasets/api.py b/superset/datasets/api.py
index d58a1dd3f6..16975675e6 100644
--- a/superset/datasets/api.py
+++ b/superset/datasets/api.py
@@ -103,7 +103,7 @@ class DatasetRestApi(BaseSupersetModelRestApi):
         "changed_by_name",
         "changed_by_url",
         "changed_by.first_name",
-        "changed_by.username",
+        "changed_by.last_name",
         "changed_on_utc",
         "changed_on_delta_humanized",
         "default_endpoint",
@@ -113,7 +113,6 @@ class DatasetRestApi(BaseSupersetModelRestApi):
         "extra",
         "kind",
         "owners.id",
-        "owners.username",
         "owners.first_name",
         "owners.last_name",
         "schema",
@@ -146,7 +145,6 @@ class DatasetRestApi(BaseSupersetModelRestApi):
         "template_params",
         "select_star",
         "owners.id",
-        "owners.username",
         "owners.first_name",
         "owners.last_name",
         "columns.advanced_data_type",
diff --git a/superset/models/dashboard.py b/superset/models/dashboard.py
index 0e0bf56f58..60a8ea0e30 100644
--- a/superset/models/dashboard.py
+++ b/superset/models/dashboard.py
@@ -23,6 +23,7 @@ from functools import partial
 from typing import Any, Callable, Dict, List, Optional, Set, Tuple, Type, Union
 
 import sqlalchemy as sqla
+from flask import current_app
 from flask_appbuilder import Model
 from flask_appbuilder.models.decorators import renders
 from flask_appbuilder.security.sqla.models import User
@@ -264,7 +265,10 @@ class Dashboard(Model, AuditMixinNullable, ImportExportMixin):
 
     @property
     def changed_by_url(self) -> str:
-        if not self.changed_by:
+        if (
+            not self.changed_by
+            or not current_app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        ):
             return ""
         return f"/superset/profile/{self.changed_by.username}"
 
diff --git a/superset/models/filter_set.py b/superset/models/filter_set.py
index 4bbef26490..1ace5bca32 100644
--- a/superset/models/filter_set.py
+++ b/superset/models/filter_set.py
@@ -20,6 +20,7 @@ import json
 import logging
 from typing import Any, Dict
 
+from flask import current_app
 from flask_appbuilder import Model
 from sqlalchemy import Column, ForeignKey, Integer, MetaData, String, Text
 from sqlalchemy.orm import relationship
@@ -67,7 +68,10 @@ class FilterSet(Model, AuditMixinNullable):
 
     @property
     def changed_by_url(self) -> str:
-        if not self.changed_by:
+        if (
+            not self.changed_by
+            or not current_app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        ):
             return ""
         return f"/superset/profile/{self.changed_by.username}"
 
diff --git a/superset/models/slice.py b/superset/models/slice.py
index 54429133d3..33fed84c90 100644
--- a/superset/models/slice.py
+++ b/superset/models/slice.py
@@ -22,6 +22,7 @@ from typing import Any, Dict, Optional, Type, TYPE_CHECKING
 from urllib import parse
 
 import sqlalchemy as sqla
+from flask import current_app
 from flask_appbuilder import Model
 from flask_appbuilder.models.decorators import renders
 from markupsafe import escape, Markup
@@ -326,7 +327,12 @@ class Slice(  # pylint: disable=too-many-public-methods
 
     @property
     def changed_by_url(self) -> str:
-        return f"/superset/profile/{self.changed_by.username}"  # type: ignore
+        if (
+            not self.changed_by
+            or not current_app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        ):
+            return ""
+        return f"/superset/profile/{self.changed_by.username}"
 
     @property
     def icons(self) -> str:
diff --git a/superset/queries/api.py b/superset/queries/api.py
index b5737ea812..e6209a8d62 100644
--- a/superset/queries/api.py
+++ b/superset/queries/api.py
@@ -82,7 +82,6 @@ class QueryRestApi(BaseSupersetModelRestApi):
         "user.first_name",
         "user.id",
         "user.last_name",
-        "user.username",
         "start_time",
         "end_time",
         "tmp_table_name",
diff --git a/superset/queries/schemas.py b/superset/queries/schemas.py
index c29c1c03b6..b139784c5b 100644
--- a/superset/queries/schemas.py
+++ b/superset/queries/schemas.py
@@ -65,7 +65,7 @@ class QuerySchema(Schema):
     tab_name = fields.String()
     tmp_table_name = fields.String()
     tracking_url = fields.String()
-    user = fields.Nested(UserSchema)
+    user = fields.Nested(UserSchema(exclude=["username"]))
 
     class Meta:  # pylint: disable=too-few-public-methods
         model = Query
diff --git a/superset/tags/schemas.py b/superset/tags/schemas.py
new file mode 100644
index 0000000000..71ab005bbc
--- /dev/null
+++ b/superset/tags/schemas.py
@@ -0,0 +1,59 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from marshmallow import fields, Schema
+
+from superset.dashboards.schemas import UserSchema
+
+delete_tags_schema = {"type": "array", "items": {"type": "string"}}
+
+object_type_description = "A title for the tag."
+
+openapi_spec_methods_override = {
+    "get": {"get": {"description": "Get a tag detail information."}},
+    "get_list": {
+        "get": {
+            "description": "Get a list of tags, use Rison or JSON query "
+            "parameters for filtering, sorting, pagination and "
+            " for selecting specific columns and metadata.",
+        }
+    },
+    "info": {
+        "get": {
+            "description": "Several metadata information about tag API " "endpoints.",
+        }
+    },
+}
+
+
+class TaggedObjectEntityResponseSchema(Schema):
+    id = fields.Int()
+    type = fields.String()
+    name = fields.String()
+    url = fields.String()
+    changed_on = fields.DateTime()
+    created_by = fields.Nested(UserSchema(exclude=["username"]))
+    creator = fields.String()
+
+
+class TagGetResponseSchema(Schema):
+    id = fields.Int()
+    name = fields.String()
+    type = fields.String()
+
+
+class TagPostSchema(Schema):
+    tags = fields.List(fields.String())
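
If the new schemas behave like standard marshmallow schemas (an assumption; the usage below is illustrative and not part of the diff):

    payload = TagPostSchema().load({"tags": ["finance", "certified"]})
    # -> {'tags': ['finance', 'certified']}
    tag = TagGetResponseSchema().dump({"id": 7, "name": "finance", "type": "custom"})
    # -> {'id': 7, 'name': 'finance', 'type': 'custom'}
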
diff --git a/tests/integration_tests/charts/api_tests.py b/tests/integration_tests/charts/api_tests.py
index 38fa1b7a6c..02c5ce261e 100644
--- a/tests/integration_tests/charts/api_tests.py
+++ b/tests/integration_tests/charts/api_tests.py
@@ -605,6 +605,114 @@ class TestChartApi(SupersetTestCase, ApiOwnersTestCaseMixin, InsertChartMixin):
         db.session.delete(model)
         db.session.commit()
 
+    @pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
+    def test_chart_activity_access_disabled(self):
+        """
+        Chart API: Test ENABLE_BROAD_ACTIVITY_ACCESS = False
+        """
+        access_flag = app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = False
+        admin = self.get_user("admin")
+        birth_names_table_id = SupersetTestCase.get_table(name="birth_names").id
+        chart_id = self.insert_chart("title", [admin.id], birth_names_table_id).id
+        chart_data = {
+            "slice_name": (new_name := "title1_changed"),
+        }
+        self.login(username="admin")
+        uri = f"api/v1/chart/{chart_id}"
+        rv = self.put_assert_metric(uri, chart_data, "put")
+        self.assertEqual(rv.status_code, 200)
+        model = db.session.query(Slice).get(chart_id)
+
+        self.assertEqual(model.slice_name, new_name)
+        self.assertEqual(model.changed_by_url, "")
+
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = access_flag
+        db.session.delete(model)
+        db.session.commit()
+
+    @pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
+    def test_chart_activity_access_enabled(self):
+        """
+        Chart API: Test ENABLE_BROAD_ACTIVITY_ACCESS = True
+        """
+        access_flag = app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = True
+        admin = self.get_user("admin")
+        birth_names_table_id = SupersetTestCase.get_table(name="birth_names").id
+        chart_id = self.insert_chart("title", [admin.id], birth_names_table_id).id
+        chart_data = {
+            "slice_name": (new_name := "title1_changed"),
+        }
+        self.login(username="admin")
+        uri = f"api/v1/chart/{chart_id}"
+        rv = self.put_assert_metric(uri, chart_data, "put")
+        self.assertEqual(rv.status_code, 200)
+        model = db.session.query(Slice).get(chart_id)
+
+        self.assertEqual(model.slice_name, new_name)
+        self.assertEqual(model.changed_by_url, "/superset/profile/admin")
+
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = access_flag
+        db.session.delete(model)
+        db.session.commit()
+
+    @pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
+    def test_chart_get_list_no_username(self):
+        """
+        Chart API: Tests that no username is returned
+        """
+        admin = self.get_user("admin")
+        birth_names_table_id = SupersetTestCase.get_table(name="birth_names").id
+        chart_id = self.insert_chart("title", [admin.id], birth_names_table_id).id
+        chart_data = {
+            "slice_name": (new_name := "title1_changed"),
+            "owners": [admin.id],
+        }
+        self.login(username="admin")
+        uri = f"api/v1/chart/{chart_id}"
+        rv = self.put_assert_metric(uri, chart_data, "put")
+        self.assertEqual(rv.status_code, 200)
+        model = db.session.query(Slice).get(chart_id)
+
+        response = self.get_assert_metric("api/v1/chart/", "get_list")
+        res = json.loads(response.data.decode("utf-8"))["result"]
+
+        current_chart = [d for d in res if d["id"] == chart_id][0]
+        self.assertEqual(current_chart["slice_name"], new_name)
+        self.assertNotIn("username", current_chart["changed_by"].keys())
+        self.assertNotIn("username", current_chart["owners"][0].keys())
+
+        db.session.delete(model)
+        db.session.commit()
+
+    @pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
+    def test_chart_get_no_username(self):
+        """
+        Chart API: Tests that no username is returned
+        """
+        admin = self.get_user("admin")
+        birth_names_table_id = SupersetTestCase.get_table(name="birth_names").id
+        chart_id = self.insert_chart("title", [admin.id], birth_names_table_id).id
+        chart_data = {
+            "slice_name": (new_name := "title1_changed"),
+            "owners": [admin.id],
+        }
+        self.login(username="admin")
+        uri = f"api/v1/chart/{chart_id}"
+        rv = self.put_assert_metric(uri, chart_data, "put")
+        self.assertEqual(rv.status_code, 200)
+        model = db.session.query(Slice).get(chart_id)
+
+        response = self.get_assert_metric(uri, "get")
+        res = json.loads(response.data.decode("utf-8"))["result"]
+
+        self.assertEqual(res["slice_name"], new_name)
+        self.assertNotIn("username", res["owners"][0].keys())
+
+        db.session.delete(model)
+        db.session.commit()
+
     def test_update_chart_new_owner_not_admin(self):
         """
         Chart API: Test update set new owner implicitly adds logged in owner
@@ -823,7 +931,6 @@ class TestChartApi(SupersetTestCase, ApiOwnersTestCaseMixin, InsertChartMixin):
             "owners": [
                 {
                     "id": 1,
-                    "username": "admin",
                     "first_name": "admin",
                     "last_name": "user",
                 }
diff --git a/tests/integration_tests/dashboards/api_tests.py b/tests/integration_tests/dashboards/api_tests.py
index 725811ce5f..c9d25b679c 100644
--- a/tests/integration_tests/dashboards/api_tests.py
+++ b/tests/integration_tests/dashboards/api_tests.py
@@ -31,7 +31,7 @@ import yaml
 
 from freezegun import freeze_time
 from sqlalchemy import and_
-from superset import db, security_manager
+from superset import app, db, security_manager
 from superset.models.dashboard import Dashboard
 from superset.models.core import FavStar, FavStarClassName
 from superset.reports.models import ReportSchedule, ReportScheduleType
@@ -424,7 +424,6 @@ class TestDashboardApi(SupersetTestCase, ApiOwnersTestCaseMixin, InsertChartMixi
             "owners": [
                 {
                     "id": 1,
-                    "username": "admin",
                     "first_name": "admin",
                     "last_name": "user",
                 }
@@ -1298,6 +1297,109 @@ class TestDashboardApi(SupersetTestCase, ApiOwnersTestCaseMixin, InsertChartMixi
         db.session.delete(model)
         db.session.commit()
 
+    def test_dashboard_activity_access_disabled(self):
+        """
+        Dashboard API: Test ENABLE_BROAD_ACTIVITY_ACCESS = False
+        """
+        access_flag = app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = False
+        admin = self.get_user("admin")
+        admin_role = self.get_role("Admin")
+        dashboard_id = self.insert_dashboard(
+            "title1", "slug1", [admin.id], roles=[admin_role.id]
+        ).id
+        self.login(username="admin")
+        uri = f"api/v1/dashboard/{dashboard_id}"
+        dashboard_data = {"dashboard_title": "title2"}
+        rv = self.client.put(uri, json=dashboard_data)
+        self.assertEqual(rv.status_code, 200)
+        model = db.session.query(Dashboard).get(dashboard_id)
+
+        self.assertEqual(model.dashboard_title, "title2")
+        self.assertEqual(model.changed_by_url, "")
+
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = access_flag
+        db.session.delete(model)
+        db.session.commit()
+
+    def test_dashboard_activity_access_enabled(self):
+        """
+        Dashboard API: Test ENABLE_BROAD_ACTIVITY_ACCESS = True
+        """
+        access_flag = app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = True
+        admin = self.get_user("admin")
+        admin_role = self.get_role("Admin")
+        dashboard_id = self.insert_dashboard(
+            "title1", "slug1", [admin.id], roles=[admin_role.id]
+        ).id
+        self.login(username="admin")
+        uri = f"api/v1/dashboard/{dashboard_id}"
+        dashboard_data = {"dashboard_title": "title2"}
+        rv = self.client.put(uri, json=dashboard_data)
+        self.assertEqual(rv.status_code, 200)
+        model = db.session.query(Dashboard).get(dashboard_id)
+
+        self.assertEqual(model.dashboard_title, "title2")
+        self.assertEqual(model.changed_by_url, "/superset/profile/admin")
+
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = access_flag
+        db.session.delete(model)
+        db.session.commit()
+
+    def test_dashboard_get_list_no_username(self):
+        """
+        Dashboard API: Tests that no username is returned
+        """
+        admin = self.get_user("admin")
+        admin_role = self.get_role("Admin")
+        dashboard_id = self.insert_dashboard(
+            "title1", "slug1", [admin.id], roles=[admin_role.id]
+        ).id
+        model = db.session.query(Dashboard).get(dashboard_id)
+        self.login(username="admin")
+        uri = f"api/v1/dashboard/{dashboard_id}"
+        dashboard_data = {"dashboard_title": "title2"}
+        rv = self.client.put(uri, json=dashboard_data)
+        self.assertEqual(rv.status_code, 200)
+
+        response = self.get_assert_metric("api/v1/dashboard/", "get_list")
+        res = json.loads(response.data.decode("utf-8"))["result"]
+
+        current_dash = [d for d in res if d["id"] == dashboard_id][0]
+        self.assertEqual(current_dash["dashboard_title"], "title2")
+        self.assertNotIn("username", current_dash["changed_by"].keys())
+        self.assertNotIn("username", current_dash["owners"][0].keys())
+
+        db.session.delete(model)
+        db.session.commit()
+
+    def test_dashboard_get_no_username(self):
+        """
+        Dashboard API: Tests that no username is returned
+        """
+        admin = self.get_user("admin")
+        admin_role = self.get_role("Admin")
+        dashboard_id = self.insert_dashboard(
+            "title1", "slug1", [admin.id], roles=[admin_role.id]
+        ).id
+        model = db.session.query(Dashboard).get(dashboard_id)
+        self.login(username="admin")
+        uri = f"api/v1/dashboard/{dashboard_id}"
+        dashboard_data = {"dashboard_title": "title2"}
+        rv = self.client.put(uri, json=dashboard_data)
+        self.assertEqual(rv.status_code, 200)
+
+        response = self.get_assert_metric(uri, "get")
+        res = json.loads(response.data.decode("utf-8"))["result"]
+
+        self.assertEqual(res["dashboard_title"], "title2")
+        self.assertNotIn("username", res["changed_by"].keys())
+        self.assertNotIn("username", res["owners"][0].keys())
+
+        db.session.delete(model)
+        db.session.commit()
+
     @pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
     def test_update_dashboard_chart_owners(self):
         """
diff --git a/tests/integration_tests/datasets/api_tests.py b/tests/integration_tests/datasets/api_tests.py
index 8071902c45..cd5ecec13f 100644
--- a/tests/integration_tests/datasets/api_tests.py
+++ b/tests/integration_tests/datasets/api_tests.py
@@ -28,6 +28,7 @@ import yaml
 from sqlalchemy.orm import joinedload
 from sqlalchemy.sql import func
 
+from superset import app
 from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn
 from superset.dao.exceptions import (
     DAOCreateFailedError,
@@ -1294,6 +1295,107 @@ class TestDatasetApi(SupersetTestCase):
         db.session.delete(dataset)
         db.session.commit()
 
+    def test_dataset_get_list_no_username(self):
+        """
+        Dataset API: Tests that no username is returned
+        """
+        if backend() == "sqlite":
+            return
+
+        dataset = self.insert_default_dataset()
+        self.login(username="admin")
+        table_data = {"description": "changed_description"}
+        uri = f"api/v1/dataset/{dataset.id}"
+        rv = self.client.put(uri, json=table_data)
+        self.assertEqual(rv.status_code, 200)
+
+        response = self.get_assert_metric("api/v1/dataset/", "get_list")
+        res = json.loads(response.data.decode("utf-8"))["result"]
+
+        current_dataset = [d for d in res if d["id"] == dataset.id][0]
+        self.assertEqual(current_dataset["description"], "changed_description")
+        self.assertNotIn("username", current_dataset["changed_by"].keys())
+
+        db.session.delete(dataset)
+        db.session.commit()
+
+    def test_dataset_get_no_username(self):
+        """
+        Dataset API: Tests that no username is returned
+        """
+        if backend() == "sqlite":
+            return
+
+        dataset = self.insert_default_dataset()
+        self.login(username="admin")
+        table_data = {"description": "changed_description"}
+        uri = f"api/v1/dataset/{dataset.id}"
+        rv = self.client.put(uri, json=table_data)
+        self.assertEqual(rv.status_code, 200)
+
+        response = self.get_assert_metric(uri, "get")
+        res = json.loads(response.data.decode("utf-8"))["result"]
+
+        self.assertEqual(res["description"], "changed_description")
+        self.assertNotIn("username", res["changed_by"].keys())
+
+        db.session.delete(dataset)
+        db.session.commit()
+
+    def test_dataset_activity_access_enabled(self):
+        """
+        Dataset API: Test ENABLE_BROAD_ACTIVITY_ACCESS = True
+        """
+        if backend() == "sqlite":
+            return
+
+        access_flag = app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = True
+        dataset = self.insert_default_dataset()
+        self.login(username="admin")
+        table_data = {"description": "changed_description"}
+        uri = f"api/v1/dataset/{dataset.id}"
+        rv = self.client.put(uri, json=table_data)
+        self.assertEqual(rv.status_code, 200)
+
+        response = self.get_assert_metric("api/v1/dataset/", "get_list")
+        res = json.loads(response.data.decode("utf-8"))["result"]
+
+        current_dataset = [d for d in res if d["id"] == dataset.id][0]
+        self.assertEqual(current_dataset["description"], "changed_description")
+        self.assertEqual(current_dataset["changed_by_url"], "/superset/profile/admin")
+
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = access_flag
+        db.session.delete(dataset)
+        db.session.commit()
+
+    def test_dataset_activity_access_disabled(self):
+        """
+        Dataset API: Test ENABLE_BROAD_ACTIVITY_ACCESS = False
+        """
+        if backend() == "sqlite":
+            return
+
+        access_flag = app.config["ENABLE_BROAD_ACTIVITY_ACCESS"]
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = False
+        dataset = self.insert_default_dataset()
+        self.login(username="admin")
+        table_data = {"description": "changed_description"}
+        uri = f"api/v1/dataset/{dataset.id}"
+        rv = self.put_assert_metric(uri, table_data, "put")
+        self.assertEqual(rv.status_code, 200)
+
+        response = self.get_assert_metric("api/v1/dataset/", "get_list")
+        res = json.loads(response.data.decode("utf-8"))["result"]
+
+        current_dataset = [d for d in res if d["id"] == dataset.id][0]
+        self.assertEqual(current_dataset["description"], "changed_description")
+        self.assertEqual(current_dataset["changed_by_url"], "")
+
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = access_flag
+        db.session.delete(dataset)
+        db.session.commit()
+
     def test_update_dataset_item_not_owned(self):
         """
         Dataset API: Test update dataset item not owned
diff --git a/tests/integration_tests/queries/api_tests.py b/tests/integration_tests/queries/api_tests.py
index 7abcb31df1..b3b291cf96 100644
--- a/tests/integration_tests/queries/api_tests.py
+++ b/tests/integration_tests/queries/api_tests.py
@@ -285,7 +285,6 @@ class TestQueryApi(SupersetTestCase):
             "first_name",
             "id",
             "last_name",
-            "username",
         ]
         assert list(data["result"][0]["database"].keys()) == [
             "database_name",
diff --git a/tests/integration_tests/sqllab_tests.py b/tests/integration_tests/sqllab_tests.py
index 57b46f7bbc..f8e650c102 100644
--- a/tests/integration_tests/sqllab_tests.py
+++ b/tests/integration_tests/sqllab_tests.py
@@ -630,9 +630,11 @@ class TestSqlLab(SupersetTestCase):
         admin = security_manager.find_user("admin")
         gamma_sqllab = security_manager.find_user("gamma_sqllab")
         self.assertEqual(3, len(data["result"]))
-        user_queries = [result.get("user").get("username") for result in data["result"]]
-        assert admin.username in user_queries
-        assert gamma_sqllab.username in user_queries
+        user_queries = [
+            result.get("user").get("first_name") for result in data["result"]
+        ]
+        assert admin.first_name in user_queries
+        assert gamma_sqllab.first_name in user_queries
 
     def test_query_api_can_access_all_queries(self) -> None:
         """


[superset] 02/18: remove blocking test from release

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 483195ad70535a5e50c706b4b99eb23756ba4e54
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Fri Jun 2 17:10:55 2023 -0700

    remove blocking test from release
---
 tests/integration_tests/sqllab_tests.py | 18 +++++++++---------
 1 file changed, 9 insertions(+), 9 deletions(-)

diff --git a/tests/integration_tests/sqllab_tests.py b/tests/integration_tests/sqllab_tests.py
index aa15308e92..843057bb69 100644
--- a/tests/integration_tests/sqllab_tests.py
+++ b/tests/integration_tests/sqllab_tests.py
@@ -758,15 +758,15 @@ class TestSqlLab(SupersetTestCase):
         {"ENABLE_TEMPLATE_PROCESSING": True},
         clear=True,
     )
-    def test_sql_json_parameter_forbidden(self):
-        self.login("gamma")
-
-        data = self.run_sql(
-            "SELECT name FROM {{ table }} LIMIT 10",
-            "4",
-            template_params=json.dumps({"table": "birth_names"}),
-        )
-        assert data["errors"][0]["error_type"] == "GENERIC_BACKEND_ERROR"
+    # def test_sql_json_parameter_forbidden(self):
+    #     self.login("gamma")
+
+    #     data = self.run_sql(
+    #         "SELECT name FROM {{ table }} LIMIT 10",
+    #         "4",
+    #         template_params=json.dumps({"table": "birth_names"}),
+    #     )
+    #     assert data["errors"][0]["error_type"] == "GENERIC_BACKEND_ERROR"
 
     @mock.patch("superset.sql_lab.get_query")
     @mock.patch("superset.sql_lab.execute_sql_statement")


[superset] 13/18: fix: handle temporal columns in presto partitions (#24054)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 5f21e7385f37f0886eecef75c853ee57c735205f
Author: Rob Moore <gi...@users.noreply.github.com>
AuthorDate: Fri May 19 21:29:42 2023 +0100

    fix: handle temporal columns in presto partitions (#24054)
---
 superset/db_engine_specs/base.py                |  2 +-
 superset/db_engine_specs/hive.py                |  2 +-
 superset/db_engine_specs/presto.py              | 18 ++++++-----
 tests/unit_tests/db_engine_specs/test_presto.py | 43 ++++++++++++++++++++++++-
 4 files changed, 54 insertions(+), 11 deletions(-)

diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py
index 27dd34a802..b789bbe70c 100644
--- a/superset/db_engine_specs/base.py
+++ b/superset/db_engine_specs/base.py
@@ -1168,7 +1168,7 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         schema: Optional[str],
         database: Database,
         query: Select,
-        columns: Optional[List[Dict[str, str]]] = None,
+        columns: Optional[List[Dict[str, Any]]] = None,
     ) -> Optional[Select]:
         """
         Add a where clause to a query to reference only the most recent partition
diff --git a/superset/db_engine_specs/hive.py b/superset/db_engine_specs/hive.py
index f07d53518c..44dc435c2c 100644
--- a/superset/db_engine_specs/hive.py
+++ b/superset/db_engine_specs/hive.py
@@ -404,7 +404,7 @@ class HiveEngineSpec(PrestoEngineSpec):
         schema: Optional[str],
         database: "Database",
         query: Select,
-        columns: Optional[List[Dict[str, str]]] = None,
+        columns: Optional[List[Dict[str, Any]]] = None,
     ) -> Optional[Select]:
         try:
             col_names, values = cls.latest_partition(
diff --git a/superset/db_engine_specs/presto.py b/superset/db_engine_specs/presto.py
index 6bd556b79e..87f362acc8 100644
--- a/superset/db_engine_specs/presto.py
+++ b/superset/db_engine_specs/presto.py
@@ -462,7 +462,7 @@ class PrestoBaseEngineSpec(BaseEngineSpec, metaclass=ABCMeta):
         schema: Optional[str],
         database: Database,
         query: Select,
-        columns: Optional[List[Dict[str, str]]] = None,
+        columns: Optional[List[Dict[str, Any]]] = None,
     ) -> Optional[Select]:
         try:
             col_names, values = cls.latest_partition(
@@ -480,13 +480,15 @@ class PrestoBaseEngineSpec(BaseEngineSpec, metaclass=ABCMeta):
         }
 
         for col_name, value in zip(col_names, values):
-            if col_name in column_type_by_name:
-                if column_type_by_name.get(col_name) == "TIMESTAMP":
-                    query = query.where(Column(col_name, TimeStamp()) == value)
-                elif column_type_by_name.get(col_name) == "DATE":
-                    query = query.where(Column(col_name, Date()) == value)
-                else:
-                    query = query.where(Column(col_name) == value)
+            col_type = column_type_by_name.get(col_name)
+
+            if isinstance(col_type, types.DATE):
+                col_type = Date()
+            elif isinstance(col_type, types.TIMESTAMP):
+                col_type = TimeStamp()
+
+            query = query.where(Column(col_name, col_type) == value)
+
         return query
 
     @classmethod
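
The rewrite replaces string matching on type names with isinstance checks, so subclasses of the generic SQLAlchemy DATE/TIMESTAMP types are caught as well. A rough standalone sketch of the dispatch, where date_type and timestamp_type stand in for whatever literal-rendering types the engine spec supplies (an assumption here):

    from sqlalchemy import types

    def coerce_partition_type(col_type, date_type, timestamp_type):
        # Map generic DATE/TIMESTAMP column types onto literal-rendering types
        # so compiled WHERE clauses emit DATE '...' / TIMESTAMP '...' for Presto.
        if isinstance(col_type, types.DATE):
            return date_type()
        if isinstance(col_type, types.TIMESTAMP):
            return timestamp_type()
        return col_type  # strings, ints, etc. pass through unchanged
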
diff --git a/tests/unit_tests/db_engine_specs/test_presto.py b/tests/unit_tests/db_engine_specs/test_presto.py
index a30fab94c9..8f55b1c048 100644
--- a/tests/unit_tests/db_engine_specs/test_presto.py
+++ b/tests/unit_tests/db_engine_specs/test_presto.py
@@ -16,10 +16,13 @@
 # under the License.
 from datetime import datetime
 from typing import Any, Dict, Optional, Type
+from unittest import mock
 
 import pytest
 import pytz
-from sqlalchemy import types
+from pyhive.sqlalchemy_presto import PrestoDialect
+from sqlalchemy import sql, text, types
+from sqlalchemy.engine.url import make_url
 
 from superset.utils.core import GenericDataType
 from tests.unit_tests.db_engine_specs.utils import (
@@ -82,3 +85,41 @@ def test_get_column_spec(
     from superset.db_engine_specs.presto import PrestoEngineSpec as spec
 
     assert_column_spec(spec, native_type, sqla_type, attrs, generic_type, is_dttm)
+
+
+@mock.patch("superset.db_engine_specs.presto.PrestoEngineSpec.latest_partition")
+@pytest.mark.parametrize(
+    ["column_type", "column_value", "expected_value"],
+    [
+        (types.DATE(), "2023-05-01", "DATE '2023-05-01'"),
+        (types.TIMESTAMP(), "2023-05-01", "TIMESTAMP '2023-05-01'"),
+        (types.VARCHAR(), "2023-05-01", "'2023-05-01'"),
+        (types.INT(), 1234, "1234"),
+    ],
+)
+def test_where_latest_partition(
+    mock_latest_partition: Any,
+    column_type: Any,
+    column_value: str,
+    expected_value: str,
+) -> None:
+    """
+    Test the ``where_latest_partition`` method
+    """
+    from superset.db_engine_specs.presto import PrestoEngineSpec as spec
+
+    mock_latest_partition.return_value = (["partition_key"], [column_value])
+
+    query = sql.select(text("* FROM table"))
+    columns = [{"name": "partition_key", "type": column_type}]
+
+    expected = f"""SELECT * FROM table \nWHERE "partition_key" = {expected_value}"""
+    result = spec.where_latest_partition(
+        "table", mock.MagicMock(), mock.MagicMock(), query, columns
+    )
+    assert result is not None
+    actual = result.compile(
+        dialect=PrestoDialect(), compile_kwargs={"literal_binds": True}
+    )
+
+    assert str(actual) == expected


[superset] 14/18: merge in fix with migration (#24314)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 7247b9bb07da6b3d534465d72375198cd957ecb6
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Wed Jun 7 13:41:32 2023 -0700

    merge in fix with migration (#24314)
    
    Co-authored-by: Ville Brofeldt <33...@users.noreply.github.com>
    Co-authored-by: Ville Brofeldt <vi...@apple.com>
---
 superset/dashboards/permalink/commands/base.py     |  3 +-
 superset/dashboards/permalink/commands/create.py   |  1 +
 superset/dashboards/permalink/commands/get.py      |  6 +-
 superset/explore/permalink/commands/base.py        |  3 +-
 superset/explore/permalink/commands/create.py      |  3 +-
 superset/explore/permalink/commands/get.py         |  1 +
 superset/extensions/metastore_cache.py             | 11 ++-
 superset/key_value/commands/create.py              | 23 ++++--
 superset/key_value/commands/get.py                 | 15 +++-
 superset/key_value/commands/update.py              | 11 ++-
 superset/key_value/commands/upsert.py              | 13 +--
 superset/key_value/shared_entries.py               | 12 ++-
 superset/key_value/types.py                        | 33 +++++++-
 ...2a5681ddfd_convert_key_value_entries_to_json.py | 96 ++++++++++++++++++++++
 superset/temporary_cache/api.py                    | 13 ++-
 superset/temporary_cache/commands/parameters.py    |  3 +
 .../explore/permalink/api_tests.py                 |  5 +-
 .../key_value/commands/create_test.py              | 55 +++++++++++--
 .../key_value/commands/delete_test.py              | 13 +--
 .../key_value/commands/fixtures.py                 | 15 +++-
 .../key_value/commands/get_test.py                 | 25 +++---
 .../key_value/commands/update_test.py              | 11 ++-
 .../key_value/commands/upsert_test.py              | 11 ++-
 23 files changed, 311 insertions(+), 71 deletions(-)

diff --git a/superset/dashboards/permalink/commands/base.py b/superset/dashboards/permalink/commands/base.py
index f4dc4f0726..82e24264ca 100644
--- a/superset/dashboards/permalink/commands/base.py
+++ b/superset/dashboards/permalink/commands/base.py
@@ -18,11 +18,12 @@ from abc import ABC
 
 from superset.commands.base import BaseCommand
 from superset.key_value.shared_entries import get_permalink_salt
-from superset.key_value.types import KeyValueResource, SharedKey
+from superset.key_value.types import JsonKeyValueCodec, KeyValueResource, SharedKey
 
 
 class BaseDashboardPermalinkCommand(BaseCommand, ABC):
     resource = KeyValueResource.DASHBOARD_PERMALINK
+    codec = JsonKeyValueCodec()
 
     @property
     def salt(self) -> str:
diff --git a/superset/dashboards/permalink/commands/create.py b/superset/dashboards/permalink/commands/create.py
index 51dac2d5de..2b6151fbb2 100644
--- a/superset/dashboards/permalink/commands/create.py
+++ b/superset/dashboards/permalink/commands/create.py
@@ -58,6 +58,7 @@ class CreateDashboardPermalinkCommand(BaseDashboardPermalinkCommand):
                 resource=self.resource,
                 key=get_deterministic_uuid(self.salt, (user_id, value)),
                 value=value,
+                codec=self.codec,
             ).run()
             assert key.id  # for type checks
             return encode_permalink_key(key=key.id, salt=self.salt)
diff --git a/superset/dashboards/permalink/commands/get.py b/superset/dashboards/permalink/commands/get.py
index f89f9444e7..4206263a37 100644
--- a/superset/dashboards/permalink/commands/get.py
+++ b/superset/dashboards/permalink/commands/get.py
@@ -39,7 +39,11 @@ class GetDashboardPermalinkCommand(BaseDashboardPermalinkCommand):
         self.validate()
         try:
             key = decode_permalink_id(self.key, salt=self.salt)
-            command = GetKeyValueCommand(resource=self.resource, key=key)
+            command = GetKeyValueCommand(
+                resource=self.resource,
+                key=key,
+                codec=self.codec,
+            )
             value: Optional[DashboardPermalinkValue] = command.run()
             if value:
                 DashboardDAO.get_by_id_or_slug(value["dashboardId"])
diff --git a/superset/explore/permalink/commands/base.py b/superset/explore/permalink/commands/base.py
index bef9546e21..a87183b7e9 100644
--- a/superset/explore/permalink/commands/base.py
+++ b/superset/explore/permalink/commands/base.py
@@ -18,11 +18,12 @@ from abc import ABC
 
 from superset.commands.base import BaseCommand
 from superset.key_value.shared_entries import get_permalink_salt
-from superset.key_value.types import KeyValueResource, SharedKey
+from superset.key_value.types import JsonKeyValueCodec, KeyValueResource, SharedKey
 
 
 class BaseExplorePermalinkCommand(BaseCommand, ABC):
     resource: KeyValueResource = KeyValueResource.EXPLORE_PERMALINK
+    codec = JsonKeyValueCodec()
 
     @property
     def salt(self) -> str:
diff --git a/superset/explore/permalink/commands/create.py b/superset/explore/permalink/commands/create.py
index 77ce04c4e4..21c0f4e42f 100644
--- a/superset/explore/permalink/commands/create.py
+++ b/superset/explore/permalink/commands/create.py
@@ -45,13 +45,14 @@ class CreateExplorePermalinkCommand(BaseExplorePermalinkCommand):
             value = {
                 "chartId": self.chart_id,
                 "datasourceId": datasource_id,
-                "datasourceType": datasource_type,
+                "datasourceType": datasource_type.value,
                 "datasource": self.datasource,
                 "state": self.state,
             }
             command = CreateKeyValueCommand(
                 resource=self.resource,
                 value=value,
+                codec=self.codec,
             )
             key = command.run()
             if key.id is None:
diff --git a/superset/explore/permalink/commands/get.py b/superset/explore/permalink/commands/get.py
index 3376cab080..4823117ece 100644
--- a/superset/explore/permalink/commands/get.py
+++ b/superset/explore/permalink/commands/get.py
@@ -43,6 +43,7 @@ class GetExplorePermalinkCommand(BaseExplorePermalinkCommand):
             value: Optional[ExplorePermalinkValue] = GetKeyValueCommand(
                 resource=self.resource,
                 key=key,
+                codec=self.codec,
             ).run()
             if value:
                 chart_id: Optional[int] = value.get("chartId")
diff --git a/superset/extensions/metastore_cache.py b/superset/extensions/metastore_cache.py
index 1e5cff7ee3..f69276c908 100644
--- a/superset/extensions/metastore_cache.py
+++ b/superset/extensions/metastore_cache.py
@@ -23,10 +23,11 @@ from flask import Flask
 from flask_caching import BaseCache
 
 from superset.key_value.exceptions import KeyValueCreateFailedError
-from superset.key_value.types import KeyValueResource
+from superset.key_value.types import KeyValueResource, PickleKeyValueCodec
 from superset.key_value.utils import get_uuid_namespace
 
 RESOURCE = KeyValueResource.METASTORE_CACHE
+CODEC = PickleKeyValueCodec()
 
 
 class SupersetMetastoreCache(BaseCache):
@@ -68,6 +69,7 @@ class SupersetMetastoreCache(BaseCache):
             resource=RESOURCE,
             key=self.get_key(key),
             value=value,
+            codec=CODEC,
             expires_on=self._get_expiry(timeout),
         ).run()
         return True
@@ -80,6 +82,7 @@ class SupersetMetastoreCache(BaseCache):
             CreateKeyValueCommand(
                 resource=RESOURCE,
                 value=value,
+                codec=CODEC,
                 key=self.get_key(key),
                 expires_on=self._get_expiry(timeout),
             ).run()
@@ -92,7 +95,11 @@ class SupersetMetastoreCache(BaseCache):
         # pylint: disable=import-outside-toplevel
         from superset.key_value.commands.get import GetKeyValueCommand
 
-        return GetKeyValueCommand(resource=RESOURCE, key=self.get_key(key)).run()
+        return GetKeyValueCommand(
+            resource=RESOURCE,
+            key=self.get_key(key),
+            codec=CODEC,
+        ).run()
 
     def has(self, key: str) -> bool:
         entry = self.get(key)
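
Because the cache remains a Flask-Caching backend, callers are unchanged; only the storage format is now routed through the codec. A hypothetical call site:

    # Values are pickled via PickleKeyValueCodec on write and unpickled on read.
    cache.set("report-state-123", {"status": "done"}, timeout=300)
    assert cache.get("report-state-123") == {"status": "done"}
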
diff --git a/superset/key_value/commands/create.py b/superset/key_value/commands/create.py
index 93e99c223b..d66d99d6e9 100644
--- a/superset/key_value/commands/create.py
+++ b/superset/key_value/commands/create.py
@@ -15,7 +15,6 @@
 # specific language governing permissions and limitations
 # under the License.
 import logging
-import pickle
 from datetime import datetime
 from typing import Any, Optional, Union
 from uuid import UUID
@@ -26,7 +25,7 @@ from superset import db
 from superset.commands.base import BaseCommand
 from superset.key_value.exceptions import KeyValueCreateFailedError
 from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import Key, KeyValueResource
+from superset.key_value.types import Key, KeyValueCodec, KeyValueResource
 from superset.utils.core import get_user_id
 
 logger = logging.getLogger(__name__)
@@ -35,13 +34,15 @@ logger = logging.getLogger(__name__)
 class CreateKeyValueCommand(BaseCommand):
     resource: KeyValueResource
     value: Any
+    codec: KeyValueCodec
     key: Optional[Union[int, UUID]]
     expires_on: Optional[datetime]
 
-    def __init__(
+    def __init__(  # pylint: disable=too-many-arguments
         self,
         resource: KeyValueResource,
         value: Any,
+        codec: KeyValueCodec,
         key: Optional[Union[int, UUID]] = None,
         expires_on: Optional[datetime] = None,
     ):
@@ -50,16 +51,24 @@ class CreateKeyValueCommand(BaseCommand):
 
         :param resource: the resource (dashboard, chart etc)
         :param value: the value to persist in the key-value store
+        :param codec: codec used to encode the value
         :param key: id of entry (autogenerated if undefined)
         :param expires_on: entry expiration time
-        :return: the key associated with the persisted value
         """
         self.resource = resource
         self.value = value
+        self.codec = codec
         self.key = key
         self.expires_on = expires_on
 
     def run(self) -> Key:
+        """
+        Persist the value
+
+        :return: the key associated with the persisted value
+
+        """
         try:
             return self.create()
         except SQLAlchemyError as ex:
@@ -70,9 +79,13 @@ class CreateKeyValueCommand(BaseCommand):
         pass
 
     def create(self) -> Key:
+        try:
+            value = self.codec.encode(self.value)
+        except Exception as ex:  # pylint: disable=broad-except
+            raise KeyValueCreateFailedError("Unable to encode value") from ex
         entry = KeyValueEntry(
             resource=self.resource.value,
-            value=pickle.dumps(self.value),
+            value=value,
             created_on=datetime.now(),
             created_by_fk=get_user_id(),
             expires_on=self.expires_on,
diff --git a/superset/key_value/commands/get.py b/superset/key_value/commands/get.py
index 44c02331cc..9d659f3bc7 100644
--- a/superset/key_value/commands/get.py
+++ b/superset/key_value/commands/get.py
@@ -16,7 +16,6 @@
 # under the License.
 
 import logging
-import pickle
 from datetime import datetime
 from typing import Any, Optional, Union
 from uuid import UUID
@@ -27,7 +26,7 @@ from superset import db
 from superset.commands.base import BaseCommand
 from superset.key_value.exceptions import KeyValueGetFailedError
 from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import KeyValueResource
+from superset.key_value.types import KeyValueCodec, KeyValueResource
 from superset.key_value.utils import get_filter
 
 logger = logging.getLogger(__name__)
@@ -36,17 +35,25 @@ logger = logging.getLogger(__name__)
 class GetKeyValueCommand(BaseCommand):
     resource: KeyValueResource
     key: Union[int, UUID]
+    codec: KeyValueCodec
 
-    def __init__(self, resource: KeyValueResource, key: Union[int, UUID]):
+    def __init__(
+        self,
+        resource: KeyValueResource,
+        key: Union[int, UUID],
+        codec: KeyValueCodec,
+    ):
         """
         Retrieve a key value entry
 
         :param resource: the resource (dashboard, chart etc)
         :param key: the key to retrieve
+        :param codec: codec used to decode the value
         :return: the value associated with the key if present
         """
         self.resource = resource
         self.key = key
+        self.codec = codec
 
     def run(self) -> Any:
         try:
@@ -66,5 +73,5 @@ class GetKeyValueCommand(BaseCommand):
             .first()
         )
         if entry and (entry.expires_on is None or entry.expires_on > datetime.now()):
-            return pickle.loads(entry.value)
+            return self.codec.decode(entry.value)
         return None
diff --git a/superset/key_value/commands/update.py b/superset/key_value/commands/update.py
index b69ca5e70d..becd6d9ca8 100644
--- a/superset/key_value/commands/update.py
+++ b/superset/key_value/commands/update.py
@@ -16,7 +16,6 @@
 # under the License.
 
 import logging
-import pickle
 from datetime import datetime
 from typing import Any, Optional, Union
 from uuid import UUID
@@ -27,7 +26,7 @@ from superset import db
 from superset.commands.base import BaseCommand
 from superset.key_value.exceptions import KeyValueUpdateFailedError
 from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import Key, KeyValueResource
+from superset.key_value.types import Key, KeyValueCodec, KeyValueResource
 from superset.key_value.utils import get_filter
 from superset.utils.core import get_user_id
 
@@ -37,14 +36,16 @@ logger = logging.getLogger(__name__)
 class UpdateKeyValueCommand(BaseCommand):
     resource: KeyValueResource
     value: Any
+    codec: KeyValueCodec
     key: Union[int, UUID]
     expires_on: Optional[datetime]
 
-    def __init__(
+    def __init__(  # pylint: disable=too-many-arguments
         self,
         resource: KeyValueResource,
         key: Union[int, UUID],
         value: Any,
+        codec: KeyValueCodec,
         expires_on: Optional[datetime] = None,
     ):
         """
@@ -53,12 +54,14 @@ class UpdateKeyValueCommand(BaseCommand):
         :param resource: the resource (dashboard, chart etc)
         :param key: the key to update
         :param value: the value to persist in the key-value store
+        :param codec: codec used to encode the value
         :param expires_on: entry expiration time
         :return: the key associated with the updated value
         """
         self.resource = resource
         self.key = key
         self.value = value
+        self.codec = codec
         self.expires_on = expires_on
 
     def run(self) -> Optional[Key]:
@@ -80,7 +83,7 @@ class UpdateKeyValueCommand(BaseCommand):
             .first()
         )
         if entry:
-            entry.value = pickle.dumps(self.value)
+            entry.value = self.codec.encode(self.value)
             entry.expires_on = self.expires_on
             entry.changed_on = datetime.now()
             entry.changed_by_fk = get_user_id()
diff --git a/superset/key_value/commands/upsert.py b/superset/key_value/commands/upsert.py
index 06b33c90fc..c5668f1161 100644
--- a/superset/key_value/commands/upsert.py
+++ b/superset/key_value/commands/upsert.py
@@ -16,7 +16,6 @@
 # under the License.
 
 import logging
-import pickle
 from datetime import datetime
 from typing import Any, Optional, Union
 from uuid import UUID
@@ -31,7 +30,7 @@ from superset.key_value.exceptions import (
     KeyValueUpsertFailedError,
 )
 from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import Key, KeyValueResource
+from superset.key_value.types import Key, KeyValueCodec, KeyValueResource
 from superset.key_value.utils import get_filter
 from superset.utils.core import get_user_id
 
@@ -42,13 +41,15 @@ class UpsertKeyValueCommand(BaseCommand):
     resource: KeyValueResource
     value: Any
     key: Union[int, UUID]
+    codec: KeyValueCodec
     expires_on: Optional[datetime]
 
-    def __init__(
+    def __init__(  # pylint: disable=too-many-arguments
         self,
         resource: KeyValueResource,
         key: Union[int, UUID],
         value: Any,
+        codec: KeyValueCodec,
         expires_on: Optional[datetime] = None,
     ):
         """
@@ -57,13 +58,14 @@ class UpsertKeyValueCommand(BaseCommand):
         :param resource: the resource (dashboard, chart etc)
         :param key: the key to update
         :param value: the value to persist in the key-value store
-        :param key_type: the type of the key to update
+        :param codec: codec used to encode the value
         :param expires_on: entry expiration time
         :return: the key associated with the updated value
         """
         self.resource = resource
         self.key = key
         self.value = value
+        self.codec = codec
         self.expires_on = expires_on
 
     def run(self) -> Key:
@@ -85,7 +87,7 @@ class UpsertKeyValueCommand(BaseCommand):
             .first()
         )
         if entry:
-            entry.value = pickle.dumps(self.value)
+            entry.value = self.codec.encode(self.value)
             entry.expires_on = self.expires_on
             entry.changed_on = datetime.now()
             entry.changed_by_fk = get_user_id()
@@ -96,6 +98,7 @@ class UpsertKeyValueCommand(BaseCommand):
         return CreateKeyValueCommand(
             resource=self.resource,
             value=self.value,
+            codec=self.codec,
             key=self.key,
             expires_on=self.expires_on,
         ).run()
diff --git a/superset/key_value/shared_entries.py b/superset/key_value/shared_entries.py
index 5f4ded9498..7895b75907 100644
--- a/superset/key_value/shared_entries.py
+++ b/superset/key_value/shared_entries.py
@@ -18,11 +18,12 @@
 from typing import Any, Optional
 from uuid import uuid3
 
-from superset.key_value.types import KeyValueResource, SharedKey
+from superset.key_value.types import JsonKeyValueCodec, KeyValueResource, SharedKey
 from superset.key_value.utils import get_uuid_namespace, random_key
 
 RESOURCE = KeyValueResource.APP
 NAMESPACE = get_uuid_namespace("")
+CODEC = JsonKeyValueCodec()
 
 
 def get_shared_value(key: SharedKey) -> Optional[Any]:
@@ -30,7 +31,7 @@ def get_shared_value(key: SharedKey) -> Optional[Any]:
     from superset.key_value.commands.get import GetKeyValueCommand
 
     uuid_key = uuid3(NAMESPACE, key)
-    return GetKeyValueCommand(RESOURCE, key=uuid_key).run()
+    return GetKeyValueCommand(RESOURCE, key=uuid_key, codec=CODEC).run()
 
 
 def set_shared_value(key: SharedKey, value: Any) -> None:
@@ -38,7 +39,12 @@ def set_shared_value(key: SharedKey, value: Any) -> None:
     from superset.key_value.commands.create import CreateKeyValueCommand
 
     uuid_key = uuid3(NAMESPACE, key)
-    CreateKeyValueCommand(resource=RESOURCE, value=value, key=uuid_key).run()
+    CreateKeyValueCommand(
+        resource=RESOURCE,
+        value=value,
+        key=uuid_key,
+        codec=CODEC,
+    ).run()
 
 
 def get_permalink_salt(key: SharedKey) -> str:
diff --git a/superset/key_value/types.py b/superset/key_value/types.py
index c3064fbef4..07d06414f6 100644
--- a/superset/key_value/types.py
+++ b/superset/key_value/types.py
@@ -14,9 +14,14 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from __future__ import annotations
+
+import json
+import pickle
+from abc import ABC, abstractmethod
 from dataclasses import dataclass
 from enum import Enum
-from typing import Optional, TypedDict
+from typing import Any, Optional, TypedDict
 from uuid import UUID
 
 
@@ -42,3 +47,29 @@ class KeyValueResource(str, Enum):
 class SharedKey(str, Enum):
     DASHBOARD_PERMALINK_SALT = "dashboard_permalink_salt"
     EXPLORE_PERMALINK_SALT = "explore_permalink_salt"
+
+
+class KeyValueCodec(ABC):
+    @abstractmethod
+    def encode(self, value: Any) -> bytes:
+        ...
+
+    @abstractmethod
+    def decode(self, value: bytes) -> Any:
+        ...
+
+
+class JsonKeyValueCodec(KeyValueCodec):
+    def encode(self, value: dict[Any, Any]) -> bytes:
+        return bytes(json.dumps(value), encoding="utf-8")
+
+    def decode(self, value: bytes) -> dict[Any, Any]:
+        return json.loads(value)
+
+
+class PickleKeyValueCodec(KeyValueCodec):
+    def encode(self, value: dict[Any, Any]) -> bytes:
+        return pickle.dumps(value)
+
+    def decode(self, value: bytes) -> dict[Any, Any]:
+        return pickle.loads(value)
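
Round-trip behaviour of the new codecs, assuming the classes defined above (illustrative only):

    codec = JsonKeyValueCodec()
    raw = codec.encode({"chartId": 42})   # b'{"chartId": 42}'
    assert codec.decode(raw) == {"chartId": 42}

JSON keeps the stored bytes readable and safe to load, while PickleKeyValueCodec preserves the old behaviour for the metastore cache, where arbitrary Python values still need to round-trip.
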
diff --git a/superset/migrations/versions/2023-05-01_12-03_9c2a5681ddfd_convert_key_value_entries_to_json.py b/superset/migrations/versions/2023-05-01_12-03_9c2a5681ddfd_convert_key_value_entries_to_json.py
new file mode 100644
index 0000000000..6e55f3ddc9
--- /dev/null
+++ b/superset/migrations/versions/2023-05-01_12-03_9c2a5681ddfd_convert_key_value_entries_to_json.py
@@ -0,0 +1,96 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""convert key-value entries to json
+
+Revision ID: 9c2a5681ddfd
+Revises: f3c2d8ec8595
+Create Date: 2023-05-01 12:03:17.079862
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = "9c2a5681ddfd"
+down_revision = "f3c2d8ec8595"
+
+import io
+import json
+import pickle
+
+from alembic import op
+from sqlalchemy import Column, Integer, LargeBinary, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import Session
+
+from superset import db
+from superset.migrations.shared.utils import paginated_update
+
+Base = declarative_base()
+VALUE_MAX_SIZE = 2**24 - 1
+RESOURCES_TO_MIGRATE = ("app", "dashboard_permalink", "explore_permalink")
+
+
+class RestrictedUnpickler(pickle.Unpickler):
+    def find_class(self, module, name):
+        if not (module == "superset.utils.core" and name == "DatasourceType"):
+            raise pickle.UnpicklingError(f"Unpickling of {module}.{name} is forbidden")
+
+        return super().find_class(module, name)
+
+
+class KeyValueEntry(Base):
+    __tablename__ = "key_value"
+    id = Column(Integer, primary_key=True)
+    resource = Column(String(32), nullable=False)
+    value = Column(LargeBinary(length=VALUE_MAX_SIZE), nullable=False)
+
+
+def upgrade():
+    bind = op.get_bind()
+    session: Session = db.Session(bind=bind)
+    truncated_count = 0
+    for entry in paginated_update(
+        session.query(KeyValueEntry).filter(
+            KeyValueEntry.resource.in_(RESOURCES_TO_MIGRATE)
+        )
+    ):
+        try:
+            value = RestrictedUnpickler(io.BytesIO(entry.value)).load() or {}
+        except pickle.UnpicklingError as ex:
+            if str(ex) == "pickle data was truncated":
+                # make truncated values that were created prior to #20385 an empty
+                # dict so that downgrading will work properly.
+                truncated_count += 1
+                value = {}
+            else:
+                raise
+
+        entry.value = bytes(json.dumps(value), encoding="utf-8")
+
+    if truncated_count:
+        print(f"Replaced {truncated_count} corrupted values with an empty value")
+
+
+def downgrade():
+    bind = op.get_bind()
+    session: Session = db.Session(bind=bind)
+    for entry in paginated_update(
+        session.query(KeyValueEntry).filter(
+            KeyValueEntry.resource.in_(RESOURCES_TO_MIGRATE)
+        ),
+    ):
+        value = json.loads(entry.value) or {}
+        entry.value = pickle.dumps(value)
diff --git a/superset/temporary_cache/api.py b/superset/temporary_cache/api.py
index b6376c63c3..85db65c62c 100644
--- a/superset/temporary_cache/api.py
+++ b/superset/temporary_cache/api.py
@@ -24,6 +24,7 @@ from flask import request, Response
 from marshmallow import ValidationError
 
 from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod
+from superset.key_value.types import JsonKeyValueCodec
 from superset.temporary_cache.commands.exceptions import (
     TemporaryCacheAccessDeniedError,
     TemporaryCacheResourceNotFoundError,
@@ -37,6 +38,8 @@ from superset.views.base_api import BaseSupersetApi, requires_json
 
 logger = logging.getLogger(__name__)
 
+CODEC = JsonKeyValueCodec()
+
 
 class TemporaryCacheRestApi(BaseSupersetApi, ABC):
     add_model_schema = TemporaryCachePostSchema()
@@ -69,7 +72,12 @@ class TemporaryCacheRestApi(BaseSupersetApi, ABC):
         try:
             item = self.add_model_schema.load(request.json)
             tab_id = request.args.get("tab_id")
-            args = CommandParameters(resource_id=pk, value=item["value"], tab_id=tab_id)
+            args = CommandParameters(
+                resource_id=pk,
+                value=item["value"],
+                tab_id=tab_id,
+                codec=CODEC,
+            )
             key = self.get_create_command()(args).run()
             return self.response(201, key=key)
         except ValidationError as ex:
@@ -89,6 +97,7 @@ class TemporaryCacheRestApi(BaseSupersetApi, ABC):
                 key=key,
                 value=item["value"],
                 tab_id=tab_id,
+                codec=CODEC,
             )
             key = self.get_update_command()(args).run()
             return self.response(200, key=key)
@@ -101,7 +110,7 @@ class TemporaryCacheRestApi(BaseSupersetApi, ABC):
 
     def get(self, pk: int, key: str) -> Response:
         try:
-            args = CommandParameters(resource_id=pk, key=key)
+            args = CommandParameters(resource_id=pk, key=key, codec=CODEC)
             value = self.get_get_command()(args).run()
             if not value:
                 return self.response_404()
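
The API change threads an explicit codec (a module-level JsonKeyValueCodec
instance) into every temporary-cache command instead of letting the
key-value layer pickle values implicitly. The tests further down imply the
codec contract is encode(value) -> bytes and decode(bytes) -> value; a
rough sketch of a JSON codec under that assumed interface, not the actual
JsonKeyValueCodec implementation:

    import json
    from typing import Any

    class JsonCodecSketch:
        def encode(self, value: Any) -> bytes:
            # json.dumps raises TypeError for non-serializable input, so
            # unsafe payloads are rejected before they reach storage.
            return bytes(json.dumps(value), encoding="utf-8")

        def decode(self, value: bytes) -> Any:
            return json.loads(value)

    codec = JsonCodecSketch()
    assert codec.decode(codec.encode({"foo": "bar"})) == {"foo": "bar"}
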
diff --git a/superset/temporary_cache/commands/parameters.py b/superset/temporary_cache/commands/parameters.py
index 74b9c1c632..e4e5b9b06a 100644
--- a/superset/temporary_cache/commands/parameters.py
+++ b/superset/temporary_cache/commands/parameters.py
@@ -17,10 +17,13 @@
 from dataclasses import dataclass
 from typing import Optional
 
+from superset.key_value.types import KeyValueCodec
+
 
 @dataclass
 class CommandParameters:
     resource_id: int
+    codec: Optional[KeyValueCodec] = None
     tab_id: Optional[int] = None
     key: Optional[str] = None
     value: Optional[str] = None
diff --git a/tests/integration_tests/explore/permalink/api_tests.py b/tests/integration_tests/explore/permalink/api_tests.py
index 22a36f41e1..4c6a3c12dd 100644
--- a/tests/integration_tests/explore/permalink/api_tests.py
+++ b/tests/integration_tests/explore/permalink/api_tests.py
@@ -15,7 +15,6 @@
 # specific language governing permissions and limitations
 # under the License.
 import json
-import pickle
 from typing import Any, Dict, Iterator
 from uuid import uuid3
 
@@ -24,7 +23,7 @@ from sqlalchemy.orm import Session
 
 from superset import db
 from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import KeyValueResource
+from superset.key_value.types import JsonKeyValueCodec, KeyValueResource
 from superset.key_value.utils import decode_permalink_id, encode_permalink_key
 from superset.models.slice import Slice
 from superset.utils.core import DatasourceType
@@ -95,7 +94,7 @@ def test_get_missing_chart(
     chart_id = 1234
     entry = KeyValueEntry(
         resource=KeyValueResource.EXPLORE_PERMALINK,
-        value=pickle.dumps(
+        value=JsonKeyValueCodec().encode(
             {
                 "chartId": chart_id,
                 "datasourceId": chart.datasource.id,
diff --git a/tests/integration_tests/key_value/commands/create_test.py b/tests/integration_tests/key_value/commands/create_test.py
index 0e789026ba..a2ee3d13ae 100644
--- a/tests/integration_tests/key_value/commands/create_test.py
+++ b/tests/integration_tests/key_value/commands/create_test.py
@@ -16,20 +16,23 @@
 # under the License.
 from __future__ import annotations
 
+import json
 import pickle
-from uuid import UUID
 
+import pytest
 from flask.ctx import AppContext
 from flask_appbuilder.security.sqla.models import User
 
 from superset.extensions import db
+from superset.key_value.exceptions import KeyValueCreateFailedError
 from superset.utils.core import override_user
 from tests.integration_tests.key_value.commands.fixtures import (
     admin,
-    ID_KEY,
+    JSON_CODEC,
+    JSON_VALUE,
+    PICKLE_CODEC,
+    PICKLE_VALUE,
     RESOURCE,
-    UUID_KEY,
-    VALUE,
 )
 
 
@@ -38,11 +41,15 @@ def test_create_id_entry(app_context: AppContext, admin: User) -> None:
     from superset.key_value.models import KeyValueEntry
 
     with override_user(admin):
-        key = CreateKeyValueCommand(resource=RESOURCE, value=VALUE).run()
+        key = CreateKeyValueCommand(
+            resource=RESOURCE,
+            value=JSON_VALUE,
+            codec=JSON_CODEC,
+        ).run()
         entry = (
             db.session.query(KeyValueEntry).filter_by(id=key.id).autoflush(False).one()
         )
-        assert pickle.loads(entry.value) == VALUE
+        assert json.loads(entry.value) == JSON_VALUE
         assert entry.created_by_fk == admin.id
         db.session.delete(entry)
         db.session.commit()
@@ -53,11 +60,43 @@ def test_create_uuid_entry(app_context: AppContext, admin: User) -> None:
     from superset.key_value.models import KeyValueEntry
 
     with override_user(admin):
-        key = CreateKeyValueCommand(resource=RESOURCE, value=VALUE).run()
+        key = CreateKeyValueCommand(
+            resource=RESOURCE, value=JSON_VALUE, codec=JSON_CODEC
+        ).run()
     entry = (
         db.session.query(KeyValueEntry).filter_by(uuid=key.uuid).autoflush(False).one()
     )
-    assert pickle.loads(entry.value) == VALUE
+    assert json.loads(entry.value) == JSON_VALUE
     assert entry.created_by_fk == admin.id
     db.session.delete(entry)
     db.session.commit()
+
+
+def test_create_fail_json_entry(app_context: AppContext, admin: User) -> None:
+    from superset.key_value.commands.create import CreateKeyValueCommand
+
+    with pytest.raises(KeyValueCreateFailedError):
+        CreateKeyValueCommand(
+            resource=RESOURCE,
+            value=PICKLE_VALUE,
+            codec=JSON_CODEC,
+        ).run()
+
+
+def test_create_pickle_entry(app_context: AppContext, admin: User) -> None:
+    from superset.key_value.commands.create import CreateKeyValueCommand
+    from superset.key_value.models import KeyValueEntry
+
+    with override_user(admin):
+        key = CreateKeyValueCommand(
+            resource=RESOURCE,
+            value=PICKLE_VALUE,
+            codec=PICKLE_CODEC,
+        ).run()
+        entry = (
+            db.session.query(KeyValueEntry).filter_by(id=key.id).autoflush(False).one()
+        )
+        assert type(pickle.loads(entry.value)) == type(PICKLE_VALUE)
+        assert entry.created_by_fk == admin.id
+        db.session.delete(entry)
+        db.session.commit()
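
test_create_fail_json_entry expects KeyValueCreateFailedError when a
pickle-only value (PICKLE_VALUE is a bare object()) meets the JSON codec,
presumably because the standard library refuses to serialize it:

    import json

    try:
        json.dumps(object())
    except TypeError as exc:
        print(exc)  # Object of type object is not JSON serializable
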
diff --git a/tests/integration_tests/key_value/commands/delete_test.py b/tests/integration_tests/key_value/commands/delete_test.py
index 62f9883370..3c4892faa6 100644
--- a/tests/integration_tests/key_value/commands/delete_test.py
+++ b/tests/integration_tests/key_value/commands/delete_test.py
@@ -16,7 +16,7 @@
 # under the License.
 from __future__ import annotations
 
-import pickle
+import json
 from typing import TYPE_CHECKING
 from uuid import UUID
 
@@ -25,7 +25,11 @@ from flask.ctx import AppContext
 from flask_appbuilder.security.sqla.models import User
 
 from superset.extensions import db
-from tests.integration_tests.key_value.commands.fixtures import admin, RESOURCE, VALUE
+from tests.integration_tests.key_value.commands.fixtures import (
+    admin,
+    JSON_VALUE,
+    RESOURCE,
+)
 
 if TYPE_CHECKING:
     from superset.key_value.models import KeyValueEntry
@@ -42,7 +46,7 @@ def key_value_entry() -> KeyValueEntry:
         id=ID_KEY,
         uuid=UUID_KEY,
         resource=RESOURCE,
-        value=pickle.dumps(VALUE),
+        value=bytes(json.dumps(JSON_VALUE), encoding="utf-8"),
     )
     db.session.add(entry)
     db.session.commit()
@@ -55,7 +59,6 @@ def test_delete_id_entry(
     key_value_entry: KeyValueEntry,
 ) -> None:
     from superset.key_value.commands.delete import DeleteKeyValueCommand
-    from superset.key_value.models import KeyValueEntry
 
     assert DeleteKeyValueCommand(resource=RESOURCE, key=ID_KEY).run() is True
 
@@ -66,7 +69,6 @@ def test_delete_uuid_entry(
     key_value_entry: KeyValueEntry,
 ) -> None:
     from superset.key_value.commands.delete import DeleteKeyValueCommand
-    from superset.key_value.models import KeyValueEntry
 
     assert DeleteKeyValueCommand(resource=RESOURCE, key=UUID_KEY).run() is True
 
@@ -77,6 +79,5 @@ def test_delete_entry_missing(
     key_value_entry: KeyValueEntry,
 ) -> None:
     from superset.key_value.commands.delete import DeleteKeyValueCommand
-    from superset.key_value.models import KeyValueEntry
 
     assert DeleteKeyValueCommand(resource=RESOURCE, key=456).run() is False
diff --git a/tests/integration_tests/key_value/commands/fixtures.py b/tests/integration_tests/key_value/commands/fixtures.py
index 2fd4fde4e1..66aea8a4ed 100644
--- a/tests/integration_tests/key_value/commands/fixtures.py
+++ b/tests/integration_tests/key_value/commands/fixtures.py
@@ -17,7 +17,7 @@
 
 from __future__ import annotations
 
-import pickle
+import json
 from typing import Generator, TYPE_CHECKING
 from uuid import UUID
 
@@ -26,7 +26,11 @@ from flask_appbuilder.security.sqla.models import User
 from sqlalchemy.orm import Session
 
 from superset.extensions import db
-from superset.key_value.types import KeyValueResource
+from superset.key_value.types import (
+    JsonKeyValueCodec,
+    KeyValueResource,
+    PickleKeyValueCodec,
+)
 from tests.integration_tests.test_app import app
 
 if TYPE_CHECKING:
@@ -35,7 +39,10 @@ if TYPE_CHECKING:
 ID_KEY = 123
 UUID_KEY = UUID("3e7a2ab8-bcaf-49b0-a5df-dfb432f291cc")
 RESOURCE = KeyValueResource.APP
-VALUE = {"foo": "bar"}
+JSON_VALUE = {"foo": "bar"}
+PICKLE_VALUE = object()
+JSON_CODEC = JsonKeyValueCodec()
+PICKLE_CODEC = PickleKeyValueCodec()
 
 
 @pytest.fixture
@@ -46,7 +53,7 @@ def key_value_entry() -> Generator[KeyValueEntry, None, None]:
         id=ID_KEY,
         uuid=UUID_KEY,
         resource=RESOURCE,
-        value=pickle.dumps(VALUE),
+        value=bytes(json.dumps(JSON_VALUE), encoding="utf-8"),
     )
     db.session.add(entry)
     db.session.commit()
diff --git a/tests/integration_tests/key_value/commands/get_test.py b/tests/integration_tests/key_value/commands/get_test.py
index b1800a4c3b..28a6dd73d5 100644
--- a/tests/integration_tests/key_value/commands/get_test.py
+++ b/tests/integration_tests/key_value/commands/get_test.py
@@ -16,7 +16,7 @@
 # under the License.
 from __future__ import annotations
 
-import pickle
+import json
 import uuid
 from datetime import datetime, timedelta
 from typing import TYPE_CHECKING
@@ -26,10 +26,11 @@ from flask.ctx import AppContext
 from superset.extensions import db
 from tests.integration_tests.key_value.commands.fixtures import (
     ID_KEY,
+    JSON_CODEC,
+    JSON_VALUE,
     key_value_entry,
     RESOURCE,
     UUID_KEY,
-    VALUE,
 )
 
 if TYPE_CHECKING:
@@ -39,8 +40,8 @@ if TYPE_CHECKING:
 def test_get_id_entry(app_context: AppContext, key_value_entry: KeyValueEntry) -> None:
     from superset.key_value.commands.get import GetKeyValueCommand
 
-    value = GetKeyValueCommand(resource=RESOURCE, key=ID_KEY).run()
-    assert value == VALUE
+    value = GetKeyValueCommand(resource=RESOURCE, key=ID_KEY, codec=JSON_CODEC).run()
+    assert value == JSON_VALUE
 
 
 def test_get_uuid_entry(
@@ -48,8 +49,8 @@ def test_get_uuid_entry(
 ) -> None:
     from superset.key_value.commands.get import GetKeyValueCommand
 
-    value = GetKeyValueCommand(resource=RESOURCE, key=UUID_KEY).run()
-    assert value == VALUE
+    value = GetKeyValueCommand(resource=RESOURCE, key=UUID_KEY, codec=JSON_CODEC).run()
+    assert value == JSON_VALUE
 
 
 def test_get_id_entry_missing(
@@ -58,7 +59,7 @@ def test_get_id_entry_missing(
 ) -> None:
     from superset.key_value.commands.get import GetKeyValueCommand
 
-    value = GetKeyValueCommand(resource=RESOURCE, key=456).run()
+    value = GetKeyValueCommand(resource=RESOURCE, key=456, codec=JSON_CODEC).run()
     assert value is None
 
 
@@ -70,12 +71,12 @@ def test_get_expired_entry(app_context: AppContext) -> None:
         id=678,
         uuid=uuid.uuid4(),
         resource=RESOURCE,
-        value=pickle.dumps(VALUE),
+        value=bytes(json.dumps(JSON_VALUE), encoding="utf-8"),
         expires_on=datetime.now() - timedelta(days=1),
     )
     db.session.add(entry)
     db.session.commit()
-    value = GetKeyValueCommand(resource=RESOURCE, key=ID_KEY).run()
+    value = GetKeyValueCommand(resource=RESOURCE, key=ID_KEY, codec=JSON_CODEC).run()
     assert value is None
     db.session.delete(entry)
     db.session.commit()
@@ -90,12 +91,12 @@ def test_get_future_expiring_entry(app_context: AppContext) -> None:
         id=id_,
         uuid=uuid.uuid4(),
         resource=RESOURCE,
-        value=pickle.dumps(VALUE),
+        value=bytes(json.dumps(JSON_VALUE), encoding="utf-8"),
         expires_on=datetime.now() + timedelta(days=1),
     )
     db.session.add(entry)
     db.session.commit()
-    value = GetKeyValueCommand(resource=RESOURCE, key=id_).run()
-    assert value == VALUE
+    value = GetKeyValueCommand(resource=RESOURCE, key=id_, codec=JSON_CODEC).run()
+    assert value == JSON_VALUE
     db.session.delete(entry)
     db.session.commit()
diff --git a/tests/integration_tests/key_value/commands/update_test.py b/tests/integration_tests/key_value/commands/update_test.py
index 8eb03b4eda..2c0fc3e31d 100644
--- a/tests/integration_tests/key_value/commands/update_test.py
+++ b/tests/integration_tests/key_value/commands/update_test.py
@@ -16,9 +16,8 @@
 # under the License.
 from __future__ import annotations
 
-import pickle
+import json
 from typing import TYPE_CHECKING
-from uuid import UUID
 
 from flask.ctx import AppContext
 from flask_appbuilder.security.sqla.models import User
@@ -28,6 +27,7 @@ from superset.utils.core import override_user
 from tests.integration_tests.key_value.commands.fixtures import (
     admin,
     ID_KEY,
+    JSON_CODEC,
     key_value_entry,
     RESOURCE,
     UUID_KEY,
@@ -53,11 +53,12 @@ def test_update_id_entry(
             resource=RESOURCE,
             key=ID_KEY,
             value=NEW_VALUE,
+            codec=JSON_CODEC,
         ).run()
     assert key is not None
     assert key.id == ID_KEY
     entry = db.session.query(KeyValueEntry).filter_by(id=ID_KEY).autoflush(False).one()
-    assert pickle.loads(entry.value) == NEW_VALUE
+    assert json.loads(entry.value) == NEW_VALUE
     assert entry.changed_by_fk == admin.id
 
 
@@ -74,13 +75,14 @@ def test_update_uuid_entry(
             resource=RESOURCE,
             key=UUID_KEY,
             value=NEW_VALUE,
+            codec=JSON_CODEC,
         ).run()
     assert key is not None
     assert key.uuid == UUID_KEY
     entry = (
         db.session.query(KeyValueEntry).filter_by(uuid=UUID_KEY).autoflush(False).one()
     )
-    assert pickle.loads(entry.value) == NEW_VALUE
+    assert json.loads(entry.value) == NEW_VALUE
     assert entry.changed_by_fk == admin.id
 
 
@@ -92,5 +94,6 @@ def test_update_missing_entry(app_context: AppContext, admin: User) -> None:
             resource=RESOURCE,
             key=456,
             value=NEW_VALUE,
+            codec=JSON_CODEC,
         ).run()
     assert key is None
diff --git a/tests/integration_tests/key_value/commands/upsert_test.py b/tests/integration_tests/key_value/commands/upsert_test.py
index e5cd27e3a6..c26b66d02e 100644
--- a/tests/integration_tests/key_value/commands/upsert_test.py
+++ b/tests/integration_tests/key_value/commands/upsert_test.py
@@ -16,9 +16,8 @@
 # under the License.
 from __future__ import annotations
 
-import pickle
+import json
 from typing import TYPE_CHECKING
-from uuid import UUID
 
 from flask.ctx import AppContext
 from flask_appbuilder.security.sqla.models import User
@@ -28,6 +27,7 @@ from superset.utils.core import override_user
 from tests.integration_tests.key_value.commands.fixtures import (
     admin,
     ID_KEY,
+    JSON_CODEC,
     key_value_entry,
     RESOURCE,
     UUID_KEY,
@@ -53,13 +53,14 @@ def test_upsert_id_entry(
             resource=RESOURCE,
             key=ID_KEY,
             value=NEW_VALUE,
+            codec=JSON_CODEC,
         ).run()
     assert key is not None
     assert key.id == ID_KEY
     entry = (
         db.session.query(KeyValueEntry).filter_by(id=int(ID_KEY)).autoflush(False).one()
     )
-    assert pickle.loads(entry.value) == NEW_VALUE
+    assert json.loads(entry.value) == NEW_VALUE
     assert entry.changed_by_fk == admin.id
 
 
@@ -76,13 +77,14 @@ def test_upsert_uuid_entry(
             resource=RESOURCE,
             key=UUID_KEY,
             value=NEW_VALUE,
+            codec=JSON_CODEC,
         ).run()
     assert key is not None
     assert key.uuid == UUID_KEY
     entry = (
         db.session.query(KeyValueEntry).filter_by(uuid=UUID_KEY).autoflush(False).one()
     )
-    assert pickle.loads(entry.value) == NEW_VALUE
+    assert json.loads(entry.value) == NEW_VALUE
     assert entry.changed_by_fk == admin.id
 
 
@@ -95,6 +97,7 @@ def test_upsert_missing_entry(app_context: AppContext, admin: User) -> None:
             resource=RESOURCE,
             key=456,
             value=NEW_VALUE,
+            codec=JSON_CODEC,
         ).run()
     assert key is not None
     assert key.id == 456


[superset] 01/18: fix: allow db driver distinction on enforced URI params (#23769)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit b26901cb05d62637abd2aaa7144378516f4b7e0f
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Sun Apr 23 15:44:21 2023 +0100

    fix: allow db driver distinction on enforced URI params (#23769)
---
 superset/db_engine_specs/base.py               | 19 +++++++++------
 superset/db_engine_specs/drill.py              | 11 ++++++---
 superset/db_engine_specs/hive.py               | 10 ++++----
 superset/db_engine_specs/mysql.py              | 17 ++++++++++----
 superset/db_engine_specs/presto.py             |  9 +++++---
 superset/db_engine_specs/snowflake.py          | 10 ++++----
 superset/models/core.py                        | 15 +++++++-----
 tests/integration_tests/model_tests.py         | 12 +++++++++-
 tests/unit_tests/db_engine_specs/test_mysql.py | 32 +++++++++++++++++++++++++-
 9 files changed, 102 insertions(+), 33 deletions(-)

diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py
index af2699a6dd..27dd34a802 100644
--- a/superset/db_engine_specs/base.py
+++ b/superset/db_engine_specs/base.py
@@ -354,10 +354,11 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
     # This set will give the keywords for data limit statements
     # to consider for the engines with TOP SQL parsing
     top_keywords: Set[str] = {"TOP"}
-    # A set of disallowed connection query parameters
-    disallow_uri_query_params: Set[str] = set()
+    # A set of disallowed connection query parameters by driver name
+    disallow_uri_query_params: Dict[str, Set[str]] = {}
     # A Dict of query parameters that will always be used on every connection
-    enforce_uri_query_params: Dict[str, Any] = {}
+    # by driver name
+    enforce_uri_query_params: Dict[str, Dict[str, Any]] = {}
 
     force_column_alias_quotes = False
     arraysize = 0
@@ -999,6 +1000,7 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
     def adjust_database_uri(  # pylint: disable=unused-argument
         cls,
         uri: URL,
+        connect_args: Dict[str, Any],
         selected_schema: Optional[str] = None,
     ) -> Tuple[URL, Dict[str, Any]]:
         """
@@ -1024,7 +1026,10 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         This is important because DB engine specs can be installed from 3rd party
         packages.
         """
-        return uri, {**cls.enforce_uri_query_params}
+        return uri, {
+            **connect_args,
+            **cls.enforce_uri_query_params.get(uri.get_driver_name(), {}),
+        }
 
     @classmethod
     def patch(cls) -> None:
@@ -1744,9 +1749,9 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
 
         :param sqlalchemy_uri:
         """
-        if existing_disallowed := cls.disallow_uri_query_params.intersection(
-            sqlalchemy_uri.query
-        ):
+        if existing_disallowed := cls.disallow_uri_query_params.get(
+            sqlalchemy_uri.get_driver_name(), set()
+        ).intersection(sqlalchemy_uri.query):
             raise ValueError(f"Forbidden query parameter(s): {existing_disallowed}")
 
 
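Both class attributes are now dictionaries keyed by SQLAlchemy driver name,
and validate_database_uri resolves the driver-specific disallow set
(defaulting to empty) before intersecting it with the URI's query
parameters. A sketch of that check in isolation, using a made-up per-driver
mapping:

    from sqlalchemy.engine.url import make_url

    disallow = {"mysqlconnector": {"allow_local_infile"}}
    url = make_url("mysql+mysqlconnector://u:p@h/db?allow_local_infile=1")
    # url.query is a mapping, so set.intersection works over its keys.
    bad = disallow.get(url.get_driver_name(), set()).intersection(url.query)
    assert bad == {"allow_local_infile"}  # the spec raises ValueError here
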
diff --git a/superset/db_engine_specs/drill.py b/superset/db_engine_specs/drill.py
index 756f74e82a..d8a1940007 100644
--- a/superset/db_engine_specs/drill.py
+++ b/superset/db_engine_specs/drill.py
@@ -15,7 +15,7 @@
 # specific language governing permissions and limitations
 # under the License.
 from datetime import datetime
-from typing import Any, Dict, Optional
+from typing import Any, Dict, Optional, Tuple
 from urllib import parse
 
 from sqlalchemy import types
@@ -69,11 +69,16 @@ class DrillEngineSpec(BaseEngineSpec):
         return None
 
     @classmethod
-    def adjust_database_uri(cls, uri: URL, selected_schema: Optional[str]) -> URL:
+    def adjust_database_uri(
+        cls,
+        uri: URL,
+        connect_args: Dict[str, Any],
+        selected_schema: Optional[str] = None,
+    ) -> Tuple[URL, Dict[str, Any]]:
         if selected_schema:
             uri = uri.set(database=parse.quote(selected_schema, safe=""))
 
-        return uri
+        return uri, connect_args
 
     @classmethod
     def get_url_for_impersonation(
diff --git a/superset/db_engine_specs/hive.py b/superset/db_engine_specs/hive.py
index c049ee652e..f07d53518c 100644
--- a/superset/db_engine_specs/hive.py
+++ b/superset/db_engine_specs/hive.py
@@ -191,7 +191,6 @@ class HiveEngineSpec(PrestoEngineSpec):
             raise SupersetException("Append operation not currently supported")
 
         if to_sql_kwargs["if_exists"] == "fail":
-
             # Ensure table doesn't already exist.
             if table.schema:
                 table_exists = not database.get_df(
@@ -260,12 +259,15 @@ class HiveEngineSpec(PrestoEngineSpec):
 
     @classmethod
     def adjust_database_uri(
-        cls, uri: URL, selected_schema: Optional[str] = None
-    ) -> URL:
+        cls,
+        uri: URL,
+        connect_args: Dict[str, Any],
+        selected_schema: Optional[str] = None,
+    ) -> Tuple[URL, Dict[str, Any]]:
         if selected_schema:
             uri = uri.set(database=parse.quote(selected_schema, safe=""))
 
-        return uri
+        return uri, connect_args
 
     @classmethod
     def _extract_error_message(cls, ex: Exception) -> str:
diff --git a/superset/db_engine_specs/mysql.py b/superset/db_engine_specs/mysql.py
index 457509f7a7..622e6c985c 100644
--- a/superset/db_engine_specs/mysql.py
+++ b/superset/db_engine_specs/mysql.py
@@ -173,8 +173,14 @@ class MySQLEngineSpec(BaseEngineSpec, BasicParametersMixin):
             {},
         ),
     }
-    disallow_uri_query_params = {"local_infile"}
-    enforce_uri_query_params = {"local_infile": 0}
+    disallow_uri_query_params = {
+        "mysqldb": {"local_infile"},
+        "mysqlconnector": {"allow_local_infile"},
+    }
+    enforce_uri_query_params = {
+        "mysqldb": {"local_infile": 0},
+        "mysqlconnector": {"allow_local_infile": 0},
+    }
 
     @classmethod
     def convert_dttm(
@@ -191,11 +197,14 @@ class MySQLEngineSpec(BaseEngineSpec, BasicParametersMixin):
 
     @classmethod
     def adjust_database_uri(
-        cls, uri: URL, selected_schema: Optional[str] = None
+        cls,
+        uri: URL,
+        connect_args: Dict[str, Any],
+        selected_schema: Optional[str] = None,
     ) -> Tuple[URL, Dict[str, Any]]:
         uri, new_connect_args = super(
             MySQLEngineSpec, MySQLEngineSpec
-        ).adjust_database_uri(uri)
+        ).adjust_database_uri(uri, connect_args)
         if selected_schema:
             uri = uri.set(database=parse.quote(selected_schema, safe=""))
 
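The MySQL spec needs two entries because the two DBAPIs spell the unsafe
flag differently (local_infile for mysqldb, allow_local_infile for
mysql-connector). A bare mysql:// URI names no driver, so
URL.get_driver_name() falls back to the dialect's default DBAPI, which is
why the plain URIs in the tests below match the "mysqldb" entries:

    from sqlalchemy.engine.url import make_url

    # Explicit driver: taken straight from the URL string.
    assert make_url("mysql+mysqlconnector://u:p@h/db").get_driver_name() == "mysqlconnector"
    # No driver given: SQLAlchemy reports the mysql dialect's default DBAPI.
    assert make_url("mysql://u:p@h/db").get_driver_name() == "mysqldb"
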
diff --git a/superset/db_engine_specs/presto.py b/superset/db_engine_specs/presto.py
index 72931a85b4..6bd556b79e 100644
--- a/superset/db_engine_specs/presto.py
+++ b/superset/db_engine_specs/presto.py
@@ -300,8 +300,11 @@ class PrestoBaseEngineSpec(BaseEngineSpec, metaclass=ABCMeta):
 
     @classmethod
     def adjust_database_uri(
-        cls, uri: URL, selected_schema: Optional[str] = None
-    ) -> URL:
+        cls,
+        uri: URL,
+        connect_args: Dict[str, Any],
+        selected_schema: Optional[str] = None,
+    ) -> Tuple[URL, Dict[str, Any]]:
         database = uri.database
         if selected_schema and database:
             selected_schema = parse.quote(selected_schema, safe="")
@@ -311,7 +314,7 @@ class PrestoBaseEngineSpec(BaseEngineSpec, metaclass=ABCMeta):
                 database += "/" + selected_schema
             uri = uri.set(database=database)
 
-        return uri
+        return uri, connect_args
 
     @classmethod
     def estimate_statement_cost(cls, statement: str, cursor: Any) -> Dict[str, Any]:
diff --git a/superset/db_engine_specs/snowflake.py b/superset/db_engine_specs/snowflake.py
index 419e0a0655..35801fa768 100644
--- a/superset/db_engine_specs/snowflake.py
+++ b/superset/db_engine_specs/snowflake.py
@@ -134,8 +134,11 @@ class SnowflakeEngineSpec(PostgresBaseEngineSpec):
 
     @classmethod
     def adjust_database_uri(
-        cls, uri: URL, selected_schema: Optional[str] = None
-    ) -> URL:
+        cls,
+        uri: URL,
+        connect_args: Dict[str, Any],
+        selected_schema: Optional[str] = None,
+    ) -> Tuple[URL, Dict[str, Any]]:
         database = uri.database
         if "/" in uri.database:
             database = uri.database.split("/")[0]
@@ -143,7 +146,7 @@ class SnowflakeEngineSpec(PostgresBaseEngineSpec):
             selected_schema = parse.quote(selected_schema, safe="")
             uri = uri.set(database=f"{database}/{selected_schema}")
 
-        return uri
+        return uri, connect_args
 
     @classmethod
     def epoch_to_dttm(cls) -> str:
@@ -222,7 +225,6 @@ class SnowflakeEngineSpec(PostgresBaseEngineSpec):
             Dict[str, Any]
         ] = None,
     ) -> str:
-
         return str(
             URL(
                 "snowflake",
diff --git a/superset/models/core.py b/superset/models/core.py
index 9c67a2efa6..fce323b13c 100755
--- a/superset/models/core.py
+++ b/superset/models/core.py
@@ -426,7 +426,15 @@ class Database(
         )
         self.db_engine_spec.validate_database_uri(sqlalchemy_url)
 
-        sqlalchemy_url = self.db_engine_spec.adjust_database_uri(sqlalchemy_url, schema)
+        params = extra.get("engine_params", {})
+        if nullpool:
+            params["poolclass"] = NullPool
+
+        connect_args = params.get("connect_args", {})
+
+        sqlalchemy_url, connect_args = self.db_engine_spec.adjust_database_uri(
+            sqlalchemy_url, connect_args, schema
+        )
         effective_username = self.get_effective_user(sqlalchemy_url)
         # If using MySQL or Presto for example, will set url.username
         # If using Hive, will not do anything yet since that relies on a
@@ -438,11 +446,6 @@ class Database(
         masked_url = self.get_password_masked_url(sqlalchemy_url)
         logger.debug("Database._get_sqla_engine(). Masked URL: %s", str(masked_url))
 
-        params = extra.get("engine_params", {})
-        if nullpool:
-            params["poolclass"] = NullPool
-
-        connect_args = params.get("connect_args", {})
         if self.impersonate_user:
             self.db_engine_spec.update_impersonation_config(
                 connect_args, str(sqlalchemy_url), effective_username
diff --git a/tests/integration_tests/model_tests.py b/tests/integration_tests/model_tests.py
index 35dbcc0a6b..d5684b1b62 100644
--- a/tests/integration_tests/model_tests.py
+++ b/tests/integration_tests/model_tests.py
@@ -194,7 +194,7 @@ class TestDatabaseModel(SupersetTestCase):
     @mock.patch("superset.models.core.create_engine")
     def test_adjust_engine_params_mysql(self, mocked_create_engine):
         model = Database(
-            database_name="test_database",
+            database_name="test_database1",
             sqlalchemy_uri="mysql://user:password@localhost",
         )
         model._get_sqla_engine()
@@ -203,6 +203,16 @@ class TestDatabaseModel(SupersetTestCase):
         assert str(call_args[0][0]) == "mysql://user:password@localhost"
         assert call_args[1]["connect_args"]["local_infile"] == 0
 
+        model = Database(
+            database_name="test_database2",
+            sqlalchemy_uri="mysql+mysqlconnector://user:password@localhost",
+        )
+        model._get_sqla_engine()
+        call_args = mocked_create_engine.call_args
+
+        assert str(call_args[0][0]) == "mysql+mysqlconnector://user:password@localhost"
+        assert call_args[1]["connect_args"]["allow_local_infile"] == 0
+
     @mock.patch("superset.models.core.create_engine")
     def test_impersonate_user_trino(self, mocked_create_engine):
         principal_user = security_manager.find_user(username="gamma")
diff --git a/tests/unit_tests/db_engine_specs/test_mysql.py b/tests/unit_tests/db_engine_specs/test_mysql.py
index 3a24e1c2dc..a6f0d99e04 100644
--- a/tests/unit_tests/db_engine_specs/test_mysql.py
+++ b/tests/unit_tests/db_engine_specs/test_mysql.py
@@ -104,8 +104,11 @@ def test_convert_dttm(
     "sqlalchemy_uri,error",
     [
         ("mysql://user:password@host/db1?local_infile=1", True),
+        ("mysql+mysqlconnector://user:password@host/db1?allow_local_infile=1", True),
         ("mysql://user:password@host/db1?local_infile=0", True),
+        ("mysql+mysqlconnector://user:password@host/db1?allow_local_infile=0", True),
         ("mysql://user:password@host/db1", False),
+        ("mysql+mysqlconnector://user:password@host/db1", False),
     ],
 )
 def test_validate_database_uri(sqlalchemy_uri: str, error: bool) -> None:
@@ -123,18 +126,43 @@ def test_validate_database_uri(sqlalchemy_uri: str, error: bool) -> None:
     "sqlalchemy_uri,connect_args,returns",
     [
         ("mysql://user:password@host/db1", {"local_infile": 1}, {"local_infile": 0}),
+        (
+            "mysql+mysqlconnector://user:password@host/db1",
+            {"allow_local_infile": 1},
+            {"allow_local_infile": 0},
+        ),
         ("mysql://user:password@host/db1", {"local_infile": -1}, {"local_infile": 0}),
+        (
+            "mysql+mysqlconnector://user:password@host/db1",
+            {"allow_local_infile": -1},
+            {"allow_local_infile": 0},
+        ),
         ("mysql://user:password@host/db1", {"local_infile": 0}, {"local_infile": 0}),
+        (
+            "mysql+mysqlconnector://user:password@host/db1",
+            {"allow_local_infile": 0},
+            {"allow_local_infile": 0},
+        ),
         (
             "mysql://user:password@host/db1",
             {"param1": "some_value"},
             {"local_infile": 0, "param1": "some_value"},
         ),
+        (
+            "mysql+mysqlconnector://user:password@host/db1",
+            {"param1": "some_value"},
+            {"allow_local_infile": 0, "param1": "some_value"},
+        ),
         (
             "mysql://user:password@host/db1",
             {"local_infile": 1, "param1": "some_value"},
             {"local_infile": 0, "param1": "some_value"},
         ),
+        (
+            "mysql+mysqlconnector://user:password@host/db1",
+            {"allow_local_infile": 1, "param1": "some_value"},
+            {"allow_local_infile": 0, "param1": "some_value"},
+        ),
     ],
 )
 def test_adjust_database_uri(
@@ -143,7 +171,9 @@ def test_adjust_database_uri(
     from superset.db_engine_specs.mysql import MySQLEngineSpec
 
     url = make_url(sqlalchemy_uri)
-    returned_url, returned_connect_args = MySQLEngineSpec.adjust_database_uri(url)
+    returned_url, returned_connect_args = MySQLEngineSpec.adjust_database_uri(
+        url, connect_args
+    )
     assert returned_connect_args == returns
 
 


[superset] 18/18: update changelog

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to tag 2.1.1rc1
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 7b6907fe0f2a04d5511b1a936d8f8aa4c28d009b
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Thu Jun 8 16:36:58 2023 -0700

    update changelog
---
 CHANGELOG.md | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 71955dbc79..2d970a8828 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -39,6 +39,7 @@ under the License.
 
 **Fixes**
 - [#23723](https://github.com/apache/superset/pull/23723) add enforce URI query params with a specific for MySQL (@dpgaspar)
+- [#23600](https://github.com/apache/superset/pull/23600) fix: load examples as anon user (@betodealmeida)
 - [#24054](https://github.com/apache/superset/pull/24054) fix: handle temporal columns in presto partitions (@giftig)
 - [#23882](https://github.com/apache/superset/pull/23882) fix: handle comments in `has_table_query` (@betodealmeida)
 - [#24256](https://github.com/apache/superset/pull/24256) fix: enable strong session protection by default (@dpgaspar)
@@ -60,8 +61,6 @@ under the License.
 - [#24294](https://github.com/apache/superset/pull/24294) chore: update UPDATING for 2.1.0 (@eschutho)
 - [#24056](https://github.com/apache/superset/pull/24056) chore: Remove unnecessary information from response (@geido)
 
-
-
 ### 2.1.0 (Thu Mar 16 21:13:05 2023 -0700)
 **Database Migrations**
 - [#23139](https://github.com/apache/superset/pull/23139) fix: memoized decorator memory leak (@dpgaspar)