Posted to commits@superset.apache.org by mi...@apache.org on 2023/08/23 12:58:44 UTC

[superset] branch 3.0 updated (34bc86a484 -> 931e1b2139)

This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a change to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git


    from 34bc86a484 fix(mssql): avoid trying to return a resultset for DML queries with not resultset (#24999)
     new 80f1eaf6d7 chore: use os.getenv to simplify superset_config.py (#25016)
     new 6003aa2485 fix: extend user email size (#25053)
     new 8cb5142f87 fix: docker-compose non-dev (#25055)
     new 1af6df3190 fix: Native filter dashboard RBAC aware dataset permission (#25029)
     new b5f7f54c7f fix: Error when using the legacy dataset editor (#25057)
     new 931e1b2139 fix: dataset safe URL for explore_url (#24686)

The 6 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .pre-commit-config.yaml                            |  7 +--
 UPDATING.md                                        |  1 +
 docker-compose-non-dev.yml                         | 11 ++--
 docker/.env-non-dev                                | 10 +++-
 docker/pythonpath_dev/superset_config.py           | 47 ++++++----------
 requirements/base.txt                              |  2 +-
 setup.py                                           |  2 +-
 .../dashboard/components/nativeFilters/utils.ts    |  2 +
 .../src/pages/DatasetList/DatasetList.test.tsx     | 60 ++++++++++++++++++++-
 superset-frontend/src/pages/DatasetList/index.tsx  | 27 +++++++---
 superset/config.py                                 |  2 +-
 superset/connectors/sqla/views.py                  |  1 +
 superset/datasets/commands/exceptions.py           | 17 ------
 superset/datasets/commands/update.py               | 12 -----
 ...4aca4c8a2_increase_ab_user_email_field_size.py} | 32 +++++------
 superset/security/manager.py                       | 35 +++++++++---
 superset/utils/urls.py                             | 19 +------
 superset/views/base.py                             |  1 +
 superset/views/datasource/views.py                 | 17 +-----
 tests/integration_tests/datasets/api_tests.py      | 26 ---------
 tests/integration_tests/datasource_tests.py        | 26 ---------
 tests/integration_tests/security_tests.py          | 63 ++++++++++++++++++++--
 tests/unit_tests/utils/urls_tests.py               | 24 ---------
 23 files changed, 230 insertions(+), 214 deletions(-)
 copy superset/migrations/versions/{2018-12-13_15-38_cefabc8f7d38_increase_size_of_name_column_in_ab_view_.py => 2023-08-22_11-09_ec54aca4c8a2_increase_ab_user_email_field_size.py} (62%)


[superset] 03/06: fix: docker-compose non-dev (#25055)


michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 8cb5142f87b985483a0fdca9aee6f5cbce350c35
Author: Michael S. Molina <70...@users.noreply.github.com>
AuthorDate: Tue Aug 22 11:35:10 2023 -0300

    fix: docker-compose non-dev (#25055)
    
    (cherry picked from commit 7317d9c0b2f0782b161a19c9faf95fd4d8634619)
---
 docker-compose-non-dev.yml               | 11 ++++++++---
 docker/.env-non-dev                      | 10 ++++++++--
 docker/pythonpath_dev/superset_config.py | 24 ++++++++++++------------
 3 files changed, 28 insertions(+), 17 deletions(-)

diff --git a/docker-compose-non-dev.yml b/docker-compose-non-dev.yml
index 0ce96e00ba..785472e7c6 100644
--- a/docker-compose-non-dev.yml
+++ b/docker-compose-non-dev.yml
@@ -18,8 +18,8 @@ x-superset-image: &superset-image apachesuperset.docker.scarf.sh/apache/superset
 x-superset-depends-on: &superset-depends-on
   - db
   - redis
-x-superset-volumes: &superset-volumes
-  # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container
+x-superset-volumes:
+  &superset-volumes # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container
   - ./docker:/app/docker
   - superset_home:/app/superset_home
 
@@ -39,6 +39,7 @@ services:
     restart: unless-stopped
     volumes:
       - db_home:/var/lib/postgresql/data
+      - ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d
 
   superset:
     env_file: docker/.env-non-dev
@@ -73,7 +74,11 @@ services:
     user: "root"
     volumes: *superset-volumes
     healthcheck:
-      test: ["CMD-SHELL", "celery -A superset.tasks.celery_app:app inspect ping -d celery@$$HOSTNAME"]
+      test:
+        [
+          "CMD-SHELL",
+          "celery -A superset.tasks.celery_app:app inspect ping -d celery@$$HOSTNAME",
+        ]
 
   superset-worker-beat:
     image: *superset-image
diff --git a/docker/.env-non-dev b/docker/.env-non-dev
index 1d071af6a2..a86ddbd193 100644
--- a/docker/.env-non-dev
+++ b/docker/.env-non-dev
@@ -21,11 +21,17 @@ DATABASE_DB=superset
 DATABASE_HOST=db
 DATABASE_PASSWORD=superset
 DATABASE_USER=superset
+DATABASE_PORT=5432
+DATABASE_DIALECT=postgresql
+
+EXAMPLES_DB=examples
+EXAMPLES_HOST=db
+EXAMPLES_USER=examples
+EXAMPLES_PASSWORD=examples
+EXAMPLES_PORT=5432
 
 # database engine specific environment variables
 # change the below if you prefer another database engine
-DATABASE_PORT=5432
-DATABASE_DIALECT=postgresql
 POSTGRES_DB=superset
 POSTGRES_USER=superset
 POSTGRES_PASSWORD=superset
diff --git a/docker/pythonpath_dev/superset_config.py b/docker/pythonpath_dev/superset_config.py
index c2176e0388..794f7c910f 100644
--- a/docker/pythonpath_dev/superset_config.py
+++ b/docker/pythonpath_dev/superset_config.py
@@ -28,18 +28,18 @@ from celery.schedules import crontab
 
 logger = logging.getLogger()
 
-DATABASE_DIALECT = os.getenv("DATABASE_DIALECT", "postgresql")
-DATABASE_USER = os.getenv("DATABASE_USER", "superset")
-DATABASE_PASSWORD = os.getenv("DATABASE_PASSWORD", "superset")
-DATABASE_HOST = os.getenv("DATABASE_HOST", "db")
-DATABASE_PORT = os.getenv("DATABASE_PORT", "5432")
-DATABASE_DB = os.getenv("DATABASE_DB", "superset")
-
-EXAMPLES_USER = os.getenv("EXAMPLES_USER", "examples")
-EXAMPLES_PASSWORD = os.getenv("EXAMPLES_PASSWORD", "examples")
-EXAMPLES_HOST = os.getenv("EXAMPLES_HOST", "db")
-EXAMPLES_PORT = os.getenv("EXAMPLES_PORT", "5432")
-EXAMPLES_DB = os.getenv("EXAMPLES_DB", "examples")
+DATABASE_DIALECT = os.getenv("DATABASE_DIALECT")
+DATABASE_USER = os.getenv("DATABASE_USER")
+DATABASE_PASSWORD = os.getenv("DATABASE_PASSWORD")
+DATABASE_HOST = os.getenv("DATABASE_HOST")
+DATABASE_PORT = os.getenv("DATABASE_PORT")
+DATABASE_DB = os.getenv("DATABASE_DB")
+
+EXAMPLES_USER = os.getenv("EXAMPLES_USER")
+EXAMPLES_PASSWORD = os.getenv("EXAMPLES_PASSWORD")
+EXAMPLES_HOST = os.getenv("EXAMPLES_HOST")
+EXAMPLES_PORT = os.getenv("EXAMPLES_PORT")
+EXAMPLES_DB = os.getenv("EXAMPLES_DB")
 
 # The SQLAlchemy connection string.
 SQLALCHEMY_DATABASE_URI = (
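Editor's note (a sketch, not part of the commit): the hunk above removes the in-code fallbacks, so the non-dev stack must now supply every value through docker/.env-non-dev — `os.getenv` without a default simply returns `None` for an unset variable. A minimal illustration of the difference:

```python
import os

# Non-dev config after this commit: no default, so an unset variable
# yields None instead of a usable value (or an exception).
os.environ.pop("DATABASE_DIALECT", None)
assert os.getenv("DATABASE_DIALECT") is None

# Dev config (unchanged) keeps an in-code fallback:
assert os.getenv("DATABASE_DIALECT", "postgresql") == "postgresql"

# When the environment file does provide the value, it wins either way:
os.environ["DATABASE_DIALECT"] = "mysql"
assert os.getenv("DATABASE_DIALECT", "postgresql") == "mysql"
```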


[superset] 06/06: fix: dataset safe URL for explore_url (#24686)


michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 931e1b21396c55a7da8fa8b026c64c63b333e3eb
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Wed Aug 23 13:31:44 2023 +0100

    fix: dataset safe URL for explore_url (#24686)
    
    (cherry picked from commit a9efd4b2e307b0df68e88ebbd02d22d7032fa451)
---
 UPDATING.md                                        |  1 +
 .../src/pages/DatasetList/DatasetList.test.tsx     | 60 +++++++++++++++++++++-
 superset-frontend/src/pages/DatasetList/index.tsx  | 27 +++++++---
 superset/config.py                                 |  2 +-
 superset/datasets/commands/exceptions.py           | 17 ------
 superset/datasets/commands/update.py               | 12 -----
 superset/utils/urls.py                             | 19 +------
 superset/views/base.py                             |  1 +
 superset/views/datasource/views.py                 | 17 +-----
 tests/integration_tests/datasets/api_tests.py      | 26 ----------
 tests/integration_tests/datasource_tests.py        | 26 ----------
 tests/unit_tests/utils/urls_tests.py               | 24 ---------
 12 files changed, 85 insertions(+), 147 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 5a29c43dfa..19c60a19b7 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -51,6 +51,7 @@ assists people when migrating to a new version.
 
 ### Breaking Changes
 
+- [24686](https://github.com/apache/superset/pull/24686): All datasets' custom `explore_url` values are handled as relative URLs on the frontend; this behaviour is controlled by `PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET`.
 - [24262](https://github.com/apache/superset/pull/24262): Enabled `TALISMAN_ENABLED` flag by default and provided stricter default Content Security Policy
 - [24415](https://github.com/apache/superset/pull/24415): Removed the obsolete Druid NoSQL REGEX operator.
 - [24423](https://github.com/apache/superset/pull/24423): Removed deprecated APIs `/superset/slice_json/...`, `/superset/annotation_json/...`
diff --git a/superset-frontend/src/pages/DatasetList/DatasetList.test.tsx b/superset-frontend/src/pages/DatasetList/DatasetList.test.tsx
index 115a861bdd..916dd0615b 100644
--- a/superset-frontend/src/pages/DatasetList/DatasetList.test.tsx
+++ b/superset-frontend/src/pages/DatasetList/DatasetList.test.tsx
@@ -35,6 +35,7 @@ import IndeterminateCheckbox from 'src/components/IndeterminateCheckbox';
 import waitForComponentToPaint from 'spec/helpers/waitForComponentToPaint';
 import { act } from 'react-dom/test-utils';
 import SubMenu from 'src/features/home/SubMenu';
+import * as reactRedux from 'react-redux';
 
 // store needed for withToasts(DatasetList)
 const mockStore = configureStore([thunk]);
@@ -47,13 +48,15 @@ const datasetsDuplicateEndpoint = 'glob:*/api/v1/dataset/duplicate*';
 const databaseEndpoint = 'glob:*/api/v1/dataset/related/database*';
 const datasetsEndpoint = 'glob:*/api/v1/dataset/?*';
 
+const useSelectorMock = jest.spyOn(reactRedux, 'useSelector');
+
 const mockdatasets = [...new Array(3)].map((_, i) => ({
   changed_by_name: 'user',
   kind: i === 0 ? 'virtual' : 'physical', // ensure there is 1 virtual
   changed_by: 'user',
   changed_on: new Date().toISOString(),
   database_name: `db ${i}`,
-  explore_url: `/explore/?datasource_type=table&datasource_id=${i}`,
+  explore_url: `https://www.google.com?${i}`,
   id: i,
   schema: `schema ${i}`,
   table_name: `coolest table ${i}`,
@@ -280,3 +283,58 @@ describe('RTL', () => {
     expect(importTooltip).toBeInTheDocument();
   });
 });
+
+describe('Prevent unsafe URLs', () => {
+  const mockedProps = {};
+  let wrapper: any;
+
+  it('Check prevent unsafe is on renders relative links', async () => {
+    const tdColumnsNumber = 9;
+    useSelectorMock.mockReturnValue(true);
+    wrapper = await mountAndWait(mockedProps);
+    const tdElements = wrapper.find(ListView).find('td');
+    expect(
+      tdElements
+        .at(0 * tdColumnsNumber + 1)
+        .find('a')
+        .prop('href'),
+    ).toBe('/https://www.google.com?0');
+    expect(
+      tdElements
+        .at(1 * tdColumnsNumber + 1)
+        .find('a')
+        .prop('href'),
+    ).toBe('/https://www.google.com?1');
+    expect(
+      tdElements
+        .at(2 * tdColumnsNumber + 1)
+        .find('a')
+        .prop('href'),
+    ).toBe('/https://www.google.com?2');
+  });
+
+  it('Check prevent unsafe is off renders absolute links', async () => {
+    const tdColumnsNumber = 9;
+    useSelectorMock.mockReturnValue(false);
+    wrapper = await mountAndWait(mockedProps);
+    const tdElements = wrapper.find(ListView).find('td');
+    expect(
+      tdElements
+        .at(0 * tdColumnsNumber + 1)
+        .find('a')
+        .prop('href'),
+    ).toBe('https://www.google.com?0');
+    expect(
+      tdElements
+        .at(1 * tdColumnsNumber + 1)
+        .find('a')
+        .prop('href'),
+    ).toBe('https://www.google.com?1');
+    expect(
+      tdElements
+        .at(2 * tdColumnsNumber + 1)
+        .find('a')
+        .prop('href'),
+    ).toBe('https://www.google.com?2');
+  });
+});
diff --git a/superset-frontend/src/pages/DatasetList/index.tsx b/superset-frontend/src/pages/DatasetList/index.tsx
index 7633edb016..0f3fb84ab1 100644
--- a/superset-frontend/src/pages/DatasetList/index.tsx
+++ b/superset-frontend/src/pages/DatasetList/index.tsx
@@ -30,7 +30,7 @@ import React, {
   useMemo,
   useCallback,
 } from 'react';
-import { useHistory } from 'react-router-dom';
+import { Link, useHistory } from 'react-router-dom';
 import rison from 'rison';
 import {
   createFetchRelated,
@@ -69,6 +69,7 @@ import {
   CONFIRM_OVERWRITE_MESSAGE,
 } from 'src/features/datasets/constants';
 import DuplicateDatasetModal from 'src/features/datasets/DuplicateDatasetModal';
+import { useSelector } from 'react-redux';
 
 const extensionsRegistry = getExtensionsRegistry();
 const DatasetDeleteRelatedExtension = extensionsRegistry.get(
@@ -181,6 +182,11 @@ const DatasetList: FunctionComponent<DatasetListProps> = ({
     setSSHTunnelPrivateKeyPasswordFields,
   ] = useState<string[]>([]);
 
+  const PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET = useSelector<any, boolean>(
+    state =>
+      state.common?.conf?.PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET || false,
+  );
+
   const openDatasetImportModal = () => {
     showImportModal(true);
   };
@@ -309,11 +315,20 @@ const DatasetList: FunctionComponent<DatasetListProps> = ({
             },
           },
         }: any) => {
-          const titleLink = (
-            // exploreUrl can be a link to Explore or an external link
-            // in the first case use SPA routing, else use HTML anchor
-            <GenericLink to={exploreURL}>{datasetTitle}</GenericLink>
-          );
+          let titleLink: JSX.Element;
+          if (PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET) {
+            titleLink = (
+              <Link data-test="internal-link" to={exploreURL}>
+                {datasetTitle}
+              </Link>
+            );
+          } else {
+            titleLink = (
+              // exploreUrl can be a link to Explore or an external link
+              // in the first case use SPA routing, else use HTML anchor
+              <GenericLink to={exploreURL}>{datasetTitle}</GenericLink>
+            );
+          }
           try {
             const parsedExtra = JSON.parse(extra);
             return (
diff --git a/superset/config.py b/superset/config.py
index 17f5bb72a2..a2ce1174a1 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -1460,7 +1460,7 @@ STATIC_ASSETS_PREFIX = ""
 # Typically these should not be allowed.
 PREVENT_UNSAFE_DB_CONNECTIONS = True
 
-# Prevents unsafe default endpoints to be registered on datasets.
+# If true all default urls on datasets will be handled as relative URLs by the frontend
 PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET = True
 
 # Define a list of allowed URLs for dataset data imports (v1).
diff --git a/superset/datasets/commands/exceptions.py b/superset/datasets/commands/exceptions.py
index fe9fe94cc6..f135294980 100644
--- a/superset/datasets/commands/exceptions.py
+++ b/superset/datasets/commands/exceptions.py
@@ -61,23 +61,6 @@ class DatasetExistsValidationError(ValidationError):
         )
 
 
-class DatasetEndpointUnsafeValidationError(ValidationError):
-    """
-    Marshmallow validation error for unsafe dataset default endpoint
-    """
-
-    def __init__(self) -> None:
-        super().__init__(
-            [
-                _(
-                    "The submitted URL is not considered safe,"
-                    " only use URLs with the same domain as Superset."
-                )
-            ],
-            field_name="default_endpoint",
-        )
-
-
 class DatasetColumnNotFoundValidationError(ValidationError):
     """
     Marshmallow validation error when dataset column for update does not exist
diff --git a/superset/datasets/commands/update.py b/superset/datasets/commands/update.py
index 1636805567..a38439fb7f 100644
--- a/superset/datasets/commands/update.py
+++ b/superset/datasets/commands/update.py
@@ -18,7 +18,6 @@ import logging
 from collections import Counter
 from typing import Any, Optional
 
-from flask import current_app
 from flask_appbuilder.models.sqla import Model
 from marshmallow import ValidationError
 
@@ -32,7 +31,6 @@ from superset.datasets.commands.exceptions import (
     DatasetColumnNotFoundValidationError,
     DatasetColumnsDuplicateValidationError,
     DatasetColumnsExistsValidationError,
-    DatasetEndpointUnsafeValidationError,
     DatasetExistsValidationError,
     DatasetForbiddenError,
     DatasetInvalidError,
@@ -43,7 +41,6 @@ from superset.datasets.commands.exceptions import (
     DatasetUpdateFailedError,
 )
 from superset.exceptions import SupersetSecurityException
-from superset.utils.urls import is_safe_url
 
 logger = logging.getLogger(__name__)
 
@@ -104,15 +101,6 @@ class UpdateDatasetCommand(UpdateMixin, BaseCommand):
             self._properties["owners"] = owners
         except ValidationError as ex:
             exceptions.append(ex)
-        # Validate default URL safety
-        default_endpoint = self._properties.get("default_endpoint")
-        if (
-            default_endpoint
-            and not is_safe_url(default_endpoint)
-            and current_app.config["PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET"]
-        ):
-            exceptions.append(DatasetEndpointUnsafeValidationError())
-
         # Validate columns
         if columns := self._properties.get("columns"):
             self._validate_columns(columns, exceptions)
diff --git a/superset/utils/urls.py b/superset/utils/urls.py
index 31fbd89337..57a1b63dd4 100644
--- a/superset/utils/urls.py
+++ b/superset/utils/urls.py
@@ -14,12 +14,10 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-import unicodedata
 import urllib
 from typing import Any
-from urllib.parse import urlparse
 
-from flask import current_app, request, url_for
+from flask import current_app, url_for
 
 
 def get_url_host(user_friendly: bool = False) -> str:
@@ -52,18 +50,3 @@ def modify_url_query(url: str, **kwargs: Any) -> str:
         f"{k}={urllib.parse.quote(str(v[0]))}" for k, v in params.items()
     )
     return urllib.parse.urlunsplit(parts)
-
-
-def is_safe_url(url: str) -> bool:
-    if url.startswith("///"):
-        return False
-    try:
-        ref_url = urlparse(request.host_url)
-        test_url = urlparse(url)
-    except ValueError:
-        return False
-    if unicodedata.category(url[0])[0] == "C":
-        return False
-    if test_url.scheme != ref_url.scheme or ref_url.netloc != test_url.netloc:
-        return False
-    return True
diff --git a/superset/views/base.py b/superset/views/base.py
index 250c16ebb2..ab53ff07da 100644
--- a/superset/views/base.py
+++ b/superset/views/base.py
@@ -120,6 +120,7 @@ FRONTEND_CONF_KEYS = (
     "ALERT_REPORTS_DEFAULT_RETENTION",
     "ALERT_REPORTS_DEFAULT_WORKING_TIMEOUT",
     "NATIVE_FILTER_DEFAULT_ROW_LIMIT",
+    "PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET",
 )
 
 logger = logging.getLogger(__name__)
diff --git a/superset/views/datasource/views.py b/superset/views/datasource/views.py
index b2fd387379..06dd37fbd9 100644
--- a/superset/views/datasource/views.py
+++ b/superset/views/datasource/views.py
@@ -18,7 +18,7 @@ import json
 from collections import Counter
 from typing import Any
 
-from flask import current_app, redirect, request
+from flask import redirect, request
 from flask_appbuilder import expose, permission_name
 from flask_appbuilder.api import rison
 from flask_appbuilder.security.decorators import has_access, has_access_api
@@ -40,7 +40,6 @@ from superset.exceptions import SupersetException, SupersetSecurityException
 from superset.models.core import Database
 from superset.superset_typing import FlaskResponse
 from superset.utils.core import DatasourceType
-from superset.utils.urls import is_safe_url
 from superset.views.base import (
     api,
     BaseSupersetView,
@@ -82,20 +81,6 @@ class Datasource(BaseSupersetView):
         datasource_id = datasource_dict.get("id")
         datasource_type = datasource_dict.get("type")
         database_id = datasource_dict["database"].get("id")
-        default_endpoint = datasource_dict["default_endpoint"]
-        if (
-            default_endpoint
-            and not is_safe_url(default_endpoint)
-            and current_app.config["PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET"]
-        ):
-            return json_error_response(
-                _(
-                    "The submitted URL is not considered safe,"
-                    " only use URLs with the same domain as Superset."
-                ),
-                status=400,
-            )
-
         orm_datasource = DatasourceDAO.get_datasource(
             db.session, DatasourceType(datasource_type), datasource_id
         )
diff --git a/tests/integration_tests/datasets/api_tests.py b/tests/integration_tests/datasets/api_tests.py
index 8884b1171a..2acefe48d0 100644
--- a/tests/integration_tests/datasets/api_tests.py
+++ b/tests/integration_tests/datasets/api_tests.py
@@ -1446,32 +1446,6 @@ class TestDatasetApi(SupersetTestCase):
         db.session.delete(ab_user)
         db.session.commit()
 
-    def test_update_dataset_unsafe_default_endpoint(self):
-        """
-        Dataset API: Test unsafe default endpoint
-        """
-        if backend() == "sqlite":
-            return
-
-        dataset = self.insert_default_dataset()
-        self.login(username="admin")
-        uri = f"api/v1/dataset/{dataset.id}"
-        table_data = {"default_endpoint": "http://www.google.com"}
-        rv = self.client.put(uri, json=table_data)
-        data = json.loads(rv.data.decode("utf-8"))
-        assert rv.status_code == 422
-        expected_response = {
-            "message": {
-                "default_endpoint": [
-                    "The submitted URL is not considered safe,"
-                    " only use URLs with the same domain as Superset."
-                ]
-            }
-        }
-        assert data == expected_response
-        db.session.delete(dataset)
-        db.session.commit()
-
     @patch("superset.daos.dataset.DatasetDAO.update")
     def test_update_dataset_sqlalchemy_error(self, mock_dao_update):
         """
diff --git a/tests/integration_tests/datasource_tests.py b/tests/integration_tests/datasource_tests.py
index 4c05898cfe..12293325d1 100644
--- a/tests/integration_tests/datasource_tests.py
+++ b/tests/integration_tests/datasource_tests.py
@@ -302,32 +302,6 @@ class TestDatasource(SupersetTestCase):
                 print(k)
                 self.assertEqual(resp[k], datasource_post[k])
 
-    def test_save_default_endpoint_validation_fail(self):
-        self.login(username="admin")
-        tbl_id = self.get_table(name="birth_names").id
-
-        datasource_post = get_datasource_post()
-        datasource_post["id"] = tbl_id
-        datasource_post["owners"] = [1]
-        datasource_post["default_endpoint"] = "http://www.google.com"
-        data = dict(data=json.dumps(datasource_post))
-        resp = self.client.post("/datasource/save/", data=data)
-        assert resp.status_code == 400
-
-    def test_save_default_endpoint_validation_unsafe(self):
-        self.app.config["PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET"] = False
-        self.login(username="admin")
-        tbl_id = self.get_table(name="birth_names").id
-
-        datasource_post = get_datasource_post()
-        datasource_post["id"] = tbl_id
-        datasource_post["owners"] = [1]
-        datasource_post["default_endpoint"] = "http://www.google.com"
-        data = dict(data=json.dumps(datasource_post))
-        resp = self.client.post("/datasource/save/", data=data)
-        assert resp.status_code == 200
-        self.app.config["PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET"] = True
-
     def test_save_default_endpoint_validation_success(self):
         self.login(username="admin")
         tbl_id = self.get_table(name="birth_names").id
diff --git a/tests/unit_tests/utils/urls_tests.py b/tests/unit_tests/utils/urls_tests.py
index 287f346c3d..8ead2dacb4 100644
--- a/tests/unit_tests/utils/urls_tests.py
+++ b/tests/unit_tests/utils/urls_tests.py
@@ -39,27 +39,3 @@ def test_convert_dashboard_link() -> None:
 def test_convert_dashboard_link_with_integer() -> None:
     test_url = modify_url_query(EXPLORE_DASHBOARD_LINK, standalone=0)
     assert test_url == "http://localhost:9000/superset/dashboard/3/?standalone=0"
-
-
-@pytest.mark.parametrize(
-    "url,is_safe",
-    [
-        ("http://localhost/", True),
-        ("http://localhost/superset/1", True),
-        ("https://localhost/", False),
-        ("https://localhost/superset/1", False),
-        ("localhost/superset/1", False),
-        ("ftp://localhost/superset/1", False),
-        ("http://external.com", False),
-        ("https://external.com", False),
-        ("external.com", False),
-        ("///localhost", False),
-        ("xpto://localhost:[3/1/", False),
-    ],
-)
-def test_is_safe_url(url: str, is_safe: bool) -> None:
-    from superset import app
-    from superset.utils.urls import is_safe_url
-
-    with app.test_request_context("/"):
-        assert is_safe_url(url) == is_safe
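Editor's note (a sketch, not part of the commit): this change replaces the server-side `is_safe_url` validation with frontend handling — when `PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET` is on, the dataset list renders `explore_url` as a relative path, which is why the new test expects `/https://www.google.com?0`. A rough Python sketch of the idea, using a hypothetical helper name:

```python
def as_relative(url: str) -> str:
    """Force a URL to be treated as a path under the app root.

    Prefixing with "/" means an absolute URL such as
    "https://evil.example" becomes "/https://evil.example", so the SPA
    router resolves it inside the application instead of navigating to
    an external site.
    """
    return url if url.startswith("/") else "/" + url

assert as_relative("https://www.google.com?0") == "/https://www.google.com?0"
assert as_relative("/explore/?datasource_id=1") == "/explore/?datasource_id=1"
```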


[superset] 02/06: fix: extend user email size (#25053)


michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 6003aa2485a168a3d5f95aa72a2e15c9c8bc0fe7
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Tue Aug 22 14:52:49 2023 +0100

    fix: extend user email size (#25053)
    
    (cherry picked from commit 6975084ea5045f0b099b5d8ced4b1068401284f7)
---
 requirements/base.txt                              |  2 +-
 setup.py                                           |  2 +-
 ...54aca4c8a2_increase_ab_user_email_field_size.py | 52 ++++++++++++++++++++++
 3 files changed, 54 insertions(+), 2 deletions(-)

diff --git a/requirements/base.txt b/requirements/base.txt
index 6a321708d9..49425ed5d3 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -88,7 +88,7 @@ flask==2.2.5
     #   flask-migrate
     #   flask-sqlalchemy
     #   flask-wtf
-flask-appbuilder==4.3.4
+flask-appbuilder==4.3.6
     # via apache-superset
 flask-babel==1.0.0
     # via flask-appbuilder
diff --git a/setup.py b/setup.py
index b494f324b3..b97dfb131a 100644
--- a/setup.py
+++ b/setup.py
@@ -81,7 +81,7 @@ setup(
         "cryptography>=39.0.1, <40",
         "deprecation>=2.1.0, <2.2.0",
         "flask>=2.2.5, <3.0.0",
-        "flask-appbuilder>=4.3.4, <5.0.0",
+        "flask-appbuilder>=4.3.6, <5.0.0",
         "flask-caching>=1.10.1, <2.0",
         "flask-compress>=1.13, <2.0",
         "flask-talisman>=1.0.0, <2.0",
diff --git a/superset/migrations/versions/2023-08-22_11-09_ec54aca4c8a2_increase_ab_user_email_field_size.py b/superset/migrations/versions/2023-08-22_11-09_ec54aca4c8a2_increase_ab_user_email_field_size.py
new file mode 100644
index 0000000000..8e2072655f
--- /dev/null
+++ b/superset/migrations/versions/2023-08-22_11-09_ec54aca4c8a2_increase_ab_user_email_field_size.py
@@ -0,0 +1,52 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Increase ab_user.email field size
+
+Revision ID: ec54aca4c8a2
+Revises: 9f4a086c2676
+Create Date: 2023-08-22 11:09:48.577457
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = "ec54aca4c8a2"
+down_revision = "9f4a086c2676"
+
+import sqlalchemy as sa
+from alembic import op
+
+
+def upgrade():
+    # ### commands auto generated by Alembic - please adjust! ###
+    with op.batch_alter_table("ab_user") as batch_op:
+        batch_op.alter_column(
+            "email",
+            existing_type=sa.VARCHAR(length=64),
+            type_=sa.String(length=320),
+            nullable=False,
+        )
+
+
+def downgrade():
+    # ### commands auto generated by Alembic - please adjust! ###
+    with op.batch_alter_table("ab_user") as batch_op:
+        batch_op.alter_column(
+            "email",
+            existing_type=sa.VARCHAR(length=320),
+            type_=sa.String(length=64),
+            nullable=False,
+        )
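Editor's note (an assumption about the rationale, not stated in the commit): 320 is the commonly cited maximum length of an email address derived from RFC 3696 — 64 characters for the local part, 1 for the "@", and 255 for the domain. A quick sanity check of that arithmetic:

```python
LOCAL_MAX = 64    # local part, before the "@"
DOMAIN_MAX = 255  # domain part, after the "@"

# The "+ 1" accounts for the "@" separator itself.
EMAIL_MAX = LOCAL_MAX + 1 + DOMAIN_MAX
assert EMAIL_MAX == 320  # matches sa.String(length=320) in the migration
```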


[superset] 01/06: chore: use os.getenv to simplify superset_config.py (#25016)


michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 80f1eaf6d73156390a29052f3b82bd5da3edf458
Author: Sebastian Liebscher <11...@users.noreply.github.com>
AuthorDate: Mon Aug 21 13:45:22 2023 +0200

    chore: use os.getenv to simplify superset_config.py (#25016)
    
    (cherry picked from commit 969cd664cbd2e033a1c3102a004795ccb87e292b)
---
 docker/pythonpath_dev/superset_config.py | 47 ++++++++++----------------------
 1 file changed, 15 insertions(+), 32 deletions(-)

diff --git a/docker/pythonpath_dev/superset_config.py b/docker/pythonpath_dev/superset_config.py
index 2e3c1fdda4..c2176e0388 100644
--- a/docker/pythonpath_dev/superset_config.py
+++ b/docker/pythonpath_dev/superset_config.py
@@ -22,41 +22,24 @@
 #
 import logging
 import os
-from typing import Optional
 
 from cachelib.file import FileSystemCache
 from celery.schedules import crontab
 
 logger = logging.getLogger()
 
+DATABASE_DIALECT = os.getenv("DATABASE_DIALECT", "postgresql")
+DATABASE_USER = os.getenv("DATABASE_USER", "superset")
+DATABASE_PASSWORD = os.getenv("DATABASE_PASSWORD", "superset")
+DATABASE_HOST = os.getenv("DATABASE_HOST", "db")
+DATABASE_PORT = os.getenv("DATABASE_PORT", "5432")
+DATABASE_DB = os.getenv("DATABASE_DB", "superset")
 
-def get_env_variable(var_name: str, default: Optional[str] = None) -> str:
-    """Get the environment variable or raise exception."""
-    try:
-        return os.environ[var_name]
-    except KeyError:
-        if default is not None:
-            return default
-        else:
-            error_msg = "The environment variable {} was missing, abort...".format(
-                var_name
-            )
-            raise OSError(error_msg)
-
-
-DATABASE_DIALECT = get_env_variable("DATABASE_DIALECT")
-DATABASE_USER = get_env_variable("DATABASE_USER")
-DATABASE_PASSWORD = get_env_variable("DATABASE_PASSWORD")
-DATABASE_HOST = get_env_variable("DATABASE_HOST")
-DATABASE_PORT = get_env_variable("DATABASE_PORT")
-DATABASE_DB = get_env_variable("DATABASE_DB")
-
-EXAMPLES_USER = get_env_variable("EXAMPLES_USER")
-EXAMPLES_PASSWORD = get_env_variable("EXAMPLES_PASSWORD")
-EXAMPLES_HOST = get_env_variable("EXAMPLES_HOST")
-EXAMPLES_PORT = get_env_variable("EXAMPLES_PORT")
-EXAMPLES_DB = get_env_variable("EXAMPLES_DB")
-
+EXAMPLES_USER = os.getenv("EXAMPLES_USER", "examples")
+EXAMPLES_PASSWORD = os.getenv("EXAMPLES_PASSWORD", "examples")
+EXAMPLES_HOST = os.getenv("EXAMPLES_HOST", "db")
+EXAMPLES_PORT = os.getenv("EXAMPLES_PORT", "5432")
+EXAMPLES_DB = os.getenv("EXAMPLES_DB", "examples")
 
 # The SQLAlchemy connection string.
 SQLALCHEMY_DATABASE_URI = (
@@ -71,10 +54,10 @@ SQLALCHEMY_EXAMPLES_URI = (
     f"{EXAMPLES_HOST}:{EXAMPLES_PORT}/{EXAMPLES_DB}"
 )
 
-REDIS_HOST = get_env_variable("REDIS_HOST")
-REDIS_PORT = get_env_variable("REDIS_PORT")
-REDIS_CELERY_DB = get_env_variable("REDIS_CELERY_DB", "0")
-REDIS_RESULTS_DB = get_env_variable("REDIS_RESULTS_DB", "1")
+REDIS_HOST = os.getenv("REDIS_HOST", "redis")
+REDIS_PORT = os.getenv("REDIS_PORT", "6379")
+REDIS_CELERY_DB = os.getenv("REDIS_CELERY_DB", "0")
+REDIS_RESULTS_DB = os.getenv("REDIS_RESULTS_DB", "1")
 
 RESULTS_BACKEND = FileSystemCache("/app/superset_home/sqllab")
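The pattern this commit adopts — `os.getenv` with a sensible default instead of a raise-on-missing helper — can be exercised as a standalone sketch; the defaults below mirror the ones introduced in the diff:

```python
import os

# Each setting falls back to the docker-compose default when the
# environment variable is unset; no custom helper is needed.
DATABASE_DIALECT = os.getenv("DATABASE_DIALECT", "postgresql")
DATABASE_USER = os.getenv("DATABASE_USER", "superset")
DATABASE_PASSWORD = os.getenv("DATABASE_PASSWORD", "superset")
DATABASE_HOST = os.getenv("DATABASE_HOST", "db")
DATABASE_PORT = os.getenv("DATABASE_PORT", "5432")
DATABASE_DB = os.getenv("DATABASE_DB", "superset")

# The SQLAlchemy connection string, assembled the same way as in
# docker/pythonpath_dev/superset_config.py.
SQLALCHEMY_DATABASE_URI = (
    f"{DATABASE_DIALECT}://{DATABASE_USER}:{DATABASE_PASSWORD}@"
    f"{DATABASE_HOST}:{DATABASE_PORT}/{DATABASE_DB}"
)
print(SQLALCHEMY_DATABASE_URI)  # e.g. postgresql://superset:superset@db:5432/superset when nothing is overridden
```

The trade-off versus the removed `get_env_variable` helper is that a typo'd variable name now silently yields the default rather than aborting at startup, which is acceptable here because every setting has a working docker-compose default.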
 


[superset] 04/06: fix: Native filter dashboard RBAC aware dataset permission (#25029)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 1af6df3190c0be433031a22eae523fc2b6cc5b04
Author: John Bodley <45...@users.noreply.github.com>
AuthorDate: Tue Aug 22 09:58:43 2023 -0700

    fix: Native filter dashboard RBAC aware dataset permission (#25029)
    
    (cherry picked from commit 60889d27edeeb306cff763743254ca0655faf4b5)
---
 .../dashboard/components/nativeFilters/utils.ts    |  2 +
 superset/security/manager.py                       | 35 +++++++++---
 tests/integration_tests/security_tests.py          | 63 ++++++++++++++++++++--
 3 files changed, 91 insertions(+), 9 deletions(-)

diff --git a/superset-frontend/src/dashboard/components/nativeFilters/utils.ts b/superset-frontend/src/dashboard/components/nativeFilters/utils.ts
index b4284a8a63..7194e2435f 100644
--- a/superset-frontend/src/dashboard/components/nativeFilters/utils.ts
+++ b/superset-frontend/src/dashboard/components/nativeFilters/utils.ts
@@ -55,6 +55,7 @@ export const getFormData = ({
   granularity_sqla,
   type,
   dashboardId,
+  id,
 }: Partial<Filter> & {
   dashboardId: number;
   datasetId?: number;
@@ -94,6 +95,7 @@ export const getFormData = ({
     viz_type: filterType,
     type,
     dashboardId,
+    native_filter_id: id,
   };
 };
 
diff --git a/superset/security/manager.py b/superset/security/manager.py
index a32ef9a1b9..028fd8762f 100644
--- a/superset/security/manager.py
+++ b/superset/security/manager.py
@@ -16,6 +16,7 @@
 # under the License.
 # pylint: disable=too-many-lines
 """A set of constants and methods to manage permissions and security"""
+import json
 import logging
 import re
 import time
@@ -1876,14 +1877,36 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
                         .one_or_none()
                     )
                     and dashboard_.roles
-                    and (slice_id := form_data.get("slice_id"))
                     and (
-                        slc := self.get_session.query(Slice)
-                        .filter(Slice.id == slice_id)
-                        .one_or_none()
+                        (
+                            # Native filter.
+                            form_data.get("type") == "NATIVE_FILTER"
+                            and (native_filter_id := form_data.get("native_filter_id"))
+                            and dashboard_.json_metadata
+                            and (json_metadata := json.loads(dashboard_.json_metadata))
+                            and any(
+                                target.get("datasetId") == datasource.id
+                                for fltr in json_metadata.get(
+                                    "native_filter_configuration",
+                                    [],
+                                )
+                                for target in fltr.get("targets", [])
+                                if native_filter_id == fltr.get("id")
+                            )
+                        )
+                        or (
+                            # Chart.
+                            form_data.get("type") != "NATIVE_FILTER"
+                            and (slice_id := form_data.get("slice_id"))
+                            and (
+                                slc := self.get_session.query(Slice)
+                                .filter(Slice.id == slice_id)
+                                .one_or_none()
+                            )
+                            and slc in dashboard_.slices
+                            and slc.datasource == datasource
+                        )
                     )
-                    and slc in dashboard_.slices
-                    and slc.datasource == datasource
                     and self.can_access_dashboard(dashboard_)
                 )
             ):
diff --git a/tests/integration_tests/security_tests.py b/tests/integration_tests/security_tests.py
index 4767e5af0e..f741ec4315 100644
--- a/tests/integration_tests/security_tests.py
+++ b/tests/integration_tests/security_tests.py
@@ -15,6 +15,7 @@
 # specific language governing permissions and limitations
 # under the License.
 # isort:skip_file
+import json
 import inspect
 import time
 import unittest
@@ -1718,6 +1719,21 @@ class TestSecurityManager(SupersetTestCase):
         world_health = self.get_dash_by_slug("world_health")
         treemap = self.get_slice("Treemap", db.session, expunge_from_session=False)
 
+        births.json_metadata = json.dumps(
+            {
+                "native_filter_configuration": [
+                    {
+                        "id": "NATIVE_FILTER-ABCDEFGH",
+                        "targets": [{"datasetId": birth_names.id}],
+                    },
+                    {
+                        "id": "NATIVE_FILTER-IJKLMNOP",
+                        "targets": [{"datasetId": treemap.id}],
+                    },
+                ]
+            }
+        )
+
         mock_g.user = security_manager.find_user("gamma")
         mock_is_owner.return_value = False
         mock_can_access.return_value = False
@@ -1725,7 +1741,6 @@ class TestSecurityManager(SupersetTestCase):
 
         for kwarg in ["query_context", "viz"]:
             births.roles = []
-            db.session.flush()
 
             # No dashboard roles.
             with self.assertRaises(SupersetSecurityException):
@@ -1742,7 +1757,6 @@ class TestSecurityManager(SupersetTestCase):
                 )
 
             births.roles = [self.get_role("Gamma")]
-            db.session.flush()
 
             # Undefined dashboard.
             with self.assertRaises(SupersetSecurityException):
@@ -1807,7 +1821,50 @@ class TestSecurityManager(SupersetTestCase):
                 }
             )
 
-        db.session.rollback()
+            # Ill-defined native filter.
+            with self.assertRaises(SupersetSecurityException):
+                security_manager.raise_for_access(
+                    **{
+                        kwarg: Mock(
+                            datasource=birth_names,
+                            form_data={
+                                "dashboardId": births.id,
+                                "type": "NATIVE_FILTER",
+                            },
+                        )
+                    }
+                )
+
+            # Native filter not associated with said datasource.
+            with self.assertRaises(SupersetSecurityException):
+                security_manager.raise_for_access(
+                    **{
+                        kwarg: Mock(
+                            datasource=birth_names,
+                            form_data={
+                                "dashboardId": births.id,
+                                "native_filter_id": "NATIVE_FILTER-IJKLMNOP",
+                                "type": "NATIVE_FILTER",
+                            },
+                        )
+                    }
+                )
+
+            # Native filter associated with said datasource.
+            security_manager.raise_for_access(
+                **{
+                    kwarg: Mock(
+                        datasource=birth_names,
+                        form_data={
+                            "dashboardId": births.id,
+                            "native_filter_id": "NATIVE_FILTER-ABCDEFGH",
+                            "type": "NATIVE_FILTER",
+                        },
+                    )
+                }
+            )
+
+        db.session.expunge_all()
 
     @patch("superset.security.manager.g")
     def test_get_user_roles(self, mock_g):
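The core of the security-manager change above is the new `json_metadata` walk: for a `NATIVE_FILTER` request, access is granted only when the filter id in the form data maps to a filter whose targets reference the requested dataset. A minimal standalone sketch of that check (the function name is ours, not Superset's) behaves like the condition in the diff:

```python
import json


def native_filter_targets_dataset(
    json_metadata: str, native_filter_id: str, dataset_id: int
) -> bool:
    """Return True iff the dashboard's native filter with the given id
    has a target referencing the given dataset, per the metadata walk
    added to SupersetSecurityManager.raise_for_access."""
    metadata = json.loads(json_metadata) if json_metadata else {}
    return any(
        target.get("datasetId") == dataset_id
        for fltr in metadata.get("native_filter_configuration", [])
        for target in fltr.get("targets", [])
        if fltr.get("id") == native_filter_id
    )


# Metadata shaped like the fixture in the integration test above.
metadata = json.dumps(
    {
        "native_filter_configuration": [
            {"id": "NATIVE_FILTER-ABCDEFGH", "targets": [{"datasetId": 1}]},
            {"id": "NATIVE_FILTER-IJKLMNOP", "targets": [{"datasetId": 2}]},
        ]
    }
)
print(native_filter_targets_dataset(metadata, "NATIVE_FILTER-ABCDEFGH", 1))  # True
print(native_filter_targets_dataset(metadata, "NATIVE_FILTER-IJKLMNOP", 1))  # False
```

This is also why the frontend half of the commit threads `native_filter_id` into the form data via `getFormData`: without it the backend has no key to match against the `id` field in `native_filter_configuration`.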


[superset] 05/06: fix: Error when using the legacy dataset editor (#25057)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit b5f7f54c7fc42090b00ff04a1a2988e999aa57ef
Author: Michael S. Molina <70...@users.noreply.github.com>
AuthorDate: Wed Aug 23 09:10:17 2023 -0300

    fix: Error when using the legacy dataset editor (#25057)
    
    Co-authored-by: John Bodley <45...@users.noreply.github.com>
    (cherry picked from commit c92a975e4b72962baf34d1fcbf2ee38011199377)
---
 .pre-commit-config.yaml           | 7 ++++---
 superset/connectors/sqla/views.py | 1 +
 2 files changed, 5 insertions(+), 3 deletions(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 07544d66d2..9e0318456d 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -42,12 +42,13 @@ repos:
     hooks:
       - id: mypy
         args: [--check-untyped-defs]
-        additional_dependencies:
-          [
+        additional_dependencies: [
             types-simplejson,
             types-python-dateutil,
             types-requests,
-            types-redis,
+            # types-redis 4.6.0.5 is failing mypy
+            # because of https://github.com/python/typeshed/pull/10531
+            types-redis==4.6.0.4,
             types-pytz,
             types-croniter,
             types-PyYAML,
diff --git a/superset/connectors/sqla/views.py b/superset/connectors/sqla/views.py
index f72261eff8..4f32be1ad8 100644
--- a/superset/connectors/sqla/views.py
+++ b/superset/connectors/sqla/views.py
@@ -411,6 +411,7 @@ class TableModelView(  # pylint: disable=too-many-ancestors
         "database": QuerySelectField(
             "Database",
             query_func=lambda: db.session.query(models.Database),
+            get_pk_func=lambda item: item.id,
             widget=Select2Widget(extra_classes="readonly"),
         )
     }
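The one-line `views.py` fix supplies `get_pk_func` so the Flask-AppBuilder `QuerySelectField` can map each query result to its primary key. A toy sketch of that contract, using a hypothetical stand-in class rather than Superset's real `models.Database`:

```python
# Hypothetical stand-in for superset.models.Database, just to
# illustrate the get_pk_func contract used by the select field.
class Database:
    def __init__(self, id: int, database_name: str):
        self.id = id
        self.database_name = database_name


# The callable the fix passes to QuerySelectField: given an item from
# the query, return its primary key for use as the option value.
def get_pk_func(item: Database) -> int:
    return item.id


databases = [Database(1, "examples"), Database(2, "main")]
choices = [(get_pk_func(db), db.database_name) for db in databases]
print(choices)  # [(1, 'examples'), (2, 'main')]
```

Without an explicit primary-key extractor, the widget cannot reliably serialize its options, which is what produced the error in the legacy dataset editor.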