Posted to commits@superset.apache.org by mi...@apache.org on 2023/10/31 14:35:31 UTC

(superset) branch 3.0 updated (293568ad5a -> 0c633f22e1)

This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a change to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git


    from 293568ad5a fix(dremio): Fixes issue with Dremio SQL generation for Charts with Series Limit (#25657)
     new 5293f5521d fix(sqllab): reinstate "Force trino client async execution" (#25680)
     new 315e75811f fix: remove unnecessary redirect (#25679)
     new 8483ab6c42 fix(chore): dashboard requests to database equal the number of slices it has (#24709)
     new 8da27eda40 fix: bump to FAB 4.3.9 remove CSP exception (#25712)
     new 9b31d97ac3 fix(horizontal filter label): show full tooltip with ellipsis (#25732)
     new fd2c2725d4 fix: Revert "fix(Charts): Set max row limit + removed the option to use an empty row limit value" (#25753)
     new 01d3ac20c7 fix: dataset update uniqueness (#25756)
     new fbe7e6265d fix(sqllab): slow pop datasource query (#25741)
     new 2f468900c8 fix: allow for backward compatible errors (#25640)
     new 1d403dab98 fix: DB-specific quoting in Jinja macro (#25779)
     new 0c633f22e1 fix: Revert "fix: Apply normalization to all dttm columns (#25147)" (#25801)

The 11 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../docs/databases/installing-database-drivers.mdx |  81 +++++------
 docs/docs/frequently-asked-questions.mdx           |   2 +-
 docs/docs/installation/configuring-superset.mdx    |   4 +-
 requirements/base.txt                              |   2 +-
 setup.py                                           |   2 +-
 .../src/shared-controls/sharedControls.tsx         |   9 +-
 .../superset-ui-core/src/validator/index.ts        |   1 -
 .../src/validator/validateMaxValue.ts              |   8 --
 .../test/validator/validateMaxValue.test.ts        |  38 -----
 superset-frontend/src/SqlLab/actions/sqlLab.js     |   8 +-
 .../SqlLab/components/ResultSet/ResultSet.test.tsx |   2 +-
 .../SaveDatasetModal/SaveDatasetModal.test.tsx     |   2 +-
 .../SqlLab/components/SaveDatasetModal/index.tsx   |   2 +-
 .../src/components/Datasource/DatasourceEditor.jsx |   2 +-
 .../components/Datasource/DatasourceModal.test.jsx | 156 ++++++++++++---------
 .../src/components/Datasource/DatasourceModal.tsx  |  26 +++-
 .../src/components/ErrorMessage/types.ts           |   2 +-
 .../FilterBar/FilterControls/FilterControl.tsx     |   7 +-
 superset-frontend/src/utils/errorMessages.ts       |   1 +
 superset/common/query_context_factory.py           |   1 -
 superset/common/query_context_processor.py         |   5 +-
 superset/common/query_object_factory.py            |  67 +--------
 superset/config.py                                 |   7 +-
 superset/daos/dashboard.py                         |   2 -
 superset/daos/dataset.py                           |   6 +-
 superset/datasets/commands/update.py               |   5 +-
 superset/db_engine_specs/base.py                   |  18 +++
 superset/db_engine_specs/trino.py                  |  66 ++++++++-
 superset/jinja_context.py                          |  45 ++++--
 superset/sql_lab.py                                |   7 +-
 tests/integration_tests/query_context_tests.py     |   8 +-
 .../unit_tests/common/test_query_object_factory.py |  90 +-----------
 tests/unit_tests/dao/dataset_test.py               |  83 +++++++++++
 tests/unit_tests/db_engine_specs/test_trino.py     |  31 +++-
 tests/unit_tests/jinja_context_test.py             |   9 +-
 tests/unit_tests/sql_lab_test.py                   |  10 +-
 36 files changed, 436 insertions(+), 379 deletions(-)
 delete mode 100644 superset-frontend/packages/superset-ui-core/src/validator/validateMaxValue.ts
 delete mode 100644 superset-frontend/packages/superset-ui-core/test/validator/validateMaxValue.test.ts
 create mode 100644 tests/unit_tests/dao/dataset_test.py


(superset) 08/11: fix(sqllab): slow pop datasource query (#25741)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit fbe7e6265dec11da8a653e89e1ac9c2f9416a553
Author: JUST.in DO IT <ju...@airbnb.com>
AuthorDate: Thu Oct 26 12:44:41 2023 -0700

    fix(sqllab): slow pop datasource query (#25741)
    
    (cherry picked from commit 2a2bc82a8bbf900c825ba44e8b0f3f320b5962e0)
---
 superset-frontend/src/SqlLab/actions/sqlLab.js | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/superset-frontend/src/SqlLab/actions/sqlLab.js b/superset-frontend/src/SqlLab/actions/sqlLab.js
index fbfba6783e..d25689e946 100644
--- a/superset-frontend/src/SqlLab/actions/sqlLab.js
+++ b/superset-frontend/src/SqlLab/actions/sqlLab.js
@@ -1384,8 +1384,14 @@ export function popDatasourceQuery(datasourceKey, sql) {
   return function (dispatch) {
     const QUERY_TEXT = t('Query');
     const datasetId = datasourceKey.split('__')[0];
+
+    const queryParams = rison.encode({
+      keys: ['none'],
+      columns: ['name', 'schema', 'database.id', 'select_star'],
+    });
+
     return SupersetClient.get({
-      endpoint: `/api/v1/dataset/${datasetId}?q=(keys:!(none))`,
+      endpoint: `/api/v1/dataset/${datasetId}?q=${queryParams}`,
     })
       .then(({ json }) =>
         dispatch(
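
For reference, the rison object above collapses into a compact query string so
the endpoint returns only the columns SQL Lab needs. A minimal sketch of the
equivalent encoding on the Python side, using the `prison` package (a Python
rison implementation; the frontend uses the JS `rison` module, but the wire
format is the same, and the dataset id 42 below is a placeholder):

    # Sketch only: prison.dumps() produces the same rison text the frontend
    # sends in the ?q= parameter.
    import prison

    query_params = prison.dumps(
        {"keys": ["none"], "columns": ["name", "schema", "database.id", "select_star"]}
    )
    # Roughly: (columns:!(name,schema,database.id,select_star),keys:!(none))
    print(f"/api/v1/dataset/42?q={query_params}")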


(superset) 09/11: fix: allow for backward compatible errors (#25640)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 2f468900c87c6d384d928815cf85e572215b81c5
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Fri Oct 27 11:04:33 2023 -0700

    fix: allow for backward compatible errors (#25640)
---
 .../src/components/Datasource/DatasourceEditor.jsx |   2 +-
 .../components/Datasource/DatasourceModal.test.jsx | 156 ++++++++++++---------
 .../src/components/Datasource/DatasourceModal.tsx  |  26 +++-
 .../src/components/ErrorMessage/types.ts           |   2 +-
 superset-frontend/src/utils/errorMessages.ts       |   1 +
 5 files changed, 115 insertions(+), 72 deletions(-)

diff --git a/superset-frontend/src/components/Datasource/DatasourceEditor.jsx b/superset-frontend/src/components/Datasource/DatasourceEditor.jsx
index ffbba4c7db..195545a2f6 100644
--- a/superset-frontend/src/components/Datasource/DatasourceEditor.jsx
+++ b/superset-frontend/src/components/Datasource/DatasourceEditor.jsx
@@ -1386,7 +1386,7 @@ class DatasourceEditor extends React.PureComponent {
     const { theme } = this.props;
 
     return (
-      <DatasourceContainer>
+      <DatasourceContainer data-test="datasource-editor">
         {this.renderErrors()}
         <Alert
           css={theme => ({ marginBottom: theme.gridUnit * 4 })}
diff --git a/superset-frontend/src/components/Datasource/DatasourceModal.test.jsx b/superset-frontend/src/components/Datasource/DatasourceModal.test.jsx
index 5bcb705b68..6d991f24a0 100644
--- a/superset-frontend/src/components/Datasource/DatasourceModal.test.jsx
+++ b/superset-frontend/src/components/Datasource/DatasourceModal.test.jsx
@@ -18,30 +18,35 @@
  */
 import React from 'react';
 import { act } from 'react-dom/test-utils';
-import { mount } from 'enzyme';
-import { Provider } from 'react-redux';
+import {
+  render,
+  screen,
+  waitFor,
+  fireEvent,
+  cleanup,
+} from '@testing-library/react';
 import fetchMock from 'fetch-mock';
+import { Provider } from 'react-redux';
 import sinon from 'sinon';
-import { supersetTheme, ThemeProvider } from '@superset-ui/core';
-
-import waitForComponentToPaint from 'spec/helpers/waitForComponentToPaint';
+import {
+  supersetTheme,
+  ThemeProvider,
+  SupersetClient,
+} from '@superset-ui/core';
 import { defaultStore as store } from 'spec/helpers/testing-library';
-import Modal from 'src/components/Modal';
 import { DatasourceModal } from 'src/components/Datasource';
-import DatasourceEditor from 'src/components/Datasource/DatasourceEditor';
 import * as uiCore from '@superset-ui/core';
 import mockDatasource from 'spec/fixtures/mockDatasource';
-import { api } from 'src/hooks/apiResources/queryApi';
-
-const datasource = mockDatasource['7__table'];
 
+// Define your constants here
 const SAVE_ENDPOINT = 'glob:*/api/v1/dataset/7';
 const SAVE_PAYLOAD = { new: 'data' };
 const SAVE_DATASOURCE_ENDPOINT = 'glob:*/api/v1/dataset/7';
 const GET_DATASOURCE_ENDPOINT = SAVE_DATASOURCE_ENDPOINT;
+const GET_DATABASE_ENDPOINT = 'glob:*/api/v1/database/?q=*';
 
 const mockedProps = {
-  datasource,
+  datasource: mockDatasource['7__table'],
   addSuccessToast: () => {},
   addDangerToast: () => {},
   onChange: () => {},
@@ -50,80 +55,101 @@ const mockedProps = {
   onDatasourceSave: sinon.spy(),
 };
 
-async function mountAndWait(props = mockedProps) {
-  const mounted = mount(
+let container;
+let isFeatureEnabledMock;
+
+async function renderAndWait(props = mockedProps) {
+  const { container: renderedContainer } = render(
     <Provider store={store}>
-      <DatasourceModal {...props} />
+      <ThemeProvider theme={supersetTheme}>
+        <DatasourceModal {...props} />
+      </ThemeProvider>
     </Provider>,
-    {
-      wrappingComponent: ThemeProvider,
-      wrappingComponentProps: { theme: supersetTheme },
-    },
   );
-  await waitForComponentToPaint(mounted);
 
-  return mounted;
+  container = renderedContainer;
 }
 
+beforeEach(() => {
+  fetchMock.reset();
+  cleanup();
+  isFeatureEnabledMock = jest.spyOn(uiCore, 'isFeatureEnabled');
+  renderAndWait();
+  fetchMock.post(SAVE_ENDPOINT, SAVE_PAYLOAD);
+  fetchMock.put(SAVE_DATASOURCE_ENDPOINT, {});
+  fetchMock.get(GET_DATASOURCE_ENDPOINT, { result: {} });
+  fetchMock.get(GET_DATABASE_ENDPOINT, { result: [] });
+});
+
+afterEach(() => {
+  isFeatureEnabledMock.mockRestore();
+});
+
 describe('DatasourceModal', () => {
-  let wrapper;
-  let isFeatureEnabledMock;
-  beforeEach(async () => {
-    isFeatureEnabledMock = jest.spyOn(uiCore, 'isFeatureEnabled');
-    fetchMock.reset();
-    wrapper = await mountAndWait();
+  it('renders', async () => {
+    expect(container).toBeDefined();
   });
 
-  afterAll(() => {
-    isFeatureEnabledMock.restore();
-    act(() => {
-      store.dispatch(api.util.resetApiState());
-    });
+  it('renders the component', () => {
+    expect(screen.getByText('Edit Dataset')).toBeInTheDocument();
   });
 
-  it('renders', () => {
-    expect(wrapper.find(DatasourceModal)).toExist();
+  it('renders a Modal', async () => {
+    expect(screen.getByRole('dialog')).toBeInTheDocument();
   });
 
-  it('renders a Modal', () => {
-    expect(wrapper.find(Modal)).toExist();
+  it('renders a DatasourceEditor', async () => {
+    expect(screen.getByTestId('datasource-editor')).toBeInTheDocument();
   });
 
-  it('renders a DatasourceEditor', () => {
-    expect(wrapper.find(DatasourceEditor)).toExist();
+  it('renders a legacy data source btn', () => {
+    const button = screen.getByTestId('datasource-modal-legacy-edit');
+    expect(button).toBeInTheDocument();
   });
 
-  it('saves on confirm', async () => {
-    const callsP = fetchMock.post(SAVE_ENDPOINT, SAVE_PAYLOAD);
-    fetchMock.put(SAVE_DATASOURCE_ENDPOINT, {});
-    fetchMock.get(GET_DATASOURCE_ENDPOINT, {});
-    act(() => {
-      wrapper
-        .find('button[data-test="datasource-modal-save"]')
-        .props()
-        .onClick();
+  it('disables the save button when the datasource is managed externally', () => {
+    // the render is currently in a before operation, so it needs to be cleaned up
+    // we could alternatively move all the renders back into the tests or find a better
+    // way to automatically render but still allow to pass in props with the tests
+    cleanup();
+
+    renderAndWait({
+      ...mockedProps,
+      datasource: { ...mockedProps.datasource, is_managed_externally: true },
     });
-    await waitForComponentToPaint(wrapper);
-    act(() => {
-      const okButton = wrapper.find(
-        '.ant-modal-confirm .ant-modal-confirm-btns .ant-btn-primary',
-      );
-      okButton.simulate('click');
+    const saveButton = screen.getByTestId('datasource-modal-save');
+    expect(saveButton).toBeDisabled();
+  });
+
+  it('calls the onDatasourceSave function when the save button is clicked', async () => {
+    cleanup();
+    const onDatasourceSave = jest.fn();
+
+    renderAndWait({ ...mockedProps, onDatasourceSave });
+    const saveButton = screen.getByTestId('datasource-modal-save');
+    await act(async () => {
+      fireEvent.click(saveButton);
+      const okButton = await screen.findByRole('button', { name: 'OK' });
+      okButton.click();
+    });
+    await waitFor(() => {
+      expect(onDatasourceSave).toHaveBeenCalled();
     });
-    await waitForComponentToPaint(wrapper);
-    // one call to PUT, then one to GET
-    const expected = [
-      'http://localhost/api/v1/dataset/7',
-      'http://localhost/api/v1/dataset/7',
-    ];
-    expect(callsP._calls.map(call => call[0])).toEqual(
-      expected,
-    ); /* eslint no-underscore-dangle: 0 */
   });
 
-  it('renders a legacy data source btn', () => {
-    expect(
-      wrapper.find('button[data-test="datasource-modal-legacy-edit"]'),
-    ).toExist();
+  it.only('should render error dialog', async () => {
+    jest
+      .spyOn(SupersetClient, 'put')
+      .mockRejectedValue(new Error('Something went wrong'));
+    await act(async () => {
+      const saveButton = screen.getByTestId('datasource-modal-save');
+      fireEvent.click(saveButton);
+      const okButton = await screen.findByRole('button', { name: 'OK' });
+      okButton.click();
+    });
+    await act(async () => {
+      const errorTitle = await screen.findByText('Error saving dataset');
+      expect(errorTitle).toBeInTheDocument();
+    });
   });
 });
diff --git a/superset-frontend/src/components/Datasource/DatasourceModal.tsx b/superset-frontend/src/components/Datasource/DatasourceModal.tsx
index f9c40c47ba..031609e09a 100644
--- a/superset-frontend/src/components/Datasource/DatasourceModal.tsx
+++ b/superset-frontend/src/components/Datasource/DatasourceModal.tsx
@@ -28,12 +28,13 @@ import {
   SupersetClient,
   t,
 } from '@superset-ui/core';
-
 import Modal from 'src/components/Modal';
 import AsyncEsmComponent from 'src/components/AsyncEsmComponent';
-import { getClientErrorObject } from 'src/utils/getClientErrorObject';
+import { SupersetError } from 'src/components/ErrorMessage/types';
+import ErrorMessageWithStackTrace from 'src/components/ErrorMessage/ErrorMessageWithStackTrace';
 import withToasts from 'src/components/MessageToasts/withToasts';
 import { useSelector } from 'react-redux';
+import { getClientErrorObject } from 'src/utils/getClientErrorObject';
 
 const DatasourceEditor = AsyncEsmComponent(() => import('./DatasourceEditor'));
 
@@ -202,11 +203,26 @@ const DatasourceModal: FunctionComponent<DatasourceModalProps> = ({
       })
       .catch(response => {
         setIsSaving(false);
-        getClientErrorObject(response).then(({ error }) => {
+        getClientErrorObject(response).then(error => {
+          let errorResponse: SupersetError | undefined;
+          let errorText: string | undefined;
+          // sip-40 error response
+          if (error?.errors?.length) {
+            errorResponse = error.errors[0];
+          } else if (typeof error.error === 'string') {
+            // backward compatible with old error messages
+            errorText = error.error;
+          }
           modal.error({
-            title: t('Error'),
-            content: error || t('An error has occurred'),
+            title: t('Error saving dataset'),
             okButtonProps: { danger: true, className: 'btn-danger' },
+            content: (
+              <ErrorMessageWithStackTrace
+                error={errorResponse}
+                source="crud"
+                fallback={errorText}
+              />
+            ),
           });
         });
       });
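
The catch handler above prefers a structured SIP-40 payload ({"errors": [...]})
and falls back to the legacy string field ({"error": "..."}). A minimal sketch
of that fallback order, written in Python for illustration; `pick_error` is a
hypothetical helper, not Superset code:

    # Sketch of the fallback: structured SIP-40 errors win, legacy strings
    # are still accepted for backward compatibility.
    from typing import Optional

    def pick_error(payload: dict) -> tuple[Optional[dict], Optional[str]]:
        errors = payload.get("errors") or []
        if errors:  # SIP-40 shape: {"errors": [{...}, ...]}
            return errors[0], None
        legacy = payload.get("error")
        if isinstance(legacy, str):  # legacy shape: {"error": "message"}
            return None, legacy
        return None, None

    assert pick_error({"errors": [{"error_type": "GENERIC_DB_ENGINE_ERROR"}]})[0]
    assert pick_error({"error": "Something went wrong"})[1] == "Something went wrong"
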
diff --git a/superset-frontend/src/components/ErrorMessage/types.ts b/superset-frontend/src/components/ErrorMessage/types.ts
index d3fe5bfdf7..4375a9dec1 100644
--- a/superset-frontend/src/components/ErrorMessage/types.ts
+++ b/superset-frontend/src/components/ErrorMessage/types.ts
@@ -88,7 +88,7 @@ export type ErrorType = ValueOf<typeof ErrorTypeEnum>;
 // Keep in sync with superset/views/errors.py
 export type ErrorLevel = 'info' | 'warning' | 'error';
 
-export type ErrorSource = 'dashboard' | 'explore' | 'sqllab';
+export type ErrorSource = 'dashboard' | 'explore' | 'sqllab' | 'crud';
 
 export type SupersetError<ExtraType = Record<string, any> | null> = {
   error_type: ErrorType;
diff --git a/superset-frontend/src/utils/errorMessages.ts b/superset-frontend/src/utils/errorMessages.ts
index 16a04105c4..d5bfbdc17b 100644
--- a/superset-frontend/src/utils/errorMessages.ts
+++ b/superset-frontend/src/utils/errorMessages.ts
@@ -16,6 +16,7 @@
  * specific language governing permissions and limitations
  * under the License.
  */
+
 // Error messages used in many places across applications
 const COMMON_ERR_MESSAGES = {
   SESSION_TIMED_OUT:


(superset) 02/11: fix: remove unnecessary redirect (#25679)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 315e75811f5dbbc33ac1220bd95277cfc89465ed
Author: Igor Khrol <ig...@automattic.com>
AuthorDate: Thu Oct 19 21:03:44 2023 +0300

    fix: remove unnecessary redirect (#25679)
    
    (cherry picked from commit da42bf2dbb82a40d5ffcc9bfdc46584cb36af616)
---
 superset-frontend/src/SqlLab/components/ResultSet/ResultSet.test.tsx    | 2 +-
 .../src/SqlLab/components/SaveDatasetModal/SaveDatasetModal.test.tsx    | 2 +-
 superset-frontend/src/SqlLab/components/SaveDatasetModal/index.tsx      | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/superset-frontend/src/SqlLab/components/ResultSet/ResultSet.test.tsx b/superset-frontend/src/SqlLab/components/ResultSet/ResultSet.test.tsx
index 5e2a0455b5..d823c586f7 100644
--- a/superset-frontend/src/SqlLab/components/ResultSet/ResultSet.test.tsx
+++ b/superset-frontend/src/SqlLab/components/ResultSet/ResultSet.test.tsx
@@ -95,7 +95,7 @@ const asyncRefetchResultsTableProps = {
     resultsKey: 'async results key',
   },
 };
-fetchMock.get('glob:*/api/v1/dataset?*', { result: [] });
+fetchMock.get('glob:*/api/v1/dataset/?*', { result: [] });
 
 const middlewares = [thunk];
 const mockStore = configureStore(middlewares);
diff --git a/superset-frontend/src/SqlLab/components/SaveDatasetModal/SaveDatasetModal.test.tsx b/superset-frontend/src/SqlLab/components/SaveDatasetModal/SaveDatasetModal.test.tsx
index 4cac5c6204..8568bf2080 100644
--- a/superset-frontend/src/SqlLab/components/SaveDatasetModal/SaveDatasetModal.test.tsx
+++ b/superset-frontend/src/SqlLab/components/SaveDatasetModal/SaveDatasetModal.test.tsx
@@ -39,7 +39,7 @@ const mockedProps = {
   datasource: testQuery,
 };
 
-fetchMock.get('glob:*/api/v1/dataset?*', {
+fetchMock.get('glob:*/api/v1/dataset/?*', {
   result: mockdatasets,
   dataset_count: 3,
 });
diff --git a/superset-frontend/src/SqlLab/components/SaveDatasetModal/index.tsx b/superset-frontend/src/SqlLab/components/SaveDatasetModal/index.tsx
index eba873c83b..1932798138 100644
--- a/superset-frontend/src/SqlLab/components/SaveDatasetModal/index.tsx
+++ b/superset-frontend/src/SqlLab/components/SaveDatasetModal/index.tsx
@@ -257,7 +257,7 @@ export const SaveDatasetModal = ({
       });
 
       return SupersetClient.get({
-        endpoint: `/api/v1/dataset?q=${queryParams}`,
+        endpoint: `/api/v1/dataset/?q=${queryParams}`,
       }).then(response => ({
         data: response.json.result.map(
           (r: { table_name: string; id: number; owners: [DatasetOwner] }) => ({
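
The only change in this commit is the trailing slash. Flask, which serves the
Superset REST API, registers the collection route with a trailing slash and
answers the slash-less form with a permanent redirect, so each lookup was
paying an extra round trip. A minimal standalone Flask sketch of that default
behaviour (not Superset code):

    # Sketch: with a trailing-slash route, Flask redirects the slash-less
    # URL (308 in recent Werkzeug releases, 301 in older ones).
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/api/v1/dataset/")
    def list_datasets():
        return jsonify(result=[])

    client = app.test_client()
    assert client.get("/api/v1/dataset/").status_code == 200
    assert client.get("/api/v1/dataset").status_code in (301, 308)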


(superset) 01/11: fix(sqllab): reinstate "Force trino client async execution" (#25680)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 5293f5521d795d9f97a7470b1b9bd97091a190f4
Author: Rob Moore <gi...@users.noreply.github.com>
AuthorDate: Thu Oct 19 14:38:13 2023 +0100

    fix(sqllab): reinstate "Force trino client async execution" (#25680)
---
 .../docs/databases/installing-database-drivers.mdx | 81 +++++++++++-----------
 docs/docs/frequently-asked-questions.mdx           |  2 +-
 docs/docs/installation/configuring-superset.mdx    |  4 +-
 superset/config.py                                 |  5 +-
 superset/db_engine_specs/base.py                   | 18 +++++
 superset/db_engine_specs/trino.py                  | 66 ++++++++++++++++--
 superset/sql_lab.py                                |  7 +-
 tests/unit_tests/db_engine_specs/test_trino.py     | 31 ++++++++-
 tests/unit_tests/sql_lab_test.py                   | 10 ++-
 9 files changed, 163 insertions(+), 61 deletions(-)

diff --git a/docs/docs/databases/installing-database-drivers.mdx b/docs/docs/databases/installing-database-drivers.mdx
index e4e972f064..57652db4b8 100644
--- a/docs/docs/databases/installing-database-drivers.mdx
+++ b/docs/docs/databases/installing-database-drivers.mdx
@@ -22,46 +22,47 @@ as well as the packages needed to connect to the databases you want to access th
 
 Some of the recommended packages are shown below. Please refer to [setup.py](https://github.com/apache/superset/blob/master/setup.py) for the versions that are compatible with Superset.
 
-| Database                                                  | PyPI package                                                                       | Connection String                                                                                           |
-| --------------------------------------------------------- | ---------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------- |
-| [Amazon Athena](/docs/databases/athena)                   | `pip install pyathena[pandas]` , `pip install PyAthenaJDBC`                        | `awsathena+rest://{aws_access_key_id}:{aws_secret_access_key}@athena.{region_name}.amazonaws.com/{ `        |
-| [Amazon DynamoDB](/docs/databases/dynamodb)               | `pip install pydynamodb`                                                           | `dynamodb://{access_key_id}:{secret_access_key}@dynamodb.{region_name}.amazonaws.com?connector=superset`    |
-| [Amazon Redshift](/docs/databases/redshift)               | `pip install sqlalchemy-redshift`                                                  | ` redshift+psycopg2://<userName>:<DBPassword>@<AWS End Point>:5439/<Database Name>`                         |
-| [Apache Drill](/docs/databases/drill)                     | `pip install sqlalchemy-drill`                                                     | `drill+sadrill:// For JDBC drill+jdbc://`                                                                   |
-| [Apache Druid](/docs/databases/druid)                     | `pip install pydruid`                                                              | `druid://<User>:<password>@<Host>:<Port-default-9088>/druid/v2/sql`                                         |
-| [Apache Hive](/docs/databases/hive)                       | `pip install pyhive`                                                               | `hive://hive@{hostname}:{port}/{database}`                                                                  |
-| [Apache Impala](/docs/databases/impala)                   | `pip install impyla`                                                               | `impala://{hostname}:{port}/{database}`                                                                     |
-| [Apache Kylin](/docs/databases/kylin)                     | `pip install kylinpy`                                                              | `kylin://<username>:<password>@<hostname>:<port>/<project>?<param1>=<value1>&<param2>=<value2>`             |
-| [Apache Pinot](/docs/databases/pinot)                     | `pip install pinotdb`                                                              | `pinot://BROKER:5436/query?server=http://CONTROLLER:5983/`                                                  |
-| [Apache Solr](/docs/databases/solr)                       | `pip install sqlalchemy-solr`                                                      | `solr://{username}:{password}@{hostname}:{port}/{server_path}/{collection}`                                 |
-| [Apache Spark SQL](/docs/databases/spark-sql)             | `pip install pyhive`                                                               | `hive://hive@{hostname}:{port}/{database}`                                                                  |
-| [Ascend.io](/docs/databases/ascend)                       | `pip install impyla`                                                               | `ascend://{username}:{password}@{hostname}:{port}/{database}?auth_mechanism=PLAIN;use_ssl=true`             |
-| [Azure MS SQL](/docs/databases/sql-server)                | `pip install pymssql`                                                              | `mssql+pymssql://UserName@presetSQL:TestPassword@presetSQL.database.windows.net:1433/TestSchema`            |
-| [Big Query](/docs/databases/bigquery)                     | `pip install sqlalchemy-bigquery`                                                  | `bigquery://{project_id}`                                                                                   |
-| [ClickHouse](/docs/databases/clickhouse)                  | `pip install clickhouse-connect`                                                   | `clickhousedb://{username}:{password}@{hostname}:{port}/{database}`                                         |
-| [CockroachDB](/docs/databases/cockroachdb)                | `pip install cockroachdb`                                                          | `cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable`                                           |
-| [Dremio](/docs/databases/dremio)                          | `pip install sqlalchemy_dremio`                                                    | `dremio://user:pwd@host:31010/`                                                                             |
-| [Elasticsearch](/docs/databases/elasticsearch)            | `pip install elasticsearch-dbapi`                                                  | `elasticsearch+http://{user}:{password}@{host}:9200/`                                                       |
-| [Exasol](/docs/databases/exasol)                          | `pip install sqlalchemy-exasol`                                                    | `exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC` |
-| [Google Sheets](/docs/databases/google-sheets)            | `pip install shillelagh[gsheetsapi]`                                               | `gsheets://`                                                                                                |
-| [Firebolt](/docs/databases/firebolt)                      | `pip install firebolt-sqlalchemy`                                                  | `firebolt://{username}:{password}@{database} or firebolt://{username}:{password}@{database}/{engine_name}`  |
-| [Hologres](/docs/databases/hologres)                      | `pip install psycopg2`                                                             | `postgresql+psycopg2://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                             |
-| [IBM Db2](/docs/databases/ibm-db2)                        | `pip install ibm_db_sa`                                                            | `db2+ibm_db://`                                                                                             |
-| [IBM Netezza Performance Server](/docs/databases/netezza) | `pip install nzalchemy`                                                            | `netezza+nzpy://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                                    |
-| [MySQL](/docs/databases/mysql)                            | `pip install mysqlclient`                                                          | `mysql://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                                           |
-| [Oracle](/docs/databases/oracle)                          | `pip install cx_Oracle`                                                            | `oracle://`                                                                                                 |
-| [PostgreSQL](/docs/databases/postgres)                    | `pip install psycopg2`                                                             | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                                      |
-| [Trino](/docs/databases/trino)                            | `pip install trino`                                                                | `trino://{username}:{password}@{hostname}:{port}/{catalog}`                                                 |
-| [Presto](/docs/databases/presto)                          | `pip install pyhive`                                                               | `presto://`                                                                                                 |
-| [SAP Hana](/docs/databases/hana)                          | `pip install hdbcli sqlalchemy-hana or pip install apache-superset[hana]`          | `hana://{username}:{password}@{host}:{port}`                                                                |
-| [StarRocks](/docs/databases/starrocks)                    | `pip install starrocks`                                                            | `starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>`                                          |
-| [Snowflake](/docs/databases/snowflake)                    | `pip install snowflake-sqlalchemy`                                                 | `snowflake://{user}:{password}@{account}.{region}/{database}?role={role}&warehouse={warehouse}`             |
-| SQLite                                                    | No additional library needed                                                       | `sqlite://`                                                                                                 |
-| [SQL Server](/docs/databases/sql-server)                  | `pip install pymssql`                                                              | `mssql+pymssql://`                                                                                                  |
-| [Teradata](/docs/databases/teradata)                      | `pip install teradatasqlalchemy`                                                   | `teradatasql://{user}:{password}@{host}`                                                                    |
-| [TimescaleDB](/docs/databases/timescaledb)                | `pip install psycopg2`                                                             | `postgresql://<UserName>:<DBPassword>@<Database Host>:<Port>/<Database Name>`                               |
-| [Vertica](/docs/databases/vertica)                        | `pip install sqlalchemy-vertica-python`                                            | `vertica+vertica_python://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                          |
-| [YugabyteDB](/docs/databases/yugabytedb)                  | `pip install psycopg2`                                                             | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                                      |
+| Database                                                  | PyPI package                                                              | Connection String                                                                                           |
+| --------------------------------------------------------- | ------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------- |
+| [Amazon Athena](/docs/databases/athena)                   | `pip install pyathena[pandas]` , `pip install PyAthenaJDBC`               | `awsathena+rest://{aws_access_key_id}:{aws_secret_access_key}@athena.{region_name}.amazonaws.com/{ `        |
+| [Amazon DynamoDB](/docs/databases/dynamodb)               | `pip install pydynamodb`                                                  | `dynamodb://{access_key_id}:{secret_access_key}@dynamodb.{region_name}.amazonaws.com?connector=superset`    |
+| [Amazon Redshift](/docs/databases/redshift)               | `pip install sqlalchemy-redshift`                                         | ` redshift+psycopg2://<userName>:<DBPassword>@<AWS End Point>:5439/<Database Name>`                         |
+| [Apache Drill](/docs/databases/drill)                     | `pip install sqlalchemy-drill`                                            | `drill+sadrill:// For JDBC drill+jdbc://`                                                                   |
+| [Apache Druid](/docs/databases/druid)                     | `pip install pydruid`                                                     | `druid://<User>:<password>@<Host>:<Port-default-9088>/druid/v2/sql`                                         |
+| [Apache Hive](/docs/databases/hive)                       | `pip install pyhive`                                                      | `hive://hive@{hostname}:{port}/{database}`                                                                  |
+| [Apache Impala](/docs/databases/impala)                   | `pip install impyla`                                                      | `impala://{hostname}:{port}/{database}`                                                                     |
+| [Apache Kylin](/docs/databases/kylin)                     | `pip install kylinpy`                                                     | `kylin://<username>:<password>@<hostname>:<port>/<project>?<param1>=<value1>&<param2>=<value2>`             |
+| [Apache Pinot](/docs/databases/pinot)                     | `pip install pinotdb`                                                     | `pinot://BROKER:5436/query?server=http://CONTROLLER:5983/`                                                  |
+| [Apache Solr](/docs/databases/solr)                       | `pip install sqlalchemy-solr`                                             | `solr://{username}:{password}@{hostname}:{port}/{server_path}/{collection}`                                 |
+| [Apache Spark SQL](/docs/databases/spark-sql)             | `pip install pyhive`                                                      | `hive://hive@{hostname}:{port}/{database}`                                                                  |
+| [Ascend.io](/docs/databases/ascend)                       | `pip install impyla`                                                      | `ascend://{username}:{password}@{hostname}:{port}/{database}?auth_mechanism=PLAIN;use_ssl=true`             |
+| [Azure MS SQL](/docs/databases/sql-server)                | `pip install pymssql`                                                     | `mssql+pymssql://UserName@presetSQL:TestPassword@presetSQL.database.windows.net:1433/TestSchema`            |
+| [Big Query](/docs/databases/bigquery)                     | `pip install sqlalchemy-bigquery`                                         | `bigquery://{project_id}`                                                                                   |
+| [ClickHouse](/docs/databases/clickhouse)                  | `pip install clickhouse-connect`                                          | `clickhousedb://{username}:{password}@{hostname}:{port}/{database}`                                         |
+| [CockroachDB](/docs/databases/cockroachdb)                | `pip install cockroachdb`                                                 | `cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable`                                           |
+| [Dremio](/docs/databases/dremio)                          | `pip install sqlalchemy_dremio`                                           | `dremio://user:pwd@host:31010/`                                                                             |
+| [Elasticsearch](/docs/databases/elasticsearch)            | `pip install elasticsearch-dbapi`                                         | `elasticsearch+http://{user}:{password}@{host}:9200/`                                                       |
+| [Exasol](/docs/databases/exasol)                          | `pip install sqlalchemy-exasol`                                           | `exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC` |
+| [Google Sheets](/docs/databases/google-sheets)            | `pip install shillelagh[gsheetsapi]`                                      | `gsheets://`                                                                                                |
+| [Firebolt](/docs/databases/firebolt)                      | `pip install firebolt-sqlalchemy`                                         | `firebolt://{username}:{password}@{database} or firebolt://{username}:{password}@{database}/{engine_name}`  |
+| [Hologres](/docs/databases/hologres)                      | `pip install psycopg2`                                                    | `postgresql+psycopg2://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                             |
+| [IBM Db2](/docs/databases/ibm-db2)                        | `pip install ibm_db_sa`                                                   | `db2+ibm_db://`                                                                                             |
+| [IBM Netezza Performance Server](/docs/databases/netezza) | `pip install nzalchemy`                                                   | `netezza+nzpy://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                                    |
+| [MySQL](/docs/databases/mysql)                            | `pip install mysqlclient`                                                 | `mysql://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                                           |
+| [Oracle](/docs/databases/oracle)                          | `pip install cx_Oracle`                                                   | `oracle://`                                                                                                 |
+| [PostgreSQL](/docs/databases/postgres)                    | `pip install psycopg2`                                                    | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                                      |
+| [Trino](/docs/databases/trino)                            | `pip install trino`                                                       | `trino://{username}:{password}@{hostname}:{port}/{catalog}`                                                 |
+| [Presto](/docs/databases/presto)                          | `pip install pyhive`                                                      | `presto://`                                                                                                 |
+| [SAP Hana](/docs/databases/hana)                          | `pip install hdbcli sqlalchemy-hana or pip install apache-superset[hana]` | `hana://{username}:{password}@{host}:{port}`                                                                |
+| [StarRocks](/docs/databases/starrocks)                    | `pip install starrocks`                                                   | `starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>`                                          |
+| [Snowflake](/docs/databases/snowflake)                    | `pip install snowflake-sqlalchemy`                                        | `snowflake://{user}:{password}@{account}.{region}/{database}?role={role}&warehouse={warehouse}`             |
+| SQLite                                                    | No additional library needed                                              | `sqlite://path/to/file.db?check_same_thread=false`                                                          |
+| [SQL Server](/docs/databases/sql-server)                  | `pip install pymssql`                                                     | `mssql+pymssql://`                                                                                          |
+| [Teradata](/docs/databases/teradata)                      | `pip install teradatasqlalchemy`                                          | `teradatasql://{user}:{password}@{host}`                                                                    |
+| [TimescaleDB](/docs/databases/timescaledb)                | `pip install psycopg2`                                                    | `postgresql://<UserName>:<DBPassword>@<Database Host>:<Port>/<Database Name>`                               |
+| [Vertica](/docs/databases/vertica)                        | `pip install sqlalchemy-vertica-python`                                   | `vertica+vertica_python://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                          |
+| [YugabyteDB](/docs/databases/yugabytedb)                  | `pip install psycopg2`                                                    | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>`                                      |
+
 ---
 
 Note that many other databases are supported, the main criteria being the existence of a functional
diff --git a/docs/docs/frequently-asked-questions.mdx b/docs/docs/frequently-asked-questions.mdx
index bbb94d617b..79a0863b08 100644
--- a/docs/docs/frequently-asked-questions.mdx
+++ b/docs/docs/frequently-asked-questions.mdx
@@ -168,7 +168,7 @@ Another workaround is to change where superset stores the sqlite database by add
 `superset_config.py`:
 
 ```
-SQLALCHEMY_DATABASE_URI = 'sqlite:////new/location/superset.db'
+SQLALCHEMY_DATABASE_URI = 'sqlite:////new/location/superset.db?check_same_thread=false'
 ```
 
 You can read more about customizing Superset using the configuration file
diff --git a/docs/docs/installation/configuring-superset.mdx b/docs/docs/installation/configuring-superset.mdx
index 9cb3aaefac..c6108d6f59 100644
--- a/docs/docs/installation/configuring-superset.mdx
+++ b/docs/docs/installation/configuring-superset.mdx
@@ -32,7 +32,9 @@ SECRET_KEY = 'YOUR_OWN_RANDOM_GENERATED_SECRET_KEY'
 # superset metadata (slices, connections, tables, dashboards, ...).
 # Note that the connection information to connect to the datasources
 # you want to explore are managed directly in the web UI
-SQLALCHEMY_DATABASE_URI = 'sqlite:////path/to/superset.db'
+# The check_same_thread=false property ensures the sqlite client does not attempt
+# to enforce single-threaded access, which may be problematic in some edge cases
+SQLALCHEMY_DATABASE_URI = 'sqlite:////path/to/superset.db?check_same_thread=false'
 
 # Flask-WTF flag for CSRF
 WTF_CSRF_ENABLED = True
diff --git a/superset/config.py b/superset/config.py
index 27f78832d1..73553fcc6c 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -186,7 +186,10 @@ SQLALCHEMY_TRACK_MODIFICATIONS = False
 SECRET_KEY = os.environ.get("SUPERSET_SECRET_KEY") or CHANGE_ME_SECRET_KEY
 
 # The SQLAlchemy connection string.
-SQLALCHEMY_DATABASE_URI = "sqlite:///" + os.path.join(DATA_DIR, "superset.db")
+SQLALCHEMY_DATABASE_URI = (
+    f"""sqlite:///{os.path.join(DATA_DIR, "superset.db")}?check_same_thread=false"""
+)
+
 # SQLALCHEMY_DATABASE_URI = 'mysql://myapp@localhost/myapp'
 # SQLALCHEMY_DATABASE_URI = 'postgresql://root:password@localhost/myapp'
 
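SQLAlchemy's pysqlite dialect forwards the ?check_same_thread=false URL flag
to the underlying sqlite3.connect() call. A minimal sketch of the effect with
the standard library alone, assuming a local superset.db file:

    # check_same_thread=False lets a sqlite3 connection be used from a
    # thread other than the one that created it; without it the second
    # thread raises sqlite3.ProgrammingError.
    import sqlite3
    import threading

    conn = sqlite3.connect("superset.db", check_same_thread=False)

    def query() -> None:
        conn.execute("SELECT 1").fetchone()

    worker = threading.Thread(target=query)
    worker.start()
    worker.join()
    conn.close()
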
diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py
index 5836e6163f..6be3ab24b0 100644
--- a/superset/db_engine_specs/base.py
+++ b/superset/db_engine_specs/base.py
@@ -1053,6 +1053,24 @@ class BaseEngineSpec:  # pylint: disable=too-many-public-methods
         query object"""
         # TODO: Fix circular import error caused by importing sql_lab.Query
 
+    @classmethod
+    def execute_with_cursor(
+        cls, cursor: Any, sql: str, query: Query, session: Session
+    ) -> None:
+        """
+        Trigger execution of a query and handle the resulting cursor.
+
+        For most implementations this just makes calls to `execute` and
+        `handle_cursor` consecutively, but in some engines (e.g. Trino) we may
+        need to handle client limitations such as lack of async support and
+        perform a more complicated operation to get information from the cursor
+        in a timely manner and facilitate operations such as query stop
+        """
+        logger.debug("Query %d: Running query: %s", query.id, sql)
+        cls.execute(cursor, sql, async_=True)
+        logger.debug("Query %d: Handling cursor", query.id)
+        cls.handle_cursor(cursor, query, session)
+
     @classmethod
     def extract_error_message(cls, ex: Exception) -> str:
         return f"{cls.engine} error: {cls._extract_error_message(ex)}"
diff --git a/superset/db_engine_specs/trino.py b/superset/db_engine_specs/trino.py
index eff78c4fa4..f758f1fadd 100644
--- a/superset/db_engine_specs/trino.py
+++ b/superset/db_engine_specs/trino.py
@@ -17,6 +17,8 @@
 from __future__ import annotations
 
 import logging
+import threading
+import time
 from typing import Any, TYPE_CHECKING
 
 import simplejson as json
@@ -154,14 +156,21 @@ class TrinoEngineSpec(PrestoBaseEngineSpec):
 
     @classmethod
     def handle_cursor(cls, cursor: Cursor, query: Query, session: Session) -> None:
-        if tracking_url := cls.get_tracking_url(cursor):
-            query.tracking_url = tracking_url
+        """
+        Handle a trino client cursor.
+
+        WARNING: if you execute a query, it will block until complete and you
+        will not be able to handle the cursor until complete. Use
+        `execute_with_cursor` instead, to handle this asynchronously.
+        """
 
         # Adds the executed query id to the extra payload so the query can be cancelled
-        query.set_extra_json_key(
-            key=QUERY_CANCEL_KEY,
-            value=(cancel_query_id := cursor.stats["queryId"]),
-        )
+        cancel_query_id = cursor.query_id
+        logger.debug("Query %d: queryId %s found in cursor", query.id, cancel_query_id)
+        query.set_extra_json_key(key=QUERY_CANCEL_KEY, value=cancel_query_id)
+
+        if tracking_url := cls.get_tracking_url(cursor):
+            query.tracking_url = tracking_url
 
         session.commit()
 
@@ -176,6 +185,51 @@ class TrinoEngineSpec(PrestoBaseEngineSpec):
 
         super().handle_cursor(cursor=cursor, query=query, session=session)
 
+    @classmethod
+    def execute_with_cursor(
+        cls, cursor: Any, sql: str, query: Query, session: Session
+    ) -> None:
+        """
+        Trigger execution of a query and handle the resulting cursor.
+
+        Trino's client blocks until the query is complete, so we need to run it
+        in another thread and invoke `handle_cursor` to poll for the query ID
+        to appear on the cursor in parallel.
+        """
+        execute_result: dict[str, Any] = {}
+
+        def _execute(results: dict[str, Any]) -> None:
+            logger.debug("Query %d: Running query: %s", query.id, sql)
+
+            # Pass result / exception information back to the parent thread
+            try:
+                cls.execute(cursor, sql)
+                results["complete"] = True
+            except Exception as ex:  # pylint: disable=broad-except
+                results["complete"] = True
+                results["error"] = ex
+
+        execute_thread = threading.Thread(target=_execute, args=(execute_result,))
+        execute_thread.start()
+
+        # Wait for a query ID to be available before handling the cursor, as
+        # it's required by that method; it may never become available on error.
+        while not cursor.query_id and not execute_result.get("complete"):
+            time.sleep(0.1)
+
+        logger.debug("Query %d: Handling cursor", query.id)
+        cls.handle_cursor(cursor, query, session)
+
+        # Block until the query completes; same behaviour as the client itself
+        logger.debug("Query %d: Waiting for query to complete", query.id)
+        while not execute_result.get("complete"):
+            time.sleep(0.5)
+
+        # Unfortunately we'll mangle the stack trace due to the thread, but
+        # throwing the original exception allows mapping database errors as normal
+        if err := execute_result.get("error"):
+            raise err
+
     @classmethod
     def prepare_cancel_query(cls, query: Query, session: Session) -> None:
         if QUERY_CANCEL_KEY not in query.extra:
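
The pattern above — run the blocking client call in a worker thread, poll the
cursor until a query ID appears, hand off to handle_cursor, then block until
completion — can be sketched independently of Trino. A self-contained sketch;
FakeCursor is hypothetical and stands in for the blocking trino client cursor:

    # Sketch of the execute-in-thread / poll-for-query-id pattern.
    import threading
    import time
    from typing import Any, Optional

    class FakeCursor:
        def __init__(self) -> None:
            self.query_id: Optional[str] = None

        def execute(self, sql: str) -> None:
            time.sleep(0.2)                  # blocking client call starts
            self.query_id = "fake-query-id"  # ID appears while still running
            time.sleep(0.2)                  # query finishes later

    def execute_with_polling(cursor: FakeCursor, sql: str) -> None:
        result: dict[str, Any] = {}

        def _execute() -> None:
            try:
                cursor.execute(sql)
            except Exception as ex:  # pass failure back to the parent thread
                result["error"] = ex
            finally:
                result["complete"] = True

        threading.Thread(target=_execute).start()
        # The ID may never appear if execution fails, so also watch "complete"
        while not cursor.query_id and not result.get("complete"):
            time.sleep(0.05)
        print("query id:", cursor.query_id)  # handle_cursor() would run here
        while not result.get("complete"):    # same blocking behaviour as client
            time.sleep(0.05)
        if err := result.get("error"):
            raise err                        # surface the worker's exception

    execute_with_polling(FakeCursor(), "SELECT 1")
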
diff --git a/superset/sql_lab.py b/superset/sql_lab.py
index afc682b10f..ca157b3240 100644
--- a/superset/sql_lab.py
+++ b/superset/sql_lab.py
@@ -191,7 +191,7 @@ def get_sql_results(  # pylint: disable=too-many-arguments
                 return handle_query_error(ex, query, session)
 
 
-def execute_sql_statement(  # pylint: disable=too-many-arguments,too-many-statements
+def execute_sql_statement(  # pylint: disable=too-many-arguments
     sql_statement: str,
     query: Query,
     session: Session,
@@ -271,10 +271,7 @@ def execute_sql_statement(  # pylint: disable=too-many-arguments,too-many-statem
             )
         session.commit()
         with stats_timing("sqllab.query.time_executing_query", stats_logger):
-            logger.debug("Query %d: Running query: %s", query.id, sql)
-            db_engine_spec.execute(cursor, sql, async_=True)
-            logger.debug("Query %d: Handling cursor", query.id)
-            db_engine_spec.handle_cursor(cursor, query, session)
+            db_engine_spec.execute_with_cursor(cursor, sql, query, session)
 
         with stats_timing("sqllab.query.time_fetching_results", stats_logger):
             logger.debug(
diff --git a/tests/unit_tests/db_engine_specs/test_trino.py b/tests/unit_tests/db_engine_specs/test_trino.py
index 963953d18b..1b50a683a0 100644
--- a/tests/unit_tests/db_engine_specs/test_trino.py
+++ b/tests/unit_tests/db_engine_specs/test_trino.py
@@ -352,7 +352,7 @@ def test_handle_cursor_early_cancel(
     query_id = "myQueryId"
 
     cursor_mock = engine_mock.return_value.__enter__.return_value
-    cursor_mock.stats = {"queryId": query_id}
+    cursor_mock.query_id = query_id
     session_mock = mocker.MagicMock()
 
     query = Query()
@@ -366,3 +366,32 @@ def test_handle_cursor_early_cancel(
         assert cancel_query_mock.call_args[1]["cancel_query_id"] == query_id
     else:
         assert cancel_query_mock.call_args is None
+
+
+def test_execute_with_cursor_in_parallel(mocker: MockerFixture):
+    """Test that `execute_with_cursor` fetches query ID from the cursor"""
+    from superset.db_engine_specs.trino import TrinoEngineSpec
+
+    query_id = "myQueryId"
+
+    mock_cursor = mocker.MagicMock()
+    mock_cursor.query_id = None
+
+    mock_query = mocker.MagicMock()
+    mock_session = mocker.MagicMock()
+
+    def _mock_execute(*args, **kwargs):
+        mock_cursor.query_id = query_id
+
+    mock_cursor.execute.side_effect = _mock_execute
+
+    TrinoEngineSpec.execute_with_cursor(
+        cursor=mock_cursor,
+        sql="SELECT 1 FROM foo",
+        query=mock_query,
+        session=mock_session,
+    )
+
+    mock_query.set_extra_json_key.assert_called_once_with(
+        key=QUERY_CANCEL_KEY, value=query_id
+    )
diff --git a/tests/unit_tests/sql_lab_test.py b/tests/unit_tests/sql_lab_test.py
index 29f45eab68..edc1fd2ec4 100644
--- a/tests/unit_tests/sql_lab_test.py
+++ b/tests/unit_tests/sql_lab_test.py
@@ -55,8 +55,8 @@ def test_execute_sql_statement(mocker: MockerFixture, app: None) -> None:
     )
 
     database.apply_limit_to_sql.assert_called_with("SELECT 42 AS answer", 2, force=True)
-    db_engine_spec.execute.assert_called_with(
-        cursor, "SELECT 42 AS answer LIMIT 2", async_=True
+    db_engine_spec.execute_with_cursor.assert_called_with(
+        cursor, "SELECT 42 AS answer LIMIT 2", query, session
     )
     SupersetResultSet.assert_called_with([(42,)], cursor.description, db_engine_spec)
 
@@ -106,10 +106,8 @@ def test_execute_sql_statement_with_rls(
         101,
         force=True,
     )
-    db_engine_spec.execute.assert_called_with(
-        cursor,
-        "SELECT * FROM sales WHERE organization_id=42 LIMIT 101",
-        async_=True,
+    db_engine_spec.execute_with_cursor.assert_called_with(
+        cursor, "SELECT * FROM sales WHERE organization_id=42 LIMIT 101", query, session
     )
     SupersetResultSet.assert_called_with([(42,)], cursor.description, db_engine_spec)
 


(superset) 10/11: fix: DB-specific quoting in Jinja macro (#25779)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 1d403dab9822a8cee6108669c53e53fad881c751
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Mon Oct 30 09:50:44 2023 -0400

    fix: DB-specific quoting in Jinja macro (#25779)
    
    (cherry picked from commit 5659c87ed2da1ebafe3578cac9c3c52aeb256c5d)
---
 superset/jinja_context.py              | 45 +++++++++++++++++++++++-----------
 tests/unit_tests/jinja_context_test.py |  9 +++++--
 2 files changed, 38 insertions(+), 16 deletions(-)

diff --git a/superset/jinja_context.py b/superset/jinja_context.py
index 4bb0b91a4e..a736b9278e 100644
--- a/superset/jinja_context.py
+++ b/superset/jinja_context.py
@@ -25,6 +25,7 @@ from flask_babel import gettext as _
 from jinja2 import DebugUndefined
 from jinja2.sandbox import SandboxedEnvironment
 from sqlalchemy.engine.interfaces import Dialect
+from sqlalchemy.sql.expression import bindparam
 from sqlalchemy.types import String
 from typing_extensions import TypedDict
 
@@ -397,23 +398,39 @@ def validate_template_context(
     return validate_context_types(context)
 
 
-def where_in(values: list[Any], mark: str = "'") -> str:
-    """
-    Given a list of values, build a parenthesis list suitable for an IN expression.
+class WhereInMacro:  # pylint: disable=too-few-public-methods
+    def __init__(self, dialect: Dialect):
+        self.dialect = dialect
 
-        >>> where_in([1, "b", 3])
-        (1, 'b', 3)
+    def __call__(self, values: list[Any], mark: Optional[str] = None) -> str:
+        """
+        Given a list of values, build a parenthesis list suitable for an IN expression.
 
-    """
+            >>> from sqlalchemy.dialects import mysql
+            >>> where_in = WhereInMacro(dialect=mysql.dialect())
+            >>> where_in([1, "Joe's", 3])
+            (1, 'Joe''s', 3)
 
-    def quote(value: Any) -> str:
-        if isinstance(value, str):
-            value = value.replace(mark, mark * 2)
-            return f"{mark}{value}{mark}"
-        return str(value)
+        """
+        binds = [bindparam(f"value_{i}", value) for i, value in enumerate(values)]
+        string_representations = [
+            str(
+                bind.compile(
+                    dialect=self.dialect, compile_kwargs={"literal_binds": True}
+                )
+            )
+            for bind in binds
+        ]
+        joined_values = ", ".join(string_representations)
+        result = f"({joined_values})"
+
+        if mark:
+            result += (
+                "\n-- WARNING: the `mark` parameter was removed from the `where_in` "
+                "macro for security reasons\n"
+            )
 
-    joined_values = ", ".join(quote(value) for value in values)
-    return f"({joined_values})"
+        return result
 
 
 class BaseTemplateProcessor:
@@ -449,7 +466,7 @@ class BaseTemplateProcessor:
         self.set_context(**kwargs)
 
         # custom filters
-        self._env.filters["where_in"] = where_in
+        self._env.filters["where_in"] = WhereInMacro(database.get_dialect())
 
     def set_context(self, **kwargs: Any) -> None:
         self._context.update(kwargs)
diff --git a/tests/unit_tests/jinja_context_test.py b/tests/unit_tests/jinja_context_test.py
index fe4b144d2f..114f046300 100644
--- a/tests/unit_tests/jinja_context_test.py
+++ b/tests/unit_tests/jinja_context_test.py
@@ -20,17 +20,22 @@ import json
 
 import pytest
 from pytest_mock import MockFixture
+from sqlalchemy.dialects import mysql
 
 from superset.datasets.commands.exceptions import DatasetNotFoundError
-from superset.jinja_context import dataset_macro, where_in
+from superset.jinja_context import dataset_macro, WhereInMacro
 
 
 def test_where_in() -> None:
     """
     Test the ``where_in`` Jinja2 filter.
     """
+    where_in = WhereInMacro(mysql.dialect())
     assert where_in([1, "b", 3]) == "(1, 'b', 3)"
-    assert where_in([1, "b", 3], '"') == '(1, "b", 3)'
+    assert where_in([1, "b", 3], '"') == (
+        "(1, 'b', 3)\n-- WARNING: the `mark` parameter was removed from the "
+        "`where_in` macro for security reasons\n"
+    )
     assert where_in(["O'Malley's"]) == "('O''Malley''s')"
 
 


(superset) 05/11: fix(horizontal filter label): show full tooltip with ellipsis (#25732)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 9b31d97ac337bc02f4bfe5c3e5f091da685567a1
Author: Ross Mabbett <92...@users.noreply.github.com>
AuthorDate: Mon Oct 23 13:51:48 2023 -0300

    fix(horizontal filter label): show full tooltip with ellipsis (#25732)
    
    (cherry picked from commit e4173d90c8ccef58a87ec7ac00b57c1ec9317c11)
---
 .../nativeFilters/FilterBar/FilterControls/FilterControl.tsx       | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControl.tsx b/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControl.tsx
index 515fed1907..37739e5370 100644
--- a/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControl.tsx
+++ b/superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControl.tsx
@@ -112,6 +112,7 @@ const HorizontalOverflowFilterControlContainer = styled(
 
 const VerticalFormItem = styled(StyledFormItem)`
   .ant-form-item-label {
+    overflow: visible;
     label.ant-form-item-required:not(.ant-form-item-required-mark-optional) {
       &::after {
         display: none;
@@ -127,6 +128,7 @@ const HorizontalFormItem = styled(StyledFormItem)`
   }
 
   .ant-form-item-label {
+    overflow: visible;
     padding-bottom: 0;
     margin-right: ${({ theme }) => theme.gridUnit * 2}px;
     label.ant-form-item-required:not(.ant-form-item-required-mark-optional) {
@@ -200,10 +202,11 @@ const DescriptionToolTip = ({ description }: { description: string }) => (
       placement="right"
       overlayInnerStyle={{
         display: '-webkit-box',
-        overflow: 'hidden',
-        WebkitLineClamp: 20,
+        WebkitLineClamp: 10,
         WebkitBoxOrient: 'vertical',
+        overflow: 'hidden',
         textOverflow: 'ellipsis',
+        whiteSpace: 'normal',
       }}
       getPopupContainer={trigger => trigger.parentElement as HTMLElement}
     >


(superset) 03/11: fix(chore): dashboard requests to database equal the number of slices it has (#24709)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 8483ab6c42f90ae7c3a62661c3a2a84858af238d
Author: Stepan <66...@users.noreply.github.com>
AuthorDate: Fri Oct 20 10:32:14 2023 +0300

    fix(chore): dashboard requests to database equal the number of slices it has (#24709)
    
    (cherry picked from commit 75a74313799b70b636c88cf421fd4d1118cc8a61)
---
 superset/daos/dashboard.py | 2 --
 1 file changed, 2 deletions(-)

diff --git a/superset/daos/dashboard.py b/superset/daos/dashboard.py
index f9544aa53d..2c03711f25 100644
--- a/superset/daos/dashboard.py
+++ b/superset/daos/dashboard.py
@@ -68,8 +68,6 @@ class DashboardDAO(BaseDAO[Dashboard]):
             query = (
                 db.session.query(Dashboard)
                 .filter(id_or_slug_filter(id_or_slug))
-                .outerjoin(Slice, Dashboard.slices)
-                .outerjoin(Slice.table)
                 .outerjoin(Dashboard.owners)
                 .outerjoin(Dashboard.roles)
             )
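
The two dropped joins explain the commit title: outer-joining
`Dashboard.slices` (and each slice's table) fans the base query out to one
row per slice, so downstream per-row work scaled with the slice count. A
hedged before/after sketch, reusing the names from the patch with
hypothetical counts:

# Illustration only. Before the fix, a dashboard with 30 slices came
# back as 30 joined rows, and per-row processing issued ~30 requests.
# After the fix, the base query returns a single Dashboard row:
dashboard = (
    db.session.query(Dashboard)
    .filter(id_or_slug_filter(id_or_slug))
    .outerjoin(Dashboard.owners)
    .outerjoin(Dashboard.roles)
    .one()
)
# ...and the slices load once through the ORM relationship when needed.
slices = dashboard.slices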


(superset) 11/11: fix: Revert "fix: Apply normalization to all dttm columns (#25147)" (#25801)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 0c633f22e1746cfb995ac0fc16aa2ef147e711e7
Author: John Bodley <45...@users.noreply.github.com>
AuthorDate: Tue Oct 31 06:21:47 2023 -0700

    fix: Revert "fix: Apply normalization to all dttm columns (#25147)" (#25801)
---
 superset/common/query_context_factory.py           |  1 -
 superset/common/query_context_processor.py         |  5 +-
 superset/common/query_object_factory.py            | 67 +---------------
 tests/integration_tests/query_context_tests.py     |  8 +-
 .../unit_tests/common/test_query_object_factory.py | 90 +---------------------
 5 files changed, 10 insertions(+), 161 deletions(-)

diff --git a/superset/common/query_context_factory.py b/superset/common/query_context_factory.py
index 62e8b79893..e4680ed5ed 100644
--- a/superset/common/query_context_factory.py
+++ b/superset/common/query_context_factory.py
@@ -186,7 +186,6 @@ class QueryContextFactory:  # pylint: disable=too-few-public-methods
                     filter
                     for filter in query_object.filter
                     if filter["col"] != filter_to_remove
-                    or filter["op"] != "TEMPORAL_RANGE"
                 ]
 
     def _apply_filters(self, query_object: QueryObject) -> None:
diff --git a/superset/common/query_context_processor.py b/superset/common/query_context_processor.py
index 754c9ae91a..f6152b232a 100644
--- a/superset/common/query_context_processor.py
+++ b/superset/common/query_context_processor.py
@@ -285,11 +285,10 @@ class QueryContextProcessor:
         datasource = self._qc_datasource
         labels = tuple(
             label
-            for label in {
+            for label in [
                 *get_base_axis_labels(query_object.columns),
-                *[col for col in query_object.columns or [] if isinstance(col, str)],
                 query_object.granularity,
-            }
+            ]
             if datasource
             # Query datasource didn't support `get_column`
             and hasattr(datasource, "get_column")
diff --git a/superset/common/query_object_factory.py b/superset/common/query_object_factory.py
index a76431122e..ae85912cdf 100644
--- a/superset/common/query_object_factory.py
+++ b/superset/common/query_object_factory.py
@@ -16,24 +16,17 @@
 # under the License.
 from __future__ import annotations
 
-from datetime import datetime
 from typing import Any, TYPE_CHECKING
 
 from superset.common.chart_data import ChartDataResultType
 from superset.common.query_object import QueryObject
 from superset.common.utils.time_range_utils import get_since_until_from_time_range
-from superset.utils.core import (
-    apply_max_row_limit,
-    DatasourceDict,
-    DatasourceType,
-    FilterOperator,
-    QueryObjectFilterClause,
-)
+from superset.utils.core import apply_max_row_limit, DatasourceDict, DatasourceType
 
 if TYPE_CHECKING:
     from sqlalchemy.orm import sessionmaker
 
-    from superset.connectors.base.models import BaseColumn, BaseDatasource
+    from superset.connectors.base.models import BaseDatasource
     from superset.daos.datasource import DatasourceDAO
 
 
@@ -73,10 +66,6 @@ class QueryObjectFactory:  # pylint: disable=too-few-public-methods
         )
         kwargs["from_dttm"] = from_dttm
         kwargs["to_dttm"] = to_dttm
-        if datasource_model_instance and kwargs.get("filters", []):
-            kwargs["filters"] = self._process_filters(
-                datasource_model_instance, kwargs["filters"]
-            )
         return QueryObject(
             datasource=datasource_model_instance,
             extras=extras,
@@ -113,55 +102,3 @@ class QueryObjectFactory:  # pylint: disable=too-few-public-methods
     # light version of the view.utils.core
     # import view.utils require application context
     # Todo: move it and the view.utils.core to utils package
-
-    # pylint: disable=no-self-use
-    def _process_filters(
-        self, datasource: BaseDatasource, query_filters: list[QueryObjectFilterClause]
-    ) -> list[QueryObjectFilterClause]:
-        def get_dttm_filter_value(
-            value: Any, col: BaseColumn, date_format: str
-        ) -> int | str:
-            if not isinstance(value, int):
-                return value
-            if date_format in {"epoch_ms", "epoch_s"}:
-                if date_format == "epoch_s":
-                    value = str(value)
-                else:
-                    value = str(value * 1000)
-            else:
-                dttm = datetime.utcfromtimestamp(value / 1000)
-                value = dttm.strftime(date_format)
-
-            if col.type in col.num_types:
-                value = int(value)
-            return value
-
-        for query_filter in query_filters:
-            if query_filter.get("op") == FilterOperator.TEMPORAL_RANGE:
-                continue
-            filter_col = query_filter.get("col")
-            if not isinstance(filter_col, str):
-                continue
-            column = datasource.get_column(filter_col)
-            if not column:
-                continue
-            filter_value = query_filter.get("val")
-
-            date_format = column.python_date_format
-            if not date_format and datasource.db_extra:
-                date_format = datasource.db_extra.get(
-                    "python_date_format_by_column_name", {}
-                ).get(column.column_name)
-
-            if column.is_dttm and date_format:
-                if isinstance(filter_value, list):
-                    query_filter["val"] = [
-                        get_dttm_filter_value(value, column, date_format)
-                        for value in filter_value
-                    ]
-                else:
-                    query_filter["val"] = get_dttm_filter_value(
-                        filter_value, column, date_format
-                    )
-
-        return query_filters
diff --git a/tests/integration_tests/query_context_tests.py b/tests/integration_tests/query_context_tests.py
index 00a98b2c21..8c2082d1c4 100644
--- a/tests/integration_tests/query_context_tests.py
+++ b/tests/integration_tests/query_context_tests.py
@@ -836,9 +836,11 @@ def test_special_chars_in_column_name(app_context, physical_dataset):
 
     query_object = qc.queries[0]
     df = qc.get_df_payload(query_object)["df"]
-
-    # sqlite doesn't have timestamp columns
-    if query_object.datasource.database.backend != "sqlite":
+    if query_object.datasource.database.backend == "sqlite":
+        # sqlite returns string as timestamp column
+        assert df["time column with spaces"][0] == "2002-01-03 00:00:00"
+        assert df["I_AM_A_TRUNC_COLUMN"][0] == "2002-01-01 00:00:00"
+    else:
         assert df["time column with spaces"][0].strftime("%Y-%m-%d") == "2002-01-03"
         assert df["I_AM_A_TRUNC_COLUMN"][0].strftime("%Y-%m-%d") == "2002-01-01"
 
diff --git a/tests/unit_tests/common/test_query_object_factory.py b/tests/unit_tests/common/test_query_object_factory.py
index 4e8fadfe3e..02304828dc 100644
--- a/tests/unit_tests/common/test_query_object_factory.py
+++ b/tests/unit_tests/common/test_query_object_factory.py
@@ -43,45 +43,9 @@ def session_factory() -> Mock:
     return Mock()
 
 
-class SimpleDatasetColumn:
-    def __init__(self, col_params: dict[str, Any]):
-        self.__dict__.update(col_params)
-
-
-TEMPORAL_COLUMN_NAMES = ["temporal_column", "temporal_column_with_python_date_format"]
-TEMPORAL_COLUMNS = {
-    TEMPORAL_COLUMN_NAMES[0]: SimpleDatasetColumn(
-        {
-            "column_name": TEMPORAL_COLUMN_NAMES[0],
-            "is_dttm": True,
-            "python_date_format": None,
-            "type": "string",
-            "num_types": ["BIGINT"],
-        }
-    ),
-    TEMPORAL_COLUMN_NAMES[1]: SimpleDatasetColumn(
-        {
-            "column_name": TEMPORAL_COLUMN_NAMES[1],
-            "type": "BIGINT",
-            "is_dttm": True,
-            "python_date_format": "%Y",
-            "num_types": ["BIGINT"],
-        }
-    ),
-}
-
-
 @fixture
 def connector_registry() -> Mock:
-    datasource_dao_mock = Mock(spec=["get_datasource"])
-    datasource_dao_mock.get_datasource.return_value = Mock()
-    datasource_dao_mock.get_datasource().get_column = Mock(
-        side_effect=lambda col_name: TEMPORAL_COLUMNS[col_name]
-        if col_name in TEMPORAL_COLUMN_NAMES
-        else Mock()
-    )
-    datasource_dao_mock.get_datasource().db_extra = None
-    return datasource_dao_mock
+    return Mock(spec=["get_datasource"])
 
 
 def apply_max_row_limit(limit: int, max_limit: Optional[int] = None) -> int:
@@ -148,55 +112,3 @@ class TestQueryObjectFactory:
             raw_query_context["result_type"], **raw_query_object
         )
         assert query_object.post_processing == []
-
-    def test_query_context_no_python_date_format_filters(
-        self,
-        query_object_factory: QueryObjectFactory,
-        raw_query_context: dict[str, Any],
-    ):
-        raw_query_object = raw_query_context["queries"][0]
-        raw_query_object["filters"].append(
-            {"col": TEMPORAL_COLUMN_NAMES[0], "op": "==", "val": 315532800000}
-        )
-        query_object = query_object_factory.create(
-            raw_query_context["result_type"],
-            raw_query_context["datasource"],
-            **raw_query_object
-        )
-        assert query_object.filter[3]["val"] == 315532800000
-
-    def test_query_context_python_date_format_filters(
-        self,
-        query_object_factory: QueryObjectFactory,
-        raw_query_context: dict[str, Any],
-    ):
-        raw_query_object = raw_query_context["queries"][0]
-        raw_query_object["filters"].append(
-            {"col": TEMPORAL_COLUMN_NAMES[1], "op": "==", "val": 315532800000}
-        )
-        query_object = query_object_factory.create(
-            raw_query_context["result_type"],
-            raw_query_context["datasource"],
-            **raw_query_object
-        )
-        assert query_object.filter[3]["val"] == 1980
-
-    def test_query_context_python_date_format_filters_list_of_values(
-        self,
-        query_object_factory: QueryObjectFactory,
-        raw_query_context: dict[str, Any],
-    ):
-        raw_query_object = raw_query_context["queries"][0]
-        raw_query_object["filters"].append(
-            {
-                "col": TEMPORAL_COLUMN_NAMES[1],
-                "op": "==",
-                "val": [315532800000, 631152000000],
-            }
-        )
-        query_object = query_object_factory.create(
-            raw_query_context["result_type"],
-            raw_query_context["datasource"],
-            **raw_query_object
-        )
-        assert query_object.filter[3]["val"] == [1980, 1990]
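
For readers skimming the revert, the behavior being removed is easiest to
see standalone. A condensed sketch of the deleted conversion (taken from
the `get_dttm_filter_value` helper above, with the epoch_s/epoch_ms
branches omitted; the epoch value matches the deleted tests):

from datetime import datetime


def get_dttm_filter_value(value: int, date_format: str) -> str:
    # Condensed from the deleted helper: epoch-millisecond filter values
    # were reformatted with the column's python_date_format before being
    # sent to the database.
    dttm = datetime.utcfromtimestamp(value / 1000)
    return dttm.strftime(date_format)


print(get_dttm_filter_value(315532800000, "%Y"))  # '1980', as asserted
# in the deleted test_query_context_python_date_format_filters test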


(superset) 07/11: fix: dataset update uniqueness (#25756)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 01d3ac20c7204007d66a240e3311fa19ea8455fd
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Wed Oct 25 16:49:32 2023 -0400

    fix: dataset update uniqueness (#25756)
    
    (cherry picked from commit c7f8d11a7eca33b7eed187f4e757fd7b9f45f9be)
---
 superset/daos/dataset.py             |  6 ++-
 superset/datasets/commands/update.py |  5 ++-
 tests/unit_tests/dao/dataset_test.py | 83 ++++++++++++++++++++++++++++++++++++
 3 files changed, 92 insertions(+), 2 deletions(-)

diff --git a/superset/daos/dataset.py b/superset/daos/dataset.py
index 716fcd9a05..0b6c4f6271 100644
--- a/superset/daos/dataset.py
+++ b/superset/daos/dataset.py
@@ -100,11 +100,15 @@ class DatasetDAO(BaseDAO[SqlaTable]):  # pylint: disable=too-many-public-methods
 
     @staticmethod
     def validate_update_uniqueness(
-        database_id: int, dataset_id: int, name: str
+        database_id: int,
+        schema: str | None,
+        dataset_id: int,
+        name: str,
     ) -> bool:
         dataset_query = db.session.query(SqlaTable).filter(
             SqlaTable.table_name == name,
             SqlaTable.database_id == database_id,
+            SqlaTable.schema == schema,
             SqlaTable.id != dataset_id,
         )
         return not db.session.query(dataset_query.exists()).scalar()
diff --git a/superset/datasets/commands/update.py b/superset/datasets/commands/update.py
index a38439fb7f..dfa3a3dcf8 100644
--- a/superset/datasets/commands/update.py
+++ b/superset/datasets/commands/update.py
@@ -89,7 +89,10 @@ class UpdateDatasetCommand(UpdateMixin, BaseCommand):
         table_name = self._properties.get("table_name", None)
         # Validate uniqueness
         if not DatasetDAO.validate_update_uniqueness(
-            self._model.database_id, self._model_id, table_name
+            self._model.database_id,
+            self._model.schema,
+            self._model_id,
+            table_name,
         ):
             exceptions.append(DatasetExistsValidationError(table_name))
         # Validate/Populate database not allowed to change
diff --git a/tests/unit_tests/dao/dataset_test.py b/tests/unit_tests/dao/dataset_test.py
new file mode 100644
index 0000000000..288f68cae0
--- /dev/null
+++ b/tests/unit_tests/dao/dataset_test.py
@@ -0,0 +1,83 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from sqlalchemy.orm.session import Session
+
+from superset.daos.dataset import DatasetDAO
+
+
+def test_validate_update_uniqueness(session: Session) -> None:
+    """
+    Test the `validate_update_uniqueness` static method.
+
+    In particular, allow datasets with the same name in the same database as long as they
+    are in different schemas
+    """
+    from superset.connectors.sqla.models import SqlaTable
+    from superset.models.core import Database
+
+    SqlaTable.metadata.create_all(session.get_bind())
+
+    database = Database(
+        database_name="my_db",
+        sqlalchemy_uri="sqlite://",
+    )
+    dataset1 = SqlaTable(
+        table_name="my_dataset",
+        schema="main",
+        database=database,
+    )
+    dataset2 = SqlaTable(
+        table_name="my_dataset",
+        schema="dev",
+        database=database,
+    )
+    session.add_all([database, dataset1, dataset2])
+    session.flush()
+
+    # same table name, different schema
+    assert (
+        DatasetDAO.validate_update_uniqueness(
+            database_id=database.id,
+            schema=dataset1.schema,
+            dataset_id=dataset1.id,
+            name=dataset1.table_name,
+        )
+        is True
+    )
+
+    # duplicate schema and table name
+    assert (
+        DatasetDAO.validate_update_uniqueness(
+            database_id=database.id,
+            schema=dataset2.schema,
+            dataset_id=dataset1.id,
+            name=dataset1.table_name,
+        )
+        is False
+    )
+
+    # no schema
+    assert (
+        DatasetDAO.validate_update_uniqueness(
+            database_id=database.id,
+            schema=None,
+            dataset_id=dataset1.id,
+            name=dataset1.table_name,
+        )
+        is True
+    )


(superset) 06/11: fix: Revert "fix(Charts): Set max row limit + removed the option to use an empty row limit value" (#25753)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit fd2c2725d4da7261083f88362d2b1084241678fe
Author: Geido <60...@users.noreply.github.com>
AuthorDate: Wed Oct 25 15:39:49 2023 +0300

    fix: Revert "fix(Charts): Set max row limit + removed the option to use an empty row limit value" (#25753)
    
    (cherry picked from commit e2fe96778887d203a852cf09def151ff024cfaf7)
---
 .../src/shared-controls/sharedControls.tsx         |  9 +----
 .../superset-ui-core/src/validator/index.ts        |  1 -
 .../src/validator/validateMaxValue.ts              |  8 -----
 .../test/validator/validateMaxValue.test.ts        | 38 ----------------------
 4 files changed, 1 insertion(+), 55 deletions(-)

diff --git a/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/sharedControls.tsx b/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/sharedControls.tsx
index 69fa8a6864..abf5153bb0 100644
--- a/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/sharedControls.tsx
+++ b/superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/sharedControls.tsx
@@ -47,8 +47,6 @@ import {
   isDefined,
   hasGenericChartAxes,
   NO_TIME_RANGE,
-  validateNonEmpty,
-  validateMaxValue,
 } from '@superset-ui/core';
 
 import {
@@ -247,12 +245,7 @@ const row_limit: SharedControlConfig<'SelectControl'> = {
   type: 'SelectControl',
   freeForm: true,
   label: t('Row limit'),
-  clearable: false,
-  validators: [
-    validateNonEmpty,
-    legacyValidateInteger,
-    v => validateMaxValue(v, 100000),
-  ],
+  validators: [legacyValidateInteger],
   default: 10000,
   choices: formatSelectOptions(ROW_LIMIT_OPTIONS),
   description: t('Limits the number of rows that get displayed.'),
diff --git a/superset-frontend/packages/superset-ui-core/src/validator/index.ts b/superset-frontend/packages/superset-ui-core/src/validator/index.ts
index fb37328c02..532efcc959 100644
--- a/superset-frontend/packages/superset-ui-core/src/validator/index.ts
+++ b/superset-frontend/packages/superset-ui-core/src/validator/index.ts
@@ -22,4 +22,3 @@ export { default as legacyValidateNumber } from './legacyValidateNumber';
 export { default as validateInteger } from './validateInteger';
 export { default as validateNumber } from './validateNumber';
 export { default as validateNonEmpty } from './validateNonEmpty';
-export { default as validateMaxValue } from './validateMaxValue';
diff --git a/superset-frontend/packages/superset-ui-core/src/validator/validateMaxValue.ts b/superset-frontend/packages/superset-ui-core/src/validator/validateMaxValue.ts
deleted file mode 100644
index 24c1da1c79..0000000000
--- a/superset-frontend/packages/superset-ui-core/src/validator/validateMaxValue.ts
+++ /dev/null
@@ -1,8 +0,0 @@
-import { t } from '../translation';
-
-export default function validateMaxValue(v: unknown, max: Number) {
-  if (Number(v) > +max) {
-    return t('Value cannot exceed %s', max);
-  }
-  return false;
-}
diff --git a/superset-frontend/packages/superset-ui-core/test/validator/validateMaxValue.test.ts b/superset-frontend/packages/superset-ui-core/test/validator/validateMaxValue.test.ts
deleted file mode 100644
index 70f3d332c5..0000000000
--- a/superset-frontend/packages/superset-ui-core/test/validator/validateMaxValue.test.ts
+++ /dev/null
@@ -1,38 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-import { validateMaxValue } from '@superset-ui/core';
-import './setup';
-
-describe('validateInteger()', () => {
-  it('returns the warning message if invalid', () => {
-    expect(validateMaxValue(10.1, 10)).toBeTruthy();
-    expect(validateMaxValue(1, 0)).toBeTruthy();
-    expect(validateMaxValue('2', 1)).toBeTruthy();
-  });
-  it('returns false if the input is valid', () => {
-    expect(validateMaxValue(0, 1)).toBeFalsy();
-    expect(validateMaxValue(10, 10)).toBeFalsy();
-    expect(validateMaxValue(undefined, 1)).toBeFalsy();
-    expect(validateMaxValue(NaN, NaN)).toBeFalsy();
-    expect(validateMaxValue(null, 1)).toBeFalsy();
-    expect(validateMaxValue('1', 1)).toBeFalsy();
-    expect(validateMaxValue('a', 1)).toBeFalsy();
-  });
-});


(superset) 04/11: fix: bump to FAB 4.3.9 remove CSP exception (#25712)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 3.0
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 8da27eda4059202438cbfaea511f71dea828afdf
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Fri Oct 20 11:33:40 2023 +0100

    fix: bump to FAB 4.3.9 remove CSP exception (#25712)
    
    (cherry picked from commit 8fb0c8da56f572c086126cc5ca16676ce74e7a3c)
---
 requirements/base.txt | 2 +-
 setup.py              | 2 +-
 superset/config.py    | 2 --
 3 files changed, 2 insertions(+), 4 deletions(-)

diff --git a/requirements/base.txt b/requirements/base.txt
index d6ee2e6a6b..95e6912272 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -88,7 +88,7 @@ flask==2.2.5
     #   flask-migrate
     #   flask-sqlalchemy
     #   flask-wtf
-flask-appbuilder==4.3.7
+flask-appbuilder==4.3.9
     # via apache-superset
 flask-babel==1.0.0
     # via flask-appbuilder
diff --git a/setup.py b/setup.py
index 3cb0c144b2..87a721d21b 100644
--- a/setup.py
+++ b/setup.py
@@ -80,7 +80,7 @@ setup(
         "cryptography>=39.0.1, <40",
         "deprecation>=2.1.0, <2.2.0",
         "flask>=2.2.5, <3.0.0",
-        "flask-appbuilder>=4.3.7, <5.0.0",
+        "flask-appbuilder>=4.3.9, <5.0.0",
         "flask-caching>=1.11.1, <2.0",
         "flask-compress>=1.13, <2.0",
         "flask-talisman>=1.0.0, <2.0",
diff --git a/superset/config.py b/superset/config.py
index 73553fcc6c..e15c7bf990 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -1421,7 +1421,6 @@ TALISMAN_CONFIG = {
         "style-src": [
             "'self'",
             "'unsafe-inline'",
-            "https://cdn.jsdelivr.net/npm/swagger-ui-dist@5/swagger-ui.css",
         ],
         "script-src": ["'self'", "'strict-dynamic'"],
     },
@@ -1443,7 +1442,6 @@ TALISMAN_DEV_CONFIG = {
         "style-src": [
             "'self'",
             "'unsafe-inline'",
-            "https://cdn.jsdelivr.net/npm/swagger-ui-dist@5/swagger-ui.css",
         ],
         "script-src": ["'self'", "'unsafe-inline'", "'unsafe-eval'"],
     },
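
With the jsdelivr exception gone from the defaults, a deployment that
still needs an extra stylesheet origin can override the mapping in its own
config. A hypothetical superset_config.py fragment (key names as in the
patch; the URL is a placeholder, not an endorsement of re-adding the
removed exception):

# TALISMAN_CONFIG is replaced wholesale, so start from the full default
# in superset/config.py and extend style-src as needed.
TALISMAN_CONFIG = {
    "content_security_policy": {
        "default-src": ["'self'"],
        "style-src": [
            "'self'",
            "'unsafe-inline'",
            "https://static.example.com/extra.css",  # placeholder origin
        ],
        "script-src": ["'self'", "'strict-dynamic'"],
    },
    "force_https": False,
}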