Posted to commits@superset.apache.org by mi...@apache.org on 2022/08/30 12:41:38 UTC

[superset] branch 1.5 updated (48f3eb4273 -> fb0d6d57d6)

This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a change to branch 1.5
in repository https://gitbox.apache.org/repos/asf/superset.git


    from 48f3eb4273 Updates CHANGELOG
     new af5ded3fcb fix: sqloxide optional (#19570)
     new f8a369d233 fix: avoid while cycle in computeMaxFontSize for big Number run forever when css rule applied (#20173)
     new d512e89aa9 fix(csv): Ensure df_to_escaped_csv handles NULL (#20151)
     new 4073b58aa2 fix: Box Plot Chart throws an error when the average (AVG) / SUM is being calculated on the Metrics (#20235)
     new 6dee53682e Fixes #20155 (#20273)
     new 8e1b2a1354 fix(chart): chart gets cut off on the dashboard (#20315)
     new 991a453ff2 fix(docker): Make Gunicorn Keepalive Adjustable (#20348)
     new 094b17e8cc fix(20428): Address-Presto/Trino-Poll-Issue-Refactor (#20434)
     new 1e8259a410 perf: Implement model specific lookups by id to improve performance (#20974)
     new 6bcb9674da Memoize the common_bootstrap_payload (#21018)
     new e5de0909c3 fix: Support the Clipboard API in modern browsers (#20058)
     new 5f3301aadb fix: exporting CSV can't apply pagination #17861 (#20178)
     new fb0d6d57d6 chore: updating python docker image to 3.8.13 (#20550)

The 13 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 Dockerfile                                         |   4 +-
 UPDATING.md                                        |   7 ++
 docker/run-server.sh                               |   1 +
 docs/docs/installation/running-on-kubernetes.mdx   |   2 +-
 requirements/testing.in                            |   1 +
 requirements/testing.txt                           |  13 +--
 setup.py                                           |   1 -
 .../src/dimension/computeMaxFontSize.ts            |  16 +++-
 .../test/dimension/computeMaxFontSize.test.ts      |   9 ++
 .../src/components/CopyToClipboard/index.jsx       |   6 +-
 .../dashboard/components/gridComponents/Chart.jsx  |  12 ++-
 .../menu/ShareMenuItems/ShareMenuItems.test.tsx    |  10 +-
 .../components/menu/ShareMenuItems/index.tsx       |   3 +-
 .../CopyToClipboardButton.test.tsx                 |  22 ++++-
 .../DataTablesPane/DataTablesPane.test.tsx         |   9 +-
 .../explore/components/ExploreActionButtons.tsx    |   3 +-
 .../components/ExploreChartHeader/index.jsx        |   1 +
 superset-frontend/src/utils/common.js              |   8 +-
 superset-frontend/src/utils/copy.ts                | 105 ++++++++++++++-------
 .../components/SyntaxHighlighterCopy/index.tsx     |   2 +-
 .../views/CRUD/data/savedquery/SavedQueryList.tsx  |   6 +-
 superset-frontend/src/views/CRUD/hooks.ts          |   6 +-
 superset/common/query_context_processor.py         |   2 +
 superset/dao/base.py                               |   7 +-
 superset/db_engine_specs/presto.py                 |   4 -
 superset/db_engine_specs/trino.py                  |  97 ++++++-------------
 superset/explore/utils.py                          |   6 +-
 superset/migrations/shared/utils.py                |  10 +-
 superset/utils/csv.py                              |   9 +-
 superset/utils/pandas_postprocessing/boxplot.py    |   9 +-
 superset/views/base.py                             |   8 +-
 tests/integration_tests/core_tests.py              |   4 +-
 tests/integration_tests/datasets/api_tests.py      |   1 +
 .../explore/form_data/api_tests.py                 |  10 +-
 .../explore/permalink/api_tests.py                 |   2 +-
 tests/integration_tests/utils/csv_tests.py         |   4 +
 .../unit_tests/charts/dao}/__init__.py             |   0
 tests/unit_tests/charts/dao/dao_tests.py           |  67 +++++++++++++
 .../unit_tests/datasets/dao}/__init__.py           |   0
 tests/unit_tests/datasets/dao/dao_tests.py         |  73 ++++++++++++++
 .../pandas_postprocessing/test_boxplot.py          |  25 +++++
 41 files changed, 422 insertions(+), 163 deletions(-)
 copy {superset/annotation_layers => tests/unit_tests/charts/dao}/__init__.py (100%)
 create mode 100644 tests/unit_tests/charts/dao/dao_tests.py
 copy {superset/annotation_layers => tests/unit_tests/datasets/dao}/__init__.py (100%)
 create mode 100644 tests/unit_tests/datasets/dao/dao_tests.py


[superset] 02/13: fix: avoid while cycle in computeMaxFontSize for big Number run forever when css rule applied (#20173)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 1.5
in repository https://gitbox.apache.org/repos/asf/superset.git

commit f8a369d233c7670b726ed3c6a1bf44f08f3e5e32
Author: Diego Medina <di...@gmail.com>
AuthorDate: Wed May 25 06:04:04 2022 -0400

    fix: avoid while cycle in computeMaxFontSize for big Number run forever when css rule applied (#20173)
    
    (cherry picked from commit 365acee663f7942ba7d8dfd0e4cf72c4cecb7a2d)
---
 .../superset-ui-core/src/dimension/computeMaxFontSize.ts | 16 ++++++++++++++--
 .../test/dimension/computeMaxFontSize.test.ts            |  9 +++++++++
 2 files changed, 23 insertions(+), 2 deletions(-)

diff --git a/superset-frontend/packages/superset-ui-core/src/dimension/computeMaxFontSize.ts b/superset-frontend/packages/superset-ui-core/src/dimension/computeMaxFontSize.ts
index a762d8b1f4..ebd1f6e568 100644
--- a/superset-frontend/packages/superset-ui-core/src/dimension/computeMaxFontSize.ts
+++ b/superset-frontend/packages/superset-ui-core/src/dimension/computeMaxFontSize.ts
@@ -27,8 +27,20 @@ function decreaseSizeUntil(
 ): number {
   let size = startSize;
   let dimension = computeDimension(size);
+
   while (!condition(dimension)) {
     size -= 1;
+
+    // Here if the size goes below zero most likely is because it
+    // has additional style applied in which case we assume the user
+    // knows what it's doing and we just let them use that.
+    // Visually it works, although it could have another
+    // check in place.
+    if (size < 0) {
+      size = startSize;
+      break;
+    }
+
     dimension = computeDimension(size);
   }
 
@@ -66,7 +78,7 @@ export default function computeMaxFontSize(
     size = decreaseSizeUntil(
       size,
       computeDimension,
-      dim => dim.width <= maxWidth,
+      dim => dim.width > 0 && dim.width <= maxWidth,
     );
   }
 
@@ -74,7 +86,7 @@ export default function computeMaxFontSize(
     size = decreaseSizeUntil(
       size,
       computeDimension,
-      dim => dim.height <= maxHeight,
+      dim => dim.height > 0 && dim.height <= maxHeight,
     );
   }
 
diff --git a/superset-frontend/packages/superset-ui-core/test/dimension/computeMaxFontSize.test.ts b/superset-frontend/packages/superset-ui-core/test/dimension/computeMaxFontSize.test.ts
index 99574f4ccf..a64d819535 100644
--- a/superset-frontend/packages/superset-ui-core/test/dimension/computeMaxFontSize.test.ts
+++ b/superset-frontend/packages/superset-ui-core/test/dimension/computeMaxFontSize.test.ts
@@ -59,5 +59,14 @@ describe('computeMaxFontSize(input)', () => {
         }),
       ).toEqual(25);
     });
+    it('ensure idealFontSize is used if the maximum font size calculation goes below zero', () => {
+      expect(
+        computeMaxFontSize({
+          maxWidth: 5,
+          idealFontSize: 34,
+          text: SAMPLE_TEXT[0],
+        }),
+      ).toEqual(34);
+    });
   });
 });
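
The guard added in this commit can be sketched outside the DOM: shrink the candidate font size until the measured dimension fits, and bail out to the starting size once the counter would go negative, which is what happens when an external CSS rule pins the rendered width regardless of font size. This is a minimal Python sketch of the patched decreaseSizeUntil logic; compute_dimension and the dict shape are illustrative stand-ins, not Superset APIs.

```python
def decrease_size_until(start_size, compute_dimension, condition):
    """Shrink size until condition(dimension) holds.

    Mirrors the patched decreaseSizeUntil: if size would drop below
    zero (e.g. a CSS rule keeps the measured width constant), return
    the starting size instead of looping forever.
    """
    size = start_size
    dimension = compute_dimension(size)
    while not condition(dimension):
        size -= 1
        if size < 0:
            return start_size  # give up and trust the ideal size
        dimension = compute_dimension(size)
    return size


# A hypothetical font whose measured width is twice its size:
measure = lambda s: {"width": 2 * s}

# Satisfiable: shrinking from 34 stops at 10, where width 20 <= 20.
fits_20 = lambda d: 0 < d["width"] <= 20
# Unsatisfiable: width can never be both positive and <= 1 here,
# which is the infinite-loop case the commit fixes.
fits_1 = lambda d: 0 < d["width"] <= 1
```

The added `dim.width > 0` / `dim.height > 0` terms in the conditions play the same role: a zero measurement at size 0 no longer counts as "fits".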


[superset] 13/13: chore: updating python docker image to 3.8.13 (#20550)


commit fb0d6d57d6bd720e7192ecc71e8274747f3fde5e
Author: nisheldo <nj...@me.com>
AuthorDate: Fri Jul 1 13:16:17 2022 -0500

    chore: updating python docker image to 3.8.13 (#20550)
    
    (cherry picked from commit 7275805e957f1944864eff512a165760af08b51f)
---
 Dockerfile | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index d973091c9d..bb9f521737 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -18,7 +18,7 @@
 ######################################################################
 # PY stage that simply does a pip install on our requirements
 ######################################################################
-ARG PY_VER=3.8.12
+ARG PY_VER=3.8.13
 FROM python:${PY_VER} AS superset-py
 
 RUN mkdir /app \
@@ -71,7 +71,7 @@ RUN cd /app/superset-frontend \
 ######################################################################
 # Final lean image...
 ######################################################################
-ARG PY_VER=3.8.12
+ARG PY_VER=3.8.13
 FROM python:${PY_VER} AS lean
 
 ENV LANG=C.UTF-8 \


[superset] 03/13: fix(csv): Ensure df_to_escaped_csv handles NULL (#20151)


commit d512e89aa9d0f2b5c8eeda49733efc198735e310
Author: John Bodley <45...@users.noreply.github.com>
AuthorDate: Tue May 31 09:56:25 2022 -0700

    fix(csv): Ensure df_to_escaped_csv handles NULL (#20151)
    
    Co-authored-by: John Bodley <jo...@airbnb.com>
    (cherry picked from commit 97ce920d493d126ddcff93b9e46cdde1c5c8bb69)
---
 superset/utils/csv.py                      | 9 +++++++--
 tests/integration_tests/utils/csv_tests.py | 4 ++++
 2 files changed, 11 insertions(+), 2 deletions(-)

diff --git a/superset/utils/csv.py b/superset/utils/csv.py
index 42d2c55783..cf73c99dfa 100644
--- a/superset/utils/csv.py
+++ b/superset/utils/csv.py
@@ -19,6 +19,7 @@ import urllib.request
 from typing import Any, Dict, Optional
 from urllib.error import URLError
 
+import numpy as np
 import pandas as pd
 import simplejson
 
@@ -64,8 +65,12 @@ def df_to_escaped_csv(df: pd.DataFrame, **kwargs: Any) -> Any:
     # Escape csv headers
     df = df.rename(columns=escape_values)
 
-    # Escape csv rows
-    df = df.applymap(escape_values)
+    # Escape csv values
+    for name, column in df.items():
+        if column.dtype == np.dtype(object):
+            for idx, value in enumerate(column.values):
+                if isinstance(value, str):
+                    df.at[idx, name] = escape_value(value)
 
     return df.to_csv(**kwargs)
 
diff --git a/tests/integration_tests/utils/csv_tests.py b/tests/integration_tests/utils/csv_tests.py
index bf6110c639..e514efb1d2 100644
--- a/tests/integration_tests/utils/csv_tests.py
+++ b/tests/integration_tests/utils/csv_tests.py
@@ -17,6 +17,7 @@
 import io
 
 import pandas as pd
+import pyarrow as pa
 import pytest
 
 from superset.utils import csv
@@ -77,3 +78,6 @@ def test_df_to_escaped_csv():
         ["a", "'=b"],  # pandas seems to be removing the leading ""
         ["' =a", "b"],
     ]
+
+    df = pa.array([1, None]).to_pandas(integer_object_nulls=True).to_frame()
+    assert csv.df_to_escaped_csv(df, encoding="utf8", index=False) == '0\n1\n""\n'
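
The cell-level change above can be illustrated without pandas: only `str` cells are escaped, so `None` (NULL) placeholders and numbers pass through untouched instead of breaking the escape step, which is what the old blanket `applymap` did. The `escape_value` below is a simplified stand-in for `superset.utils.csv.escape_value`, which guards against spreadsheet formula injection.

```python
def escape_value(value: str) -> str:
    # Simplified stand-in: neutralize cells a spreadsheet would
    # evaluate as a formula by prefixing a single quote.
    if value.startswith(("=", "+", "-", "@")):
        return "'" + value
    return value


def escape_rows(rows):
    # Only strings are escaped; None (NULL) and numeric values are
    # passed through unchanged, mirroring the isinstance(value, str)
    # check introduced by the commit.
    return [
        [escape_value(v) if isinstance(v, str) else v for v in row]
        for row in rows
    ]
```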


[superset] 12/13: fix: exporting CSV can't apply pagination #17861 (#20178)


commit 5f3301aadbd42b5cc97b5b40a7f632b1158b745b
Author: Ilyas <31...@users.noreply.github.com>
AuthorDate: Thu Jun 2 05:42:51 2022 +0100

    fix: exporting CSV can't apply pagination #17861 (#20178)
---
 superset-frontend/src/dashboard/components/gridComponents/Chart.jsx   | 1 +
 superset-frontend/src/explore/components/ExploreChartHeader/index.jsx | 1 +
 2 files changed, 2 insertions(+)

diff --git a/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx b/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
index b852523f17..7e7efa6353 100644
--- a/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
+++ b/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
@@ -304,6 +304,7 @@ export default class Chart extends React.Component {
       resultType: 'full',
       resultFormat: 'csv',
       force: true,
+      ownState: this.props.ownState,
     });
   }
 
diff --git a/superset-frontend/src/explore/components/ExploreChartHeader/index.jsx b/superset-frontend/src/explore/components/ExploreChartHeader/index.jsx
index 21605c553d..e308325fd5 100644
--- a/superset-frontend/src/explore/components/ExploreChartHeader/index.jsx
+++ b/superset-frontend/src/explore/components/ExploreChartHeader/index.jsx
@@ -374,6 +374,7 @@ export class ExploreChartHeader extends React.PureComponent {
               ...this.props.actions,
               openPropertiesModal: this.openPropertiesModal,
             }}
+            ownState={this.props.ownState}
             slice={this.props.slice}
             canDownloadCSV={this.props.can_download}
             chartStatus={chartStatus}


[superset] 07/13: fix(docker): Make Gunicorn Keepalive Adjustable (#20348)


commit 991a453ff29c915e825872e2b70b93795352b48e
Author: Multazim Deshmukh <57...@users.noreply.github.com>
AuthorDate: Sat Jun 11 01:34:39 2022 +0530

    fix(docker): Make Gunicorn Keepalive Adjustable (#20348)
    
    Co-authored-by: Multazim Deshmukh <mu...@morningstar.com>
    (cherry picked from commit 86368dd406b9e828f31186a4b6179d24758a7d87)
---
 docker/run-server.sh | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docker/run-server.sh b/docker/run-server.sh
index 5519ff5d5c..064f47b9c2 100644
--- a/docker/run-server.sh
+++ b/docker/run-server.sh
@@ -27,6 +27,7 @@ gunicorn \
     --worker-class ${SERVER_WORKER_CLASS:-gthread} \
     --threads ${SERVER_THREADS_AMOUNT:-20} \
     --timeout ${GUNICORN_TIMEOUT:-60} \
+    --keep-alive ${GUNICORN_KEEPALIVE:-2} \
     --limit-request-line ${SERVER_LIMIT_REQUEST_LINE:-0} \
     --limit-request-field_size ${SERVER_LIMIT_REQUEST_FIELD_SIZE:-0} \
     "${FLASK_APP}"
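
The `${GUNICORN_KEEPALIVE:-2}` expansion falls back to `2` when the variable is unset *or empty* (plain `${GUNICORN_KEEPALIVE-2}` would keep an empty value). A Python sketch of that `:-` semantics, for illustration only:

```python
import os


def env_or_default(name, default, env=None):
    # Shell ${NAME:-default}: unset or empty falls back to the default.
    env = os.environ if env is None else env
    value = env.get(name, "")
    return value if value != "" else default
```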


[superset] 06/13: fix(chart): chart gets cut off on the dashboard (#20315)


commit 8e1b2a1354b1a3c089102835bc0114c4847f9e78
Author: Stephen Liu <75...@qq.com>
AuthorDate: Thu Jun 9 22:59:58 2022 +0800

    fix(chart): chart gets cut off on the dashboard (#20315)
    
    * fix(chart): chart gets cut off on the dashboard
    
    * add some failsafe
    
    * address comment
    
    (cherry picked from commit 07b4a7159dd293061b83c671ad64cc51c928a199)
---
 .../src/dashboard/components/gridComponents/Chart.jsx         | 11 ++++++++---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx b/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
index b212af44e5..b852523f17 100644
--- a/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
+++ b/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
@@ -223,9 +223,14 @@ export default class Chart extends React.Component {
   }
 
   getHeaderHeight() {
-    return (
-      (this.headerRef && this.headerRef.offsetHeight) || DEFAULT_HEADER_HEIGHT
-    );
+    if (this.headerRef) {
+      const computedStyle = getComputedStyle(this.headerRef).getPropertyValue(
+        'margin-bottom',
+      );
+      const marginBottom = parseInt(computedStyle, 10) || 0;
+      return this.headerRef.offsetHeight + marginBottom;
+    }
+    return DEFAULT_HEADER_HEIGHT;
   }
 
   setDescriptionRef(ref) {
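
The `parseInt(computedStyle, 10) || 0` failsafe in the hunk above matters because `getComputedStyle` can hand back non-numeric strings (e.g. `auto`), and `parseInt` would then yield `NaN` and poison the height arithmetic. A rough Python analogue of that defensive parse (the helper name is made up for illustration):

```python
import re


def css_px_to_int(value: str) -> int:
    # parseInt-like: take the leading integer if there is one,
    # otherwise fall back to 0 so "auto" or "" never produces NaN-style
    # garbage in the header-height sum.
    match = re.match(r"\s*(-?\d+)", value)
    return int(match.group(1)) if match else 0
```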


[superset] 11/13: fix: Support the Clipboard API in modern browsers (#20058)


commit e5de0909c39349b7d270dab36d9b9b7f5f3a075b
Author: Diego Medina <di...@gmail.com>
AuthorDate: Fri Jun 3 07:34:00 2022 -0400

    fix: Support the Clipboard API in modern browsers (#20058)
    
    * fix: Support the Clipboard API in modern browsers
    
    * fix tests
    
    * PR comment
    
    * Improvements
---
 .../src/components/CopyToClipboard/index.jsx       |   6 +-
 .../menu/ShareMenuItems/ShareMenuItems.test.tsx    |  10 +-
 .../components/menu/ShareMenuItems/index.tsx       |   3 +-
 .../CopyToClipboardButton.test.tsx                 |  22 ++++-
 .../DataTablesPane/DataTablesPane.test.tsx         |   9 +-
 .../explore/components/ExploreActionButtons.tsx    |   3 +-
 superset-frontend/src/utils/common.js              |   8 +-
 superset-frontend/src/utils/copy.ts                | 105 ++++++++++++++-------
 .../components/SyntaxHighlighterCopy/index.tsx     |   2 +-
 .../views/CRUD/data/savedquery/SavedQueryList.tsx  |   6 +-
 superset-frontend/src/views/CRUD/hooks.ts          |   6 +-
 11 files changed, 121 insertions(+), 59 deletions(-)

diff --git a/superset-frontend/src/components/CopyToClipboard/index.jsx b/superset-frontend/src/components/CopyToClipboard/index.jsx
index 00a23b1662..73cdc2b9e4 100644
--- a/superset-frontend/src/components/CopyToClipboard/index.jsx
+++ b/superset-frontend/src/components/CopyToClipboard/index.jsx
@@ -57,10 +57,10 @@ class CopyToClipboard extends React.Component {
   onClick() {
     if (this.props.getText) {
       this.props.getText(d => {
-        this.copyToClipboard(d);
+        this.copyToClipboard(Promise.resolve(d));
       });
     } else {
-      this.copyToClipboard(this.props.text);
+      this.copyToClipboard(Promise.resolve(this.props.text));
     }
   }
 
@@ -72,7 +72,7 @@ class CopyToClipboard extends React.Component {
   }
 
   copyToClipboard(textToCopy) {
-    copyTextToClipboard(textToCopy)
+    copyTextToClipboard(() => textToCopy)
       .then(() => {
         this.props.addSuccessToast(t('Copied to clipboard!'));
       })
diff --git a/superset-frontend/src/dashboard/components/menu/ShareMenuItems/ShareMenuItems.test.tsx b/superset-frontend/src/dashboard/components/menu/ShareMenuItems/ShareMenuItems.test.tsx
index 579f9d4b69..498009224a 100644
--- a/superset-frontend/src/dashboard/components/menu/ShareMenuItems/ShareMenuItems.test.tsx
+++ b/superset-frontend/src/dashboard/components/menu/ShareMenuItems/ShareMenuItems.test.tsx
@@ -102,9 +102,10 @@ test('Click on "Copy dashboard URL" and succeed', async () => {
 
   userEvent.click(screen.getByRole('button', { name: 'Copy dashboard URL' }));
 
-  await waitFor(() => {
+  await waitFor(async () => {
     expect(spy).toBeCalledTimes(1);
-    expect(spy).toBeCalledWith('http://localhost/superset/dashboard/p/123/');
+    const value = await spy.mock.calls[0][0]();
+    expect(value).toBe('http://localhost/superset/dashboard/p/123/');
     expect(props.addSuccessToast).toBeCalledTimes(1);
     expect(props.addSuccessToast).toBeCalledWith('Copied to clipboard!');
     expect(props.addDangerToast).toBeCalledTimes(0);
@@ -128,9 +129,10 @@ test('Click on "Copy dashboard URL" and fail', async () => {
 
   userEvent.click(screen.getByRole('button', { name: 'Copy dashboard URL' }));
 
-  await waitFor(() => {
+  await waitFor(async () => {
     expect(spy).toBeCalledTimes(1);
-    expect(spy).toBeCalledWith('http://localhost/superset/dashboard/p/123/');
+    const value = await spy.mock.calls[0][0]();
+    expect(value).toBe('http://localhost/superset/dashboard/p/123/');
     expect(props.addSuccessToast).toBeCalledTimes(0);
     expect(props.addDangerToast).toBeCalledTimes(1);
     expect(props.addDangerToast).toBeCalledWith(
diff --git a/superset-frontend/src/dashboard/components/menu/ShareMenuItems/index.tsx b/superset-frontend/src/dashboard/components/menu/ShareMenuItems/index.tsx
index b196100734..e8f608e6fe 100644
--- a/superset-frontend/src/dashboard/components/menu/ShareMenuItems/index.tsx
+++ b/superset-frontend/src/dashboard/components/menu/ShareMenuItems/index.tsx
@@ -64,8 +64,7 @@ const ShareMenuItems = (props: ShareMenuItemProps) => {
 
   async function onCopyLink() {
     try {
-      const url = await generateUrl();
-      await copyTextToClipboard(url);
+      await copyTextToClipboard(generateUrl);
       addSuccessToast(t('Copied to clipboard!'));
     } catch (error) {
       logging.error(error);
diff --git a/superset-frontend/src/explore/components/DataTableControl/CopyToClipboardButton.test.tsx b/superset-frontend/src/explore/components/DataTableControl/CopyToClipboardButton.test.tsx
index 2ce91590b9..a158bfd7ed 100644
--- a/superset-frontend/src/explore/components/DataTableControl/CopyToClipboardButton.test.tsx
+++ b/superset-frontend/src/explore/components/DataTableControl/CopyToClipboardButton.test.tsx
@@ -18,7 +18,7 @@
  */
 import userEvent from '@testing-library/user-event';
 import React from 'react';
-import { render, screen } from 'spec/helpers/testing-library';
+import { render, screen, waitFor } from 'spec/helpers/testing-library';
 import { CopyToClipboardButton } from '.';
 
 test('Render a button', () => {
@@ -28,14 +28,26 @@ test('Render a button', () => {
   expect(screen.getByRole('button')).toBeInTheDocument();
 });
 
-test('Should copy to clipboard', () => {
-  document.execCommand = jest.fn();
+test('Should copy to clipboard', async () => {
+  const callback = jest.fn();
+  document.execCommand = callback;
+
+  const originalClipboard = { ...global.navigator.clipboard };
+  // @ts-ignore
+  global.navigator.clipboard = { write: callback, writeText: callback };
 
   render(<CopyToClipboardButton data={{ copy: 'data', data: 'copy' }} />, {
     useRedux: true,
   });
 
-  expect(document.execCommand).toHaveBeenCalledTimes(0);
+  expect(callback).toHaveBeenCalledTimes(0);
   userEvent.click(screen.getByRole('button'));
-  expect(document.execCommand).toHaveBeenCalledWith('copy');
+
+  await waitFor(() => {
+    expect(callback).toHaveBeenCalled();
+  });
+
+  jest.resetAllMocks();
+  // @ts-ignore
+  global.navigator.clipboard = originalClipboard;
 });
diff --git a/superset-frontend/src/explore/components/DataTablesPane/DataTablesPane.test.tsx b/superset-frontend/src/explore/components/DataTablesPane/DataTablesPane.test.tsx
index 380285b811..7f4f8d11e0 100644
--- a/superset-frontend/src/explore/components/DataTablesPane/DataTablesPane.test.tsx
+++ b/superset-frontend/src/explore/components/DataTablesPane/DataTablesPane.test.tsx
@@ -144,8 +144,9 @@ test('Should copy data table content correctly', async () => {
   expect(await screen.findByText('1 rows retrieved')).toBeVisible();
 
   userEvent.click(screen.getByRole('button', { name: 'Copy' }));
-  expect(copyToClipboardSpy).toHaveBeenCalledWith(
-    '2009-01-01 00:00:00\tAction\n',
-  );
-  fetchMock.done();
+  expect(copyToClipboardSpy).toHaveBeenCalledTimes(1);
+  const value = await copyToClipboardSpy.mock.calls[0][0]();
+  expect(value).toBe('2009-01-01 00:00:00\tAction\n');
+  copyToClipboardSpy.mockRestore();
+  fetchMock.restore();
 });
diff --git a/superset-frontend/src/explore/components/ExploreActionButtons.tsx b/superset-frontend/src/explore/components/ExploreActionButtons.tsx
index adb85e11d5..8931321918 100644
--- a/superset-frontend/src/explore/components/ExploreActionButtons.tsx
+++ b/superset-frontend/src/explore/components/ExploreActionButtons.tsx
@@ -109,8 +109,7 @@ const ExploreActionButtons = (props: ExploreActionButtonsProps) => {
   const doCopyLink = async () => {
     try {
       setCopyTooltip(t('Loading...'));
-      const url = await getChartPermalink(latestQueryFormData);
-      await copyTextToClipboard(url);
+      await copyTextToClipboard(() => getChartPermalink(latestQueryFormData));
       setCopyTooltip(t('Copied to clipboard!'));
       addSuccessToast(t('Copied to clipboard!'));
     } catch (error) {
diff --git a/superset-frontend/src/utils/common.js b/superset-frontend/src/utils/common.js
index 4efdb205e5..603ec7c549 100644
--- a/superset-frontend/src/utils/common.js
+++ b/superset-frontend/src/utils/common.js
@@ -94,7 +94,7 @@ export function prepareCopyToClipboardTabularData(data, columns) {
   for (let i = 0; i < data.length; i += 1) {
     const row = {};
     for (let j = 0; j < columns.length; j += 1) {
-      // JavaScript does not mantain the order of a mixed set of keys (i.e integers and strings)
+      // JavaScript does not maintain the order of a mixed set of keys (i.e integers and strings)
       // the below function orders the keys based on the column names.
       const key = columns[j].name || columns[j];
       if (data[i][key]) {
@@ -145,4 +145,10 @@ export const detectOS = () => {
   return 'Unknown OS';
 };
 
+export const isSafari = () => {
+  const { userAgent } = navigator;
+
+  return userAgent && /^((?!chrome|android).)*safari/i.test(userAgent);
+};
+
 export const isNullish = value => value === null || value === undefined;
diff --git a/superset-frontend/src/utils/copy.ts b/superset-frontend/src/utils/copy.ts
index 7db289c040..0980f2ab17 100644
--- a/superset-frontend/src/utils/copy.ts
+++ b/superset-frontend/src/utils/copy.ts
@@ -17,40 +17,79 @@
  * under the License.
  */
 
-const copyTextToClipboard = async (text: string) =>
-  new Promise<void>((resolve, reject) => {
-    const selection: Selection | null = document.getSelection();
-    if (selection) {
-      selection.removeAllRanges();
-      const range = document.createRange();
-      const span = document.createElement('span');
-      span.textContent = text;
-      span.style.position = 'fixed';
-      span.style.top = '0';
-      span.style.clip = 'rect(0, 0, 0, 0)';
-      span.style.whiteSpace = 'pre';
-
-      document.body.appendChild(span);
-      range.selectNode(span);
-      selection.addRange(range);
-
-      try {
-        if (!document.execCommand('copy')) {
-          reject();
-        }
-      } catch (err) {
-        reject();
-      }
-
-      document.body.removeChild(span);
-      if (selection.removeRange) {
-        selection.removeRange(range);
-      } else {
-        selection.removeAllRanges();
-      }
+import { isSafari } from './common';
+
+// Use the new Clipboard API if the browser supports it
+const copyTextWithClipboardApi = async (getText: () => Promise<string>) => {
+  // Safari (WebKit) does not support delayed generation of clipboard.
+  // This means that writing to the clipboard, from the moment the user
+  // interacts with the app, must be instantaneous.
+  // However, neither writeText nor write accepts a Promise, so
+  // we need to create a ClipboardItem that accepts said Promise to
+  // delay the text generation, as needed.
+  // Source: https://bugs.webkit.org/show_bug.cgi?id=222262P
+  if (isSafari()) {
+    try {
+      const clipboardItem = new ClipboardItem({
+        'text/plain': getText(),
+      });
+      await navigator.clipboard.write([clipboardItem]);
+    } catch {
+      // Fallback to default clipboard API implementation
+      const text = await getText();
+      await navigator.clipboard.writeText(text);
     }
+  } else {
+    // For Blink, the above method won't work, but we can use the
+    // default (intended) API, since the delayed generation of the
+    // clipboard is now supported.
+    // Source: https://bugs.chromium.org/p/chromium/issues/detail?id=1014310
+    const text = await getText();
+    await navigator.clipboard.writeText(text);
+  }
+};
+
+const copyTextToClipboard = (getText: () => Promise<string>) =>
+  copyTextWithClipboardApi(getText)
+    // If the Clipboard API is not supported, fallback to the older method.
+    .catch(() =>
+      getText().then(
+        text =>
+          new Promise<void>((resolve, reject) => {
+            const selection: Selection | null = document.getSelection();
+            if (selection) {
+              selection.removeAllRanges();
+              const range = document.createRange();
+              const span = document.createElement('span');
+              span.textContent = text;
+              span.style.position = 'fixed';
+              span.style.top = '0';
+              span.style.clip = 'rect(0, 0, 0, 0)';
+              span.style.whiteSpace = 'pre';
+
+              document.body.appendChild(span);
+              range.selectNode(span);
+              selection.addRange(range);
+
+              try {
+                if (!document.execCommand('copy')) {
+                  reject();
+                }
+              } catch (err) {
+                reject();
+              }
+
+              document.body.removeChild(span);
+              if (selection.removeRange) {
+                selection.removeRange(range);
+              } else {
+                selection.removeAllRanges();
+              }
+            }
 
-    resolve();
-  });
+            resolve();
+          }),
+      ),
+    );
 
 export default copyTextToClipboard;
diff --git a/superset-frontend/src/views/CRUD/data/components/SyntaxHighlighterCopy/index.tsx b/superset-frontend/src/views/CRUD/data/components/SyntaxHighlighterCopy/index.tsx
index 73a0f8e9cc..8e96b95f0d 100644
--- a/superset-frontend/src/views/CRUD/data/components/SyntaxHighlighterCopy/index.tsx
+++ b/superset-frontend/src/views/CRUD/data/components/SyntaxHighlighterCopy/index.tsx
@@ -65,7 +65,7 @@ export default function SyntaxHighlighterCopy({
   language: 'sql' | 'markdown' | 'html' | 'json';
 }) {
   function copyToClipboard(textToCopy: string) {
-    copyTextToClipboard(textToCopy)
+    copyTextToClipboard(() => Promise.resolve(textToCopy))
       .then(() => {
         if (addSuccessToast) {
           addSuccessToast(t('SQL Copied!'));
diff --git a/superset-frontend/src/views/CRUD/data/savedquery/SavedQueryList.tsx b/superset-frontend/src/views/CRUD/data/savedquery/SavedQueryList.tsx
index df3f16a858..d2dc6aff9c 100644
--- a/superset-frontend/src/views/CRUD/data/savedquery/SavedQueryList.tsx
+++ b/superset-frontend/src/views/CRUD/data/savedquery/SavedQueryList.tsx
@@ -210,8 +210,10 @@ function SavedQueryList({
 
   const copyQueryLink = useCallback(
     (id: number) => {
-      copyTextToClipboard(
-        `${window.location.origin}/superset/sqllab?savedQueryId=${id}`,
+      copyTextToClipboard(() =>
+        Promise.resolve(
+          `${window.location.origin}/superset/sqllab?savedQueryId=${id}`,
+        ),
       )
         .then(() => {
           addSuccessToast(t('Link Copied!'));
diff --git a/superset-frontend/src/views/CRUD/hooks.ts b/superset-frontend/src/views/CRUD/hooks.ts
index b0ca13d96a..6a503ee494 100644
--- a/superset-frontend/src/views/CRUD/hooks.ts
+++ b/superset-frontend/src/views/CRUD/hooks.ts
@@ -595,8 +595,10 @@ export const copyQueryLink = (
   addDangerToast: (arg0: string) => void,
   addSuccessToast: (arg0: string) => void,
 ) => {
-  copyTextToClipboard(
-    `${window.location.origin}/superset/sqllab?savedQueryId=${id}`,
+  copyTextToClipboard(() =>
+    Promise.resolve(
+      `${window.location.origin}/superset/sqllab?savedQueryId=${id}`,
+    ),
   )
     .then(() => {
       addSuccessToast(t('Link Copied!'));
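
The key move in this refactor is that `copyTextToClipboard` now receives a *getter* (`() => Promise<string>`) rather than a string, so Safari can be handed the still-pending promise via `ClipboardItem`, other browsers can resolve it lazily, and `document.execCommand('copy')` remains the last resort. Stripped of browser APIs, the control flow is roughly the following sketch (all three callables are stand-ins, not real Superset functions):

```python
def copy_text(get_text, clipboard_write, exec_command_copy):
    """Try the modern Clipboard API path first; on failure, resolve
    the text again and fall back to the legacy execCommand path."""
    try:
        clipboard_write(get_text())
    except Exception:
        exec_command_copy(get_text())


copied = []


def failing_write(text):
    # Simulates a browser where the Clipboard API is unavailable.
    raise RuntimeError("Clipboard API unavailable")


# The fallback path receives the lazily produced text:
copy_text(lambda: "hello", failing_write, copied.append)
```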


[superset] 08/13: fix(20428): Address-Presto/Trino-Poll-Issue-Refactor (#20434)


commit 094b17e8cc9f345395533aa85f1c8f4388074177
Author: Simon Thelin <si...@gmail.com>
AuthorDate: Mon Jun 20 00:28:59 2022 +0100

    fix(20428): Address-Presto/Trino-Poll-Issue-Refactor (#20434)
    
    * fix(20428)-Address-Presto/Trino-Poll-Issue-Refactor
    
    Update linter
    
    * Update to only use BaseEngineSpec handle_cursor
    
    * Fix CI
    
    Co-authored-by: John Bodley <45...@users.noreply.github.com>
---
 superset/db_engine_specs/presto.py |  4 --
 superset/db_engine_specs/trino.py  | 97 ++++++++++++--------------------------
 2 files changed, 30 insertions(+), 71 deletions(-)

diff --git a/superset/db_engine_specs/presto.py b/superset/db_engine_specs/presto.py
index 86b12b8538..49810cdd6c 100644
--- a/superset/db_engine_specs/presto.py
+++ b/superset/db_engine_specs/presto.py
@@ -946,11 +946,7 @@ class PrestoEngineSpec(BaseEngineSpec):  # pylint: disable=too-many-public-metho
             sql = f"SHOW CREATE VIEW {schema}.{table}"
             try:
                 cls.execute(cursor, sql)
-                polled = cursor.poll()
 
-                while polled:
-                    time.sleep(0.2)
-                    polled = cursor.poll()
             except DatabaseError:  # not a VIEW
                 return None
             rows = cls.fetch_data(cursor, 1)
diff --git a/superset/db_engine_specs/trino.py b/superset/db_engine_specs/trino.py
index 4a5e9af01c..1759345069 100644
--- a/superset/db_engine_specs/trino.py
+++ b/superset/db_engine_specs/trino.py
@@ -21,9 +21,13 @@ from urllib import parse
 
 import simplejson as json
 from flask import current_app
+from sqlalchemy.engine.reflection import Inspector
 from sqlalchemy.engine.url import make_url, URL
+from sqlalchemy.orm import Session
 
 from superset.db_engine_specs.base import BaseEngineSpec
+from superset.db_engine_specs.presto import PrestoEngineSpec
+from superset.models.sql_lab import Query
 from superset.utils import core as utils
 
 if TYPE_CHECKING:
@@ -133,76 +137,35 @@ class TrinoEngineSpec(BaseEngineSpec):
         return True
 
     @classmethod
-    def estimate_statement_cost(cls, statement: str, cursor: Any) -> Dict[str, Any]:
-        """
-        Run a SQL query that estimates the cost of a given statement.
-
-        :param statement: A single SQL statement
-        :param cursor: Cursor instance
-        :return: JSON response from Trino
-        """
-        sql = f"EXPLAIN (TYPE IO, FORMAT JSON) {statement}"
-        cursor.execute(sql)
-
-        # the output from Trino is a single column and a single row containing
-        # JSON:
-        #
-        #   {
-        #     ...
-        #     "estimate" : {
-        #       "outputRowCount" : 8.73265878E8,
-        #       "outputSizeInBytes" : 3.41425774958E11,
-        #       "cpuCost" : 3.41425774958E11,
-        #       "maxMemory" : 0.0,
-        #       "networkCost" : 3.41425774958E11
-        #     }
-        #   }
-        result = json.loads(cursor.fetchone()[0])
-        return result
+    def get_table_names(
+        cls,
+        database: "Database",
+        inspector: Inspector,
+        schema: Optional[str],
+    ) -> List[str]:
+        return BaseEngineSpec.get_table_names(
+            database=database,
+            inspector=inspector,
+            schema=schema,
+        )
 
     @classmethod
-    def query_cost_formatter(
-        cls, raw_cost: List[Dict[str, Any]]
-    ) -> List[Dict[str, str]]:
-        """
-        Format cost estimate.
-
-        :param raw_cost: JSON estimate from Trino
-        :return: Human readable cost estimate
-        """
-
-        def humanize(value: Any, suffix: str) -> str:
-            try:
-                value = int(value)
-            except ValueError:
-                return str(value)
-
-            prefixes = ["K", "M", "G", "T", "P", "E", "Z", "Y"]
-            prefix = ""
-            to_next_prefix = 1000
-            while value > to_next_prefix and prefixes:
-                prefix = prefixes.pop(0)
-                value //= to_next_prefix
-
-            return f"{value} {prefix}{suffix}"
-
-        cost = []
-        columns = [
-            ("outputRowCount", "Output count", " rows"),
-            ("outputSizeInBytes", "Output size", "B"),
-            ("cpuCost", "CPU cost", ""),
-            ("maxMemory", "Max memory", "B"),
-            ("networkCost", "Network cost", ""),
-        ]
-        for row in raw_cost:
-            estimate: Dict[str, float] = row.get("estimate", {})
-            statement_cost = {}
-            for key, label, suffix in columns:
-                if key in estimate:
-                    statement_cost[label] = humanize(estimate[key], suffix).strip()
-            cost.append(statement_cost)
+    def get_view_names(
+        cls,
+        database: "Database",
+        inspector: Inspector,
+        schema: Optional[str],
+    ) -> List[str]:
+        return BaseEngineSpec.get_view_names(
+            database=database,
+            inspector=inspector,
+            schema=schema,
+        )
 
-        return cost
+    @classmethod
+    def handle_cursor(cls, cursor: Any, query: Query, session: Session) -> None:
+        """Updates progress information"""
+        BaseEngineSpec.handle_cursor(cursor=cursor, query=query, session=session)
 
     @staticmethod
     def get_extra_params(database: "Database") -> Dict[str, Any]:
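The loop removed from presto.py above was a classic busy-wait on cursor.poll(); after this patch, progress tracking is delegated to BaseEngineSpec.handle_cursor instead. For reference, a minimal sketch of the busy-wait pattern that was removed (FakeCursor and wait_for_query are illustrative stand-ins, not Superset or DB-API names):

```python
import time


class FakeCursor:
    """Stand-in for a cursor whose poll() reports in-flight query state."""

    def __init__(self, polls_until_done: int) -> None:
        self._remaining = polls_until_done

    def poll(self):
        # Truthy status dict while the query runs, None once it finishes.
        if self._remaining > 0:
            self._remaining -= 1
            return {"state": "RUNNING"}
        return None


def wait_for_query(cursor, interval: float = 0.0) -> int:
    """Busy-wait until poll() returns None; returns how many waits happened."""
    polls = 0
    polled = cursor.poll()
    while polled:
        polls += 1
        time.sleep(interval)
        polled = cursor.poll()
    return polls


print(wait_for_query(FakeCursor(3)))  # 3
```

Centralizing this in handle_cursor means each engine spec no longer re-implements its own sleep loop.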


[superset] 04/13: fix: Box Plot Chart throws an error when the average (AVG) / SUM is being calculated on the Metrics (#20235)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 1.5
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 4073b58aa2123991734872c364b344261e51b22f
Author: Diego Medina <di...@gmail.com>
AuthorDate: Wed Jun 1 22:00:04 2022 -0400

    fix: Box Plot Chart throws an error when the average (AVG) / SUM is being calculated on the Metrics (#20235)
    
    * fix: Box Plot Chart throws an error when the average (AVG) / SUM is being calculated on the Metrics
    
    * add test
    
    (cherry picked from commit 8638f59b4c7ebe954afe46bbfbd5880f1ae6afda)
---
 superset/utils/pandas_postprocessing/boxplot.py    |  9 +++++++-
 .../pandas_postprocessing/test_boxplot.py          | 25 ++++++++++++++++++++++
 2 files changed, 33 insertions(+), 1 deletion(-)

diff --git a/superset/utils/pandas_postprocessing/boxplot.py b/superset/utils/pandas_postprocessing/boxplot.py
index 4436af9182..40ce9200d3 100644
--- a/superset/utils/pandas_postprocessing/boxplot.py
+++ b/superset/utils/pandas_postprocessing/boxplot.py
@@ -18,7 +18,7 @@ from typing import Any, Callable, Dict, List, Optional, Set, Tuple, Union
 
 import numpy as np
 from flask_babel import gettext as _
-from pandas import DataFrame, Series
+from pandas import DataFrame, Series, to_numeric
 
 from superset.exceptions import InvalidPostProcessingError
 from superset.utils.core import PostProcessingBoxplotWhiskerType
@@ -122,4 +122,11 @@ def boxplot(
         for operator_name, operator in operators.items()
         for metric in metrics
     }
+
+    # nanpercentile needs numeric values, otherwise the isnan function
+    # that's used in the underlying function will fail
+    for column in metrics:
+        if df.dtypes[column] == np.object:
+            df[column] = to_numeric(df[column], errors="coerce")
+
     return aggregate(df, groupby=groupby, aggregates=aggregates)
diff --git a/tests/unit_tests/pandas_postprocessing/test_boxplot.py b/tests/unit_tests/pandas_postprocessing/test_boxplot.py
index 9252b0da78..27dff0adeb 100644
--- a/tests/unit_tests/pandas_postprocessing/test_boxplot.py
+++ b/tests/unit_tests/pandas_postprocessing/test_boxplot.py
@@ -124,3 +124,28 @@ def test_boxplot_percentile_incorrect_params():
             metrics=["cars"],
             percentiles=[10, 90, 10],
         )
+
+
+def test_boxplot_type_coercion():
+    df = names_df
+    df["cars"] = df["cars"].astype(str)
+    df = boxplot(
+        df=df,
+        groupby=["region"],
+        whisker_type=PostProcessingBoxplotWhiskerType.TUKEY,
+        metrics=["cars"],
+    )
+
+    columns = {column for column in df.columns}
+    assert columns == {
+        "cars__mean",
+        "cars__median",
+        "cars__q1",
+        "cars__q3",
+        "cars__max",
+        "cars__min",
+        "cars__count",
+        "cars__outliers",
+        "region",
+    }
+    assert len(df) == 4
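The to_numeric coercion added in boxplot.py exists because numpy's nan-aware percentile functions reject object-dtype input. A small standalone illustration of the same coercion (sample data only):

```python
import numpy as np
import pandas as pd

# Metric values arriving as strings leave the column with object dtype.
df = pd.DataFrame({"region": ["a", "a", "b"], "cars": ["1", "2", "oops"]})

# Coerce before computing percentiles; unparseable values become NaN,
# which nan-aware aggregations then ignore.
df["cars"] = pd.to_numeric(df["cars"], errors="coerce")

print(df["cars"].dtype)                  # float64
print(np.nanpercentile(df["cars"], 50))  # 1.5
```

With errors="coerce", bad values degrade to NaN instead of raising, which matches the whisker/outlier math the chart already performs on missing data.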


[superset] 09/13: perf: Implement model specific lookups by id to improve performance (#20974)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 1.5
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 1e8259a4108a1e5747570345a6df57239df07bfd
Author: Bogdan <b....@gmail.com>
AuthorDate: Tue Aug 9 09:59:31 2022 -0700

    perf: Implement model specific lookups by id to improve performance (#20974)
    
    * Implement model specific lookups by id to improve performance
    
    * Address comments e.g. better variable names and test cleanup
    
    * commit after cleanup
    
    * even better name and test cleanup via rollback
    
    Co-authored-by: Bogdan Kyryliuk <bo...@dropbox.com>
---
 superset/common/query_context_processor.py         |  2 +
 superset/dao/base.py                               |  7 ++-
 superset/explore/utils.py                          |  6 +-
 tests/integration_tests/datasets/api_tests.py      |  1 +
 .../explore/form_data/api_tests.py                 | 10 +--
 .../explore/permalink/api_tests.py                 |  2 +-
 tests/unit_tests/charts/dao/__init__.py            | 16 +++++
 tests/unit_tests/charts/dao/dao_tests.py           | 67 ++++++++++++++++++++
 tests/unit_tests/datasets/dao/__init__.py          | 16 +++++
 tests/unit_tests/datasets/dao/dao_tests.py         | 73 ++++++++++++++++++++++
 10 files changed, 190 insertions(+), 10 deletions(-)

diff --git a/superset/common/query_context_processor.py b/superset/common/query_context_processor.py
index c87e878fdd..d528aa3293 100644
--- a/superset/common/query_context_processor.py
+++ b/superset/common/query_context_processor.py
@@ -473,6 +473,8 @@ class QueryContextProcessor:
         chart = ChartDAO.find_by_id(annotation_layer["value"])
         if not chart:
             raise QueryObjectValidationError(_("The chart does not exist"))
+        if not chart.datasource:
+            raise QueryObjectValidationError(_("The chart datasource does not exist"))
         form_data = chart.form_data.copy()
         try:
             viz_obj = get_viz(
diff --git a/superset/dao/base.py b/superset/dao/base.py
index 607967e304..981243d0db 100644
--- a/superset/dao/base.py
+++ b/superset/dao/base.py
@@ -50,14 +50,17 @@ class BaseDAO:
 
     @classmethod
     def find_by_id(
-        cls, model_id: Union[str, int], session: Session = None
+        cls,
+        model_id: Union[str, int],
+        session: Session = None,
+        skip_base_filter: bool = False,
     ) -> Optional[Model]:
         """
         Find a model by id, if defined applies `base_filter`
         """
         session = session or db.session
         query = session.query(cls.model_cls)
-        if cls.base_filter:
+        if cls.base_filter and not skip_base_filter:
             data_model = SQLAInterface(cls.model_cls, session)
             query = cls.base_filter(  # pylint: disable=not-callable
                 cls.id_column_name, data_model
diff --git a/superset/explore/utils.py b/superset/explore/utils.py
index 7ab29de2f7..989294619f 100644
--- a/superset/explore/utils.py
+++ b/superset/explore/utils.py
@@ -35,7 +35,8 @@ from superset.views.utils import is_owner
 
 def check_dataset_access(dataset_id: int) -> Optional[bool]:
     if dataset_id:
-        dataset = DatasetDAO.find_by_id(dataset_id)
+        # Access checks below, no need to validate them twice as they can be expensive.
+        dataset = DatasetDAO.find_by_id(dataset_id, skip_base_filter=True)
         if dataset:
             can_access_datasource = security_manager.can_access_datasource(dataset)
             if can_access_datasource:
@@ -48,7 +49,8 @@ def check_access(dataset_id: int, chart_id: Optional[int], actor: User) -> None:
     check_dataset_access(dataset_id)
     if not chart_id:
         return
-    chart = ChartDAO.find_by_id(chart_id)
+    # Access checks below, no need to validate them twice as they can be expensive.
+    chart = ChartDAO.find_by_id(chart_id, skip_base_filter=True)
     if chart:
         can_access_chart = (
             is_user_admin()
diff --git a/tests/integration_tests/datasets/api_tests.py b/tests/integration_tests/datasets/api_tests.py
index 2031d64557..e656201030 100644
--- a/tests/integration_tests/datasets/api_tests.py
+++ b/tests/integration_tests/datasets/api_tests.py
@@ -1579,6 +1579,7 @@ class TestDatasetApi(SupersetTestCase):
         rv = self.client.get(uri)
         assert rv.status_code == 404
         self.logout()
+
         self.login(username="gamma")
         table = self.get_birth_names_dataset()
         uri = f"api/v1/dataset/{table.id}/related_objects"
diff --git a/tests/integration_tests/explore/form_data/api_tests.py b/tests/integration_tests/explore/form_data/api_tests.py
index c05be00e96..af7df78f90 100644
--- a/tests/integration_tests/explore/form_data/api_tests.py
+++ b/tests/integration_tests/explore/form_data/api_tests.py
@@ -119,7 +119,7 @@ def test_post_access_denied(client, chart_id: int, dataset_id: int):
         "form_data": INITIAL_FORM_DATA,
     }
     resp = client.post("api/v1/explore/form_data", json=payload)
-    assert resp.status_code == 404
+    assert resp.status_code == 403
 
 
 def test_post_same_key_for_same_context(client, chart_id: int, dataset_id: int):
@@ -310,7 +310,7 @@ def test_put_access_denied(client, chart_id: int, dataset_id: int):
         "form_data": UPDATED_FORM_DATA,
     }
     resp = client.put(f"api/v1/explore/form_data/{KEY}", json=payload)
-    assert resp.status_code == 404
+    assert resp.status_code == 403
 
 
 def test_put_not_owner(client, chart_id: int, dataset_id: int):
@@ -321,7 +321,7 @@ def test_put_not_owner(client, chart_id: int, dataset_id: int):
         "form_data": UPDATED_FORM_DATA,
     }
     resp = client.put(f"api/v1/explore/form_data/{KEY}", json=payload)
-    assert resp.status_code == 404
+    assert resp.status_code == 403
 
 
 def test_get_key_not_found(client):
@@ -341,7 +341,7 @@ def test_get(client):
 def test_get_access_denied(client):
     login(client, "gamma")
     resp = client.get(f"api/v1/explore/form_data/{KEY}")
-    assert resp.status_code == 404
+    assert resp.status_code == 403
 
 
 @patch("superset.security.SupersetSecurityManager.can_access_datasource")
@@ -361,7 +361,7 @@ def test_delete(client):
 def test_delete_access_denied(client):
     login(client, "gamma")
     resp = client.delete(f"api/v1/explore/form_data/{KEY}")
-    assert resp.status_code == 404
+    assert resp.status_code == 403
 
 
 def test_delete_not_owner(client, chart_id: int, dataset_id: int, admin_id: int):
diff --git a/tests/integration_tests/explore/permalink/api_tests.py b/tests/integration_tests/explore/permalink/api_tests.py
index a44bc70a7b..d25f3f1149 100644
--- a/tests/integration_tests/explore/permalink/api_tests.py
+++ b/tests/integration_tests/explore/permalink/api_tests.py
@@ -85,7 +85,7 @@ def test_post(client, form_data: Dict[str, Any], permalink_salt: str):
 def test_post_access_denied(client, form_data):
     login(client, "gamma")
     resp = client.post(f"api/v1/explore/permalink", json={"formData": form_data})
-    assert resp.status_code == 404
+    assert resp.status_code == 403
 
 
 def test_get_missing_chart(client, chart, permalink_salt: str) -> None:
diff --git a/tests/unit_tests/charts/dao/__init__.py b/tests/unit_tests/charts/dao/__init__.py
new file mode 100644
index 0000000000..13a83393a9
--- /dev/null
+++ b/tests/unit_tests/charts/dao/__init__.py
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
diff --git a/tests/unit_tests/charts/dao/dao_tests.py b/tests/unit_tests/charts/dao/dao_tests.py
new file mode 100644
index 0000000000..15310712a5
--- /dev/null
+++ b/tests/unit_tests/charts/dao/dao_tests.py
@@ -0,0 +1,67 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from typing import Iterator
+
+import pytest
+from sqlalchemy.orm.session import Session
+
+from superset.utils.core import DatasourceType
+
+
+@pytest.fixture
+def session_with_data(session: Session) -> Iterator[Session]:
+    from superset.models.slice import Slice
+
+    engine = session.get_bind()
+    Slice.metadata.create_all(engine)  # pylint: disable=no-member
+
+    slice_obj = Slice(
+        id=1,
+        datasource_id=1,
+        datasource_type=DatasourceType.TABLE,
+        datasource_name="tmp_perm_table",
+        slice_name="slice_name",
+    )
+
+    session.add(slice_obj)
+    session.commit()
+    yield session
+    session.rollback()
+
+
+def test_slice_find_by_id_skip_base_filter(session_with_data: Session) -> None:
+    from superset.charts.dao import ChartDAO
+    from superset.models.slice import Slice
+
+    result = ChartDAO.find_by_id(1, session=session_with_data, skip_base_filter=True)
+
+    assert result
+    assert 1 == result.id
+    assert "slice_name" == result.slice_name
+    assert isinstance(result, Slice)
+
+
+def test_datasource_find_by_id_skip_base_filter_not_found(
+    session_with_data: Session,
+) -> None:
+    from superset.charts.dao import ChartDAO
+
+    result = ChartDAO.find_by_id(
+        125326326, session=session_with_data, skip_base_filter=True
+    )
+    assert result is None
diff --git a/tests/unit_tests/datasets/dao/__init__.py b/tests/unit_tests/datasets/dao/__init__.py
new file mode 100644
index 0000000000..13a83393a9
--- /dev/null
+++ b/tests/unit_tests/datasets/dao/__init__.py
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
diff --git a/tests/unit_tests/datasets/dao/dao_tests.py b/tests/unit_tests/datasets/dao/dao_tests.py
new file mode 100644
index 0000000000..31aa9f27d0
--- /dev/null
+++ b/tests/unit_tests/datasets/dao/dao_tests.py
@@ -0,0 +1,73 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from typing import Iterator
+
+import pytest
+from sqlalchemy.orm.session import Session
+
+
+@pytest.fixture
+def session_with_data(session: Session) -> Iterator[Session]:
+    from superset.connectors.sqla.models import SqlaTable
+    from superset.models.core import Database
+
+    engine = session.get_bind()
+    SqlaTable.metadata.create_all(engine)  # pylint: disable=no-member
+
+    db = Database(database_name="my_database", sqlalchemy_uri="sqlite://")
+    sqla_table = SqlaTable(
+        table_name="my_sqla_table",
+        columns=[],
+        metrics=[],
+        database=db,
+    )
+
+    session.add(db)
+    session.add(sqla_table)
+    session.flush()
+    yield session
+    session.rollback()
+
+
+def test_datasource_find_by_id_skip_base_filter(session_with_data: Session) -> None:
+    from superset.connectors.sqla.models import SqlaTable
+    from superset.datasets.dao import DatasetDAO
+
+    result = DatasetDAO.find_by_id(
+        1,
+        session=session_with_data,
+        skip_base_filter=True,
+    )
+
+    assert result
+    assert 1 == result.id
+    assert "my_sqla_table" == result.table_name
+    assert isinstance(result, SqlaTable)
+
+
+def test_datasource_find_by_id_skip_base_filter_not_found(
+    session_with_data: Session,
+) -> None:
+    from superset.datasets.dao import DatasetDAO
+
+    result = DatasetDAO.find_by_id(
+        125326326,
+        session=session_with_data,
+        skip_base_filter=True,
+    )
+    assert result is None
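The skip_base_filter flag lets callers that already perform their own access checks bypass the potentially expensive base_filter query. A stripped-down sketch of the lookup logic, with plain dicts standing in for ORM rows (BaseDAO/ChartDAO here are simplified illustrations, not the real Superset classes):

```python
from typing import Callable, Optional

Record = dict


class BaseDAO:
    """Minimal sketch of the DAO lookup over an in-memory record list."""

    records: list = []
    base_filter: Optional[Callable[[Record], bool]] = None

    @classmethod
    def find_by_id(cls, model_id: int, skip_base_filter: bool = False) -> Optional[Record]:
        rows = cls.records
        # Apply the (expensive) security filter unless the caller opts out.
        if cls.base_filter and not skip_base_filter:
            rows = [r for r in rows if cls.base_filter(r)]
        return next((r for r in rows if r["id"] == model_id), None)


class ChartDAO(BaseDAO):
    records = [{"id": 1, "owner": "alice"}, {"id": 2, "owner": "bob"}]
    # Stand-in for a datasource-access filter evaluated per row.
    base_filter = staticmethod(lambda r: r["owner"] == "alice")


print(ChartDAO.find_by_id(2))                          # None (filtered out)
print(ChartDAO.find_by_id(2, skip_base_filter=True))   # {'id': 2, 'owner': 'bob'}
```

This is also why the API tests above change 404s to 403s: the row is now found, and the explicit access check rejects it instead of the filter hiding it.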


[superset] 05/13: Fixes #20155 (#20273)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 1.5
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 6dee53682e18ea1474c60feec1188231a6c6cc01
Author: Ensky <en...@gmail.com>
AuthorDate: Mon Jun 6 18:57:54 2022 +0800

    Fixes #20155 (#20273)
    
    (cherry picked from commit 77e326fd95f4322951aecfe226a2c47c40af4d55)
---
 docs/docs/installation/running-on-kubernetes.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs/installation/running-on-kubernetes.mdx b/docs/docs/installation/running-on-kubernetes.mdx
index d87359f146..8f4d97d41f 100644
--- a/docs/docs/installation/running-on-kubernetes.mdx
+++ b/docs/docs/installation/running-on-kubernetes.mdx
@@ -130,7 +130,7 @@ database drivers so that you can connect to those datasources in your Superset i
 ```yaml
 bootstrapScript: |
   #!/bin/bash
-  pip install psycopg2==2.8.5 \
+  pip install psycopg2==2.9.1 \
     redis==3.2.1 \
     pybigquery==2.26.0 \
     elasticsearch-dbapi==0.2.5 &&\


[superset] 10/13: Memoize the common_bootstrap_payload (#21018)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 1.5
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 6bcb9674da5d42c414b0d24ed3d6535b4837b613
Author: Bogdan <b....@gmail.com>
AuthorDate: Tue Aug 16 08:27:12 2022 -0700

    Memoize the common_bootstrap_payload (#21018)
    
    Try patch
    
    Co-authored-by: Bogdan Kyryliuk <bo...@dropbox.com>
    (cherry picked from commit 495a205dec577097651d929bb2f062b0f5003e2e)
---
 superset/views/base.py                | 8 +++++++-
 tests/integration_tests/core_tests.py | 4 +++-
 2 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/superset/views/base.py b/superset/views/base.py
index e505af5300..173ba5eb19 100644
--- a/superset/views/base.py
+++ b/superset/views/base.py
@@ -71,6 +71,7 @@ from superset.exceptions import (
     SupersetException,
     SupersetSecurityException,
 )
+from superset.extensions import cache_manager
 from superset.models.helpers import ImportExportMixin
 from superset.models.reports import ReportRecipientType
 from superset.superset_typing import FlaskResponse
@@ -345,8 +346,13 @@ def menu_data() -> Dict[str, Any]:
     }
 
 
+@cache_manager.cache.memoize(timeout=60)
 def common_bootstrap_payload() -> Dict[str, Any]:
-    """Common data always sent to the client"""
+    """Common data always sent to the client
+
+    The function is memoized as the return value only changes based
+    on configuration and feature flag values.
+    """
     messages = get_flashed_messages(with_categories=True)
     locale = str(get_locale())
 
diff --git a/tests/integration_tests/core_tests.py b/tests/integration_tests/core_tests.py
index 796ae8a8d8..5c2b81b283 100644
--- a/tests/integration_tests/core_tests.py
+++ b/tests/integration_tests/core_tests.py
@@ -62,7 +62,7 @@ from superset.connectors.sqla.models import SqlaTable
 from superset.db_engine_specs.base import BaseEngineSpec
 from superset.db_engine_specs.mssql import MssqlEngineSpec
 from superset.exceptions import SupersetException
-from superset.extensions import async_query_manager
+from superset.extensions import async_query_manager, cache_manager
 from superset.models import core as models
 from superset.models.annotations import Annotation, AnnotationLayer
 from superset.models.dashboard import Dashboard
@@ -1434,6 +1434,8 @@ class TestCore(SupersetTestCase):
         """
         Functions in feature flags don't break bootstrap data serialization.
         """
+        # feature flags are cached
+        cache_manager.cache.clear()
         self.login()
 
         encoded = json.dumps(
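Flask-Caching's memoize decorator, used above with a 60-second timeout, caches a function's return value keyed by its arguments. A dependency-free sketch of the same idea (this tiny memoize is illustrative only, not the Flask-Caching implementation):

```python
import time
from functools import wraps


def memoize(timeout: float):
    """Tiny TTL memoizer keyed by positional arguments."""

    def decorator(func):
        store = {}  # args -> (expires_at, value)

        @wraps(func)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit and hit[0] > now:
                return hit[1]          # still fresh: serve from cache
            value = func(*args)
            store[args] = (now + timeout, value)
            return value

        wrapper.cache_clear = store.clear  # tests must clear, as core_tests.py does
        return wrapper

    return decorator


calls = 0


@memoize(timeout=60)
def common_payload():
    global calls
    calls += 1
    return {"calls": calls}


common_payload()
common_payload()
print(calls)  # 1: the second call was served from the cache
```

The test change above (cache_manager.cache.clear()) follows directly from this: once memoized, a stale payload survives across test cases unless the cache is cleared first.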


[superset] 01/13: fix: sqloxide optional (#19570)

Posted by mi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch 1.5
in repository https://gitbox.apache.org/repos/asf/superset.git

commit af5ded3fcbaf964fcf7d6c56be2fc26540987fcd
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Wed Apr 6 16:11:38 2022 -0700

    fix: sqloxide optional (#19570)
---
 UPDATING.md                         |  7 +++++++
 requirements/testing.in             |  1 +
 requirements/testing.txt            | 13 +++++--------
 setup.py                            |  1 -
 superset/migrations/shared/utils.py | 10 +++++++++-
 5 files changed, 22 insertions(+), 10 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 0923865c62..24d03c4a21 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -22,6 +22,12 @@ under the License.
 This file documents any backwards-incompatible changes in Superset and
 assists people when migrating to a new version.
 
+## 1.5.2
+
+### Other
+
+- [19570](https://github.com/apache/superset/pull/19570): makes [sqloxide](https://pypi.org/project/sqloxide/) optional so the SIP-68 migration can be run on aarch64. If the migration is taking too long installing sqloxide manually should improve the performance.
+
 ## 1.5.0
 
 ### Breaking Changes
@@ -54,6 +60,7 @@ assists people when migrating to a new version.
 ## 1.4.1
 
 ### Breaking Changes
+
 - [17984](https://github.com/apache/superset/pull/17984): Default Flask SECRET_KEY has changed for security reasons. You should always override with your own secret. Set `PREVIOUS_SECRET_KEY` (ex: PREVIOUS_SECRET_KEY = "\2\1thisismyscretkey\1\2\\e\\y\\y\\h") with your previous key and use `superset re-encrypt-secrets` to rotate you current secrets
 
 ### Potential Downtime
diff --git a/requirements/testing.in b/requirements/testing.in
index c33f245280..082dbc934a 100644
--- a/requirements/testing.in
+++ b/requirements/testing.in
@@ -36,6 +36,7 @@ pytest
 pytest-cov
 statsd
 pytest-mock
+sqloxide
 # DB dependencies
 -e file:.[bigquery]
 -e file:.[trino]
diff --git a/requirements/testing.txt b/requirements/testing.txt
index a02d250526..ec61b36757 100644
--- a/requirements/testing.txt
+++ b/requirements/testing.txt
@@ -1,4 +1,4 @@
-# SHA1:7a8e256097b4758bdeda2529d3d4d31e421e1a3c
+# SHA1:e273e8da6bfd5f6f8563fe067e243297cc7c588c
 #
 # This file is autogenerated by pip-compile-multi
 # To update, run:
@@ -52,7 +52,6 @@ google-auth-oauthlib==0.4.6
 google-cloud-bigquery[bqstorage,pandas]==2.29.0
     # via
     #   -r requirements/testing.in
-    #   apache-superset
     #   pandas-gbq
     #   pybigquery
 google-cloud-bigquery-storage==2.9.1
@@ -105,9 +104,7 @@ openapi-schema-validator==0.1.5
 openapi-spec-validator==0.3.1
     # via -r requirements/testing.in
 pandas-gbq==0.15.0
-    # via
-    #   -r requirements/testing.in
-    #   apache-superset
+    # via -r requirements/testing.in
 parameterized==0.8.1
     # via -r requirements/testing.in
 parso==0.8.2
@@ -138,9 +135,7 @@ pyasn1==0.4.8
 pyasn1-modules==0.2.8
     # via google-auth
 pybigquery==0.10.2
-    # via
-    #   -r requirements/testing.in
-    #   apache-superset
+    # via -r requirements/testing.in
 pydata-google-auth==1.2.0
     # via pandas-gbq
 pyfakefs==4.5.0
@@ -168,6 +163,8 @@ rsa==4.7.2
     # via google-auth
 sqlalchemy-trino==0.4.1
     # via apache-superset
+sqloxide==0.1.15
+    # via -r requirements/testing.in
 statsd==3.3.0
     # via -r requirements/testing.in
 traitlets==5.0.5
diff --git a/setup.py b/setup.py
index 02b3924c72..7af8fe619a 100644
--- a/setup.py
+++ b/setup.py
@@ -112,7 +112,6 @@ setup(
         "slackclient==2.5.0",  # PINNED! slack changes file upload api in the future versions
         "sqlalchemy>=1.3.16, <1.4, !=1.3.21",
         "sqlalchemy-utils>=0.37.8, <0.38",
-        "sqloxide==0.1.15",
         "sqlparse==0.3.0",  # PINNED! see https://github.com/andialbrecht/sqlparse/issues/562
         "tabulate==0.8.9",
         # needed to support Literal (3.8) and TypeGuard (3.10)
diff --git a/superset/migrations/shared/utils.py b/superset/migrations/shared/utils.py
index bff25e05d1..c54de83c42 100644
--- a/superset/migrations/shared/utils.py
+++ b/superset/migrations/shared/utils.py
@@ -21,7 +21,11 @@ from alembic import op
 from sqlalchemy import engine_from_config
 from sqlalchemy.engine import reflection
 from sqlalchemy.exc import NoSuchTableError
-from sqloxide import parse_sql
+
+try:
+    from sqloxide import parse_sql
+except ImportError:
+    parse_sql = None
 
 from superset.sql_parse import ParsedQuery, Table
 
@@ -88,6 +92,10 @@ def extract_table_references(sql_text: str, sqla_dialect: str) -> Set[Table]:
     """
     Return all the dependencies from a SQL sql_text.
     """
+    if not parse_sql:
+        parsed = ParsedQuery(sql_text)
+        return parsed.tables
+
     dialect = "generic"
     for dialect, sqla_dialects in sqloxide_dialects.items():
         if sqla_dialect in sqla_dialects:
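The try/except import above is the standard optional-dependency pattern: use the fast Rust-backed parser when a wheel exists for the platform, and fall back to a pure-Python path otherwise. A self-contained sketch (fancy_parser is a hypothetical stand-in for sqloxide, and the fallback tokenizer is deliberately naive compared to ParsedQuery):

```python
try:
    from fancy_parser import parse_sql  # hypothetical stand-in for sqloxide
except ImportError:  # no wheel for this platform (e.g. aarch64)
    parse_sql = None


def extract_tables(sql: str) -> set:
    """Prefer the fast parser; otherwise use a naive pure-Python fallback."""
    if parse_sql is None:
        words = sql.replace(",", " ").split()
        # Take the token following each FROM/JOIN keyword.
        return {w for prev, w in zip(words, words[1:])
                if prev.upper() in ("FROM", "JOIN")}
    # With the fast parser present we would walk its parse tree here.
    return set()


print(extract_tables("SELECT * FROM a JOIN b ON a.id = b.id"))  # {'a', 'b'} (set order varies)
```

Binding the name to None on ImportError keeps the rest of the module import-safe, so the availability check is a cheap `if not parse_sql` at call time, exactly as in extract_table_references above.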