Posted to commits@superset.apache.org by el...@apache.org on 2022/09/20 20:50:10 UTC

[superset] branch 2.0-test created (now 79351a4d92)

This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a change to branch 2.0-test
in repository https://gitbox.apache.org/repos/asf/superset.git


      at 79351a4d92 fix: disallow users from viewing other user's profile on config (#21302)

This branch includes the following new commits:

     new 56137ebbe5 fix: logger message (#20714)
     new 5efee17def fix: make max-requests and max-requests-jitter adjustable (#20733)
     new 43b8f18a21 fix: getting default value in run-server.sh (#20736)
     new 8c2ca2d8d8 Temporal X Axis values are not properly displayed if the time column has a custom label defined (#20819)
     new 6a5b12ec8c Big Number Viz: (#20946)
     new 346c035690 fix: Explore scrolled down when navigating from dashboard (#20962)
     new 2db82d578c fix(plugin-chart-echarts): invalid total label location for negative values in stacked bar chart (#21032)
     new e91222eb65 chore(deps): unpin holidays dependency version (#21091)
     new 884e2f1ca7 fix(plugin-chart-echarts): gauge chart enhancements and fixes (#21007)
     new b4df82591e Memoize the common_bootstrap_payload (#21018)
     new 47c3cd10bd fix(dashboard): Fix scroll behaviour in DashboardBuilderSidepane (#20969)
     new c3e04d7ebf fix(plugin-chart-handlebars): order by control not work (#21005)
     new 05d7c3d74d fix(native filters): groupby filter issue (#21084)
     new 9337dec038 fix(sqllab): missing zero values while copy-to-clipboard (#21153)
     new 32736680da fix(database-list): hidden upload file button if no permission (#21216)
     new 4298690e16 fix(celery cache warmup): add auth and use warm_up_cache endpoint (#21076)
     new 2b760d0775 feat: adds TLS certificate validation option for SMTP (#21272)
     new 9b8c20e6c0 feat(embedded): provides filter bar visibility setting on embedded dashboard (#21069) (#21070)
     new 551f306479 fix(explore): Time column label not formatted when GENERIC_X_AXES enabled (#21294)
     new 150caaf787 fix(plugin-chart-echarts): show zero value in tooltip (#21296)
     new 3d3ea39eb9 fix: cached common bootstrap Revert (#21018) (#21419)
     new a9c04284c4 fix: database permissions on update and delete (avoid orphaned perms) (#20081)
     new 5ff8eb1a4b fix(plugin-chart-echarts): missing value format in mixed timeseries (#21044)
     new f106f6e66a fix: Add french translation missing (#20061)
     new 3fb48dd1e9 fix(sqllab): Fix cursor alignment in SQL lab editor by avoiding Lucida Console font on Windows (#21380)
     new 0da708e359 fix: set correct favicon from config for login and FAB list views (#21498)
     new 26963d371c fix(explore): Prevent unnecessary series limit subquery (#21154)
     new cb91fe421c fix: dataset name change and permission change (#21161)
     new 79351a4d92 fix: disallow users from viewing other user's profile on config (#21302)

The 29 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[superset] 10/29: Memoize the common_bootstrap_payload (#21018)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to branch 2.0-test
in repository https://gitbox.apache.org/repos/asf/superset.git

commit b4df82591e3826ae383e3719eef832499b306b56
Author: Bogdan <b....@gmail.com>
AuthorDate: Tue Aug 16 08:27:12 2022 -0700

    Memoize the common_bootstrap_payload (#21018)
    
    Try patch
    
    Co-authored-by: Bogdan Kyryliuk <bo...@dropbox.com>
    (cherry picked from commit 495a205dec577097651d929bb2f062b0f5003e2e)
---
 superset/views/base.py                | 8 +++++++-
 tests/integration_tests/core_tests.py | 4 +++-
 2 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/superset/views/base.py b/superset/views/base.py
index 9460b8d1ae..f5a8c30184 100644
--- a/superset/views/base.py
+++ b/superset/views/base.py
@@ -71,6 +71,7 @@ from superset.exceptions import (
     SupersetException,
     SupersetSecurityException,
 )
+from superset.extensions import cache_manager
 from superset.models.helpers import ImportExportMixin
 from superset.models.reports import ReportRecipientType
 from superset.superset_typing import FlaskResponse
@@ -343,8 +344,13 @@ def menu_data() -> Dict[str, Any]:
     }
 
 
+@cache_manager.cache.memoize(timeout=60)
 def common_bootstrap_payload() -> Dict[str, Any]:
-    """Common data always sent to the client"""
+    """Common data always sent to the client
+
+    The function is memoized as the return value only changes based
+    on configuration and feature flag values.
+    """
     messages = get_flashed_messages(with_categories=True)
     locale = str(get_locale())
 
diff --git a/tests/integration_tests/core_tests.py b/tests/integration_tests/core_tests.py
index 58943246c5..6aa1eac0ec 100644
--- a/tests/integration_tests/core_tests.py
+++ b/tests/integration_tests/core_tests.py
@@ -62,7 +62,7 @@ from superset.connectors.sqla.models import SqlaTable
 from superset.db_engine_specs.base import BaseEngineSpec
 from superset.db_engine_specs.mssql import MssqlEngineSpec
 from superset.exceptions import SupersetException
-from superset.extensions import async_query_manager
+from superset.extensions import async_query_manager, cache_manager
 from superset.models import core as models
 from superset.models.annotations import Annotation, AnnotationLayer
 from superset.models.dashboard import Dashboard
@@ -1400,6 +1400,8 @@ class TestCore(SupersetTestCase):
         """
         Functions in feature flags don't break bootstrap data serialization.
         """
+        # feature flags are cached
+        cache_manager.cache.clear()
         self.login()
 
         encoded = json.dumps(


[superset] 12/29: fix(plugin-chart-handlebars): order by control not work (#21005)


commit c3e04d7ebf1b7703d108a3658d4ccd60f1d7de49
Author: Stephen Liu <75...@qq.com>
AuthorDate: Mon Aug 22 15:00:34 2022 +0800

    fix(plugin-chart-handlebars): order by control not work (#21005)
    
    (cherry picked from commit e70699fb433849e07af81ea1812f20aa271d028e)
---
 .../plugins/plugin-chart-handlebars/src/plugin/buildQuery.ts   | 10 +++++++---
 .../plugin-chart-handlebars/src/plugin/controlPanel.tsx        |  3 ++-
 .../plugin-chart-handlebars/src/plugin/controls/orderBy.tsx    |  8 +++++++-
 3 files changed, 16 insertions(+), 5 deletions(-)

diff --git a/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/buildQuery.ts b/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/buildQuery.ts
index 36bcb96515..3dc7bf87a3 100644
--- a/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/buildQuery.ts
+++ b/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/buildQuery.ts
@@ -16,15 +16,19 @@
  * specific language governing permissions and limitations
  * under the License.
  */
-import { buildQueryContext, QueryFormData } from '@superset-ui/core';
+import {
+  buildQueryContext,
+  normalizeOrderBy,
+  QueryFormData,
+} from '@superset-ui/core';
 
 export default function buildQuery(formData: QueryFormData) {
-  const { metric, sort_by_metric, groupby } = formData;
+  const { groupby } = formData;
 
   return buildQueryContext(formData, baseQueryObject => [
     {
       ...baseQueryObject,
-      ...(sort_by_metric && { orderby: [[metric, false]] }),
+      orderby: normalizeOrderBy(baseQueryObject).orderby,
       ...(groupby && { groupby }),
     },
   ]);
diff --git a/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/controlPanel.tsx
index da0ba7d589..aa4efe6212 100644
--- a/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/controlPanel.tsx
+++ b/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/controlPanel.tsx
@@ -61,9 +61,10 @@ const config: ControlPanelConfig = {
         [metricsControlSetItem, allColumnsControlSetItem],
         [percentMetricsControlSetItem],
         [timeSeriesLimitMetricControlSetItem, orderByControlSetItem],
+        [orderDescendingControlSetItem],
         serverPaginationControlSetRow,
         [rowLimitControlSetItem, serverPageLengthControlSetItem],
-        [includeTimeControlSetItem, orderDescendingControlSetItem],
+        [includeTimeControlSetItem],
         [showTotalsControlSetItem],
         ['adhoc_filters'],
         emitFilterControl,
diff --git a/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/controls/orderBy.tsx b/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/controls/orderBy.tsx
index 93002bd49b..d2f52e8e9b 100644
--- a/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/controls/orderBy.tsx
+++ b/superset-frontend/plugins/plugin-chart-handlebars/src/plugin/controls/orderBy.tsx
@@ -18,6 +18,7 @@
  */
 import { ControlSetItem, Dataset } from '@superset-ui/chart-controls';
 import { t } from '@superset-ui/core';
+import { isEmpty } from 'lodash';
 import { isAggMode, isRawMode } from './shared';
 
 export const orderByControlSetItem: ControlSetItem = {
@@ -45,7 +46,12 @@ export const orderDescendingControlSetItem: ControlSetItem = {
     label: t('Sort descending'),
     default: true,
     description: t('Whether to sort descending or ascending'),
-    visibility: isAggMode,
+    visibility: ({ controls }) =>
+      !!(
+        isAggMode({ controls }) &&
+        controls?.timeseries_limit_metric.value &&
+        !isEmpty(controls?.timeseries_limit_metric.value)
+      ),
     resetOnHide: false,
   },
 };
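The `buildQuery.ts` change above replaces the ad-hoc `sort_by_metric` handling with `normalizeOrderBy` from `@superset-ui/core`. A rough Python sketch of the fallback order such a normalization implies — an explicit `orderby` wins, then the sort-by metric with its descending flag, then the first metric; the function name, key names, and exact fallback rules here are assumptions for illustration, not the library's actual implementation:

```python
def normalize_order_by(query_obj: dict) -> dict:
    """Hypothetical sketch of orderby normalization.

    Mirrors the intent of @superset-ui/core's normalizeOrderBy as used in
    the patch; keys and fallback rules are illustrative assumptions.
    """
    if query_obj.get("orderby"):
        return query_obj  # an explicit ordering always wins

    sort_metric = query_obj.get("timeseries_limit_metric")
    if sort_metric:
        descending = query_obj.get("order_desc", True)
        # orderby entries are (expression, is_ascending) pairs
        return {**query_obj, "orderby": [(sort_metric, not descending)]}

    metrics = query_obj.get("metrics") or []
    if metrics:
        return {**query_obj, "orderby": [(metrics[0], False)]}
    return query_obj
```

The control-panel change complements this: the "Sort descending" checkbox is only shown when a sort-by metric is actually set, since the flag has no effect otherwise.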


[superset] 06/29: fix: Explore scrolled down when navigating from dashboard (#20962)


commit 346c035690524f564421af6c18e4911c9ee5a268
Author: Kamil Gabryjelski <ka...@gmail.com>
AuthorDate: Wed Aug 3 21:55:52 2022 +0200

    fix: Explore scrolled down when navigating from dashboard (#20962)
    
    (cherry picked from commit e4fc5564ced1e2ad2f475629ce082ededd063ba9)
---
 superset-frontend/src/views/App.tsx         |  2 ++
 superset-frontend/src/views/ScrollToTop.tsx | 31 +++++++++++++++++++++++++++++
 2 files changed, 33 insertions(+)

diff --git a/superset-frontend/src/views/App.tsx b/superset-frontend/src/views/App.tsx
index 04ca777c3d..ecdbddbd56 100644
--- a/superset-frontend/src/views/App.tsx
+++ b/superset-frontend/src/views/App.tsx
@@ -35,6 +35,7 @@ import setupApp from 'src/setup/setupApp';
 import { routes, isFrontendRoute } from 'src/views/routes';
 import { Logger } from 'src/logger/LogUtils';
 import { RootContextProviders } from './RootContextProviders';
+import { ScrollToTop } from './ScrollToTop';
 
 setupApp();
 
@@ -60,6 +61,7 @@ const LocationPathnameLogger = () => {
 
 const App = () => (
   <Router>
+    <ScrollToTop />
     <LocationPathnameLogger />
     <RootContextProviders>
       <GlobalStyles />
diff --git a/superset-frontend/src/views/ScrollToTop.tsx b/superset-frontend/src/views/ScrollToTop.tsx
new file mode 100644
index 0000000000..283f1b4208
--- /dev/null
+++ b/superset-frontend/src/views/ScrollToTop.tsx
@@ -0,0 +1,31 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+import { useEffect } from 'react';
+import { useLocation } from 'react-router-dom';
+
+export const ScrollToTop = () => {
+  const { pathname } = useLocation();
+
+  useEffect(() => {
+    window.scrollTo(0, 0);
+  }, [pathname]);
+
+  return null;
+};


[superset] 29/29: fix: disallow users from viewing other user's profile on config (#21302)


commit 79351a4d92834b2f5692e44d0b66b9c5246df0d0
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Mon Sep 5 13:32:48 2022 +0100

    fix: disallow users from viewing other user's profile on config (#21302)
---
 .../src/views/CRUD/chart/ChartList.tsx             | 22 ++++++++++++++--------
 .../src/views/CRUD/dashboard/DashboardList.tsx     | 10 +++++++++-
 superset/config.py                                 |  1 +
 superset/views/base.py                             |  1 +
 superset/views/core.py                             |  9 +++++++--
 tests/integration_tests/core_tests.py              | 12 ++++++++++++
 6 files changed, 44 insertions(+), 11 deletions(-)

diff --git a/superset-frontend/src/views/CRUD/chart/ChartList.tsx b/superset-frontend/src/views/CRUD/chart/ChartList.tsx
index 637e9b6ea7..1e22985386 100644
--- a/superset-frontend/src/views/CRUD/chart/ChartList.tsx
+++ b/superset-frontend/src/views/CRUD/chart/ChartList.tsx
@@ -60,6 +60,8 @@ import { nativeFilterGate } from 'src/dashboard/components/nativeFilters/utils';
 import setupPlugins from 'src/setup/setupPlugins';
 import InfoTooltip from 'src/components/InfoTooltip';
 import CertifiedBadge from 'src/components/CertifiedBadge';
+import { bootstrapData } from 'src/preamble';
+import Owner from 'src/types/Owner';
 import ChartCard from './ChartCard';
 
 const FlexRowContainer = styled.div`
@@ -209,7 +211,8 @@ function ChartList(props: ChartListProps) {
   const canExport =
     hasPerm('can_export') && isFeatureEnabled(FeatureFlag.VERSIONED_EXPORT);
   const initialSort = [{ id: 'changed_on_delta_humanized', desc: true }];
-
+  const enableBroadUserAccess =
+    bootstrapData?.common?.conf?.ENABLE_BROAD_ACTIVITY_ACCESS;
   const handleBulkChartExport = (chartsToExport: Chart[]) => {
     const ids = chartsToExport.map(({ id }) => id);
     handleResourceExport('chart', ids, () => {
@@ -217,6 +220,10 @@ function ChartList(props: ChartListProps) {
     });
     setPreparingExport(true);
   };
+  const changedByName = (lastSavedBy: Owner) =>
+    lastSavedBy?.first_name
+      ? `${lastSavedBy?.first_name} ${lastSavedBy?.last_name}`
+      : null;
 
   function handleBulkChartDelete(chartsToDelete: Chart[]) {
     SupersetClient.delete({
@@ -321,13 +328,12 @@ function ChartList(props: ChartListProps) {
               changed_by_url: changedByUrl,
             },
           },
-        }: any) => (
-          <a href={changedByUrl}>
-            {lastSavedBy?.first_name
-              ? `${lastSavedBy?.first_name} ${lastSavedBy?.last_name}`
-              : null}
-          </a>
-        ),
+        }: any) =>
+          enableBroadUserAccess ? (
+            <a href={changedByUrl}>{changedByName(lastSavedBy)}</a>
+          ) : (
+            <>{changedByName(lastSavedBy)}</>
+          ),
         Header: t('Modified by'),
         accessor: 'last_saved_by.first_name',
         size: 'xl',
diff --git a/superset-frontend/src/views/CRUD/dashboard/DashboardList.tsx b/superset-frontend/src/views/CRUD/dashboard/DashboardList.tsx
index 7dbb30159d..8569f840d3 100644
--- a/superset-frontend/src/views/CRUD/dashboard/DashboardList.tsx
+++ b/superset-frontend/src/views/CRUD/dashboard/DashboardList.tsx
@@ -49,6 +49,7 @@ import ImportModelsModal from 'src/components/ImportModal/index';
 
 import Dashboard from 'src/dashboard/containers/Dashboard';
 import CertifiedBadge from 'src/components/CertifiedBadge';
+import { bootstrapData } from 'src/preamble';
 import DashboardCard from './DashboardCard';
 import { DashboardStatus } from './types';
 
@@ -132,6 +133,8 @@ function DashboardList(props: DashboardListProps) {
   const [importingDashboard, showImportModal] = useState<boolean>(false);
   const [passwordFields, setPasswordFields] = useState<string[]>([]);
   const [preparingExport, setPreparingExport] = useState<boolean>(false);
+  const enableBroadUserAccess =
+    bootstrapData?.common?.conf?.ENABLE_BROAD_ACTIVITY_ACCESS;
 
   const openDashboardImportModal = () => {
     showImportModal(true);
@@ -290,7 +293,12 @@ function DashboardList(props: DashboardListProps) {
               changed_by_url: changedByUrl,
             },
           },
-        }: any) => <a href={changedByUrl}>{changedByName}</a>,
+        }: any) =>
+          enableBroadUserAccess ? (
+            <a href={changedByUrl}>{changedByName}</a>
+          ) : (
+            <>{changedByName}</>
+          ),
         Header: t('Modified by'),
         accessor: 'changed_by.first_name',
         size: 'xl',
diff --git a/superset/config.py b/superset/config.py
index bae75fed6e..c5f59e365f 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -1289,6 +1289,7 @@ DATASET_HEALTH_CHECK: Optional[Callable[["SqlaTable"], str]] = None
 MENU_HIDE_USER_INFO = False
 
 # Set to False to only allow viewing own recent activity
+# or to disallow users from viewing other users profile page
 ENABLE_BROAD_ACTIVITY_ACCESS = True
 
 # the advanced data type key should correspond to that set in the column metadata
diff --git a/superset/views/base.py b/superset/views/base.py
index 9460b8d1ae..42ec46a706 100644
--- a/superset/views/base.py
+++ b/superset/views/base.py
@@ -86,6 +86,7 @@ FRONTEND_CONF_KEYS = (
     "SUPERSET_DASHBOARD_PERIODICAL_REFRESH_WARNING_MESSAGE",
     "DISABLE_DATASET_SOURCE_EDIT",
     "ENABLE_JAVASCRIPT_CONTROLS",
+    "ENABLE_BROAD_ACTIVITY_ACCESS",
     "DEFAULT_SQLLAB_LIMIT",
     "DEFAULT_VIZ_TYPE",
     "SQL_MAX_ROW",
diff --git a/superset/views/core.py b/superset/views/core.py
index f65385fc30..fd7539e916 100755
--- a/superset/views/core.py
+++ b/superset/views/core.py
@@ -2693,8 +2693,13 @@ class Superset(BaseSupersetView):  # pylint: disable=too-many-public-methods
         user = (
             db.session.query(ab_models.User).filter_by(username=username).one_or_none()
         )
-        if not user:
-            abort(404, description=f"User: {username} does not exist.")
+        # Prevent returning 404 when user is not found to prevent username scanning
+        user_id = -1 if not user else user.id
+        # Prevent unauthorized access to other user's profiles,
+        # unless configured to do so on with ENABLE_BROAD_ACTIVITY_ACCESS
+        error_obj = self.get_user_activity_access_error(user_id)
+        if error_obj:
+            return error_obj
 
         payload = {
             "user": bootstrap_user_data(user, include_perms=True),
diff --git a/tests/integration_tests/core_tests.py b/tests/integration_tests/core_tests.py
index 58943246c5..f0d7925334 100644
--- a/tests/integration_tests/core_tests.py
+++ b/tests/integration_tests/core_tests.py
@@ -851,6 +851,18 @@ class TestCore(SupersetTestCase):
             data = self.get_json_resp(endpoint)
             self.assertNotIn("message", data)
 
+    def test_user_profile_optional_access(self):
+        self.login(username="gamma")
+        resp = self.client.get(f"/superset/profile/admin/")
+        self.assertEqual(resp.status_code, 200)
+
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = False
+        resp = self.client.get(f"/superset/profile/admin/")
+        self.assertEqual(resp.status_code, 403)
+
+        # Restore config
+        app.config["ENABLE_BROAD_ACTIVITY_ACCESS"] = True
+
     @pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
     def test_user_activity_access(self, username="gamma"):
         self.login(username=username)
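The `core.py` hunk above routes profile views through `get_user_activity_access_error` instead of 404-ing on unknown usernames. A simplified, hypothetical sketch of that kind of flag-gated check (the function name, arguments, and error shape here are illustrative, not Superset's actual API):

```python
def profile_access_error(requesting_user_id, target_user_id, broad_access_enabled):
    """Flag-gated profile access check (illustrative names, not Superset's API).

    Mirrors the patched behavior: viewing another user's profile is only
    allowed while the ENABLE_BROAD_ACTIVITY_ACCESS-style flag is True.
    """
    if requesting_user_id == target_user_id:
        return None  # users may always view their own profile
    if broad_access_enabled:
        return None
    return {"status": 403, "message": "Access to user's profile is restricted"}
```

Note also the anti-enumeration detail in the hunk: an unknown username is mapped to `user_id = -1` and run through the same check, so the response no longer reveals whether a given username exists.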


[superset] 07/29: fix(plugin-chart-echarts): invalid total label location for negative values in stacked bar chart (#21032)


commit 2db82d578c10abc14fde3936102049b3919f5994
Author: JUST.in DO IT <ju...@airbnb.com>
AuthorDate: Thu Aug 11 11:28:18 2022 -0700

    fix(plugin-chart-echarts): invalid total label location for negative values in stacked bar chart (#21032)
    
    (cherry picked from commit a8ba544e609ad3af449239c1fb956bb18c7066c4)
---
 .../plugin-chart-echarts/Timeseries/Stories.tsx    |  35 +++++-
 .../Timeseries/negativeNumData.ts                  | 111 +++++++++++++++++++
 .../src/Timeseries/transformProps.ts               |   2 +
 .../src/Timeseries/transformers.ts                 |   5 +-
 .../plugin-chart-echarts/src/utils/series.ts       |  15 ++-
 .../plugin-chart-echarts/test/utils/series.test.ts | 119 +++++++++++++++++++++
 6 files changed, 284 insertions(+), 3 deletions(-)

diff --git a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/Timeseries/Stories.tsx b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/Timeseries/Stories.tsx
index 3f219ed6e6..b44ba252ea 100644
--- a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/Timeseries/Stories.tsx
+++ b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/Timeseries/Stories.tsx
@@ -25,6 +25,7 @@ import {
   TimeseriesTransformProps,
 } from '@superset-ui/plugin-chart-echarts';
 import data from './data';
+import negativeNumData from './negativeNumData';
 import { withResizableChartDemo } from '../../../../shared/components/ResizableChartDemo';
 
 new EchartsTimeseriesChartPlugin()
@@ -61,7 +62,9 @@ export const Timeseries = ({ width, height }) => {
       chartType="echarts-timeseries"
       width={width}
       height={height}
-      queriesData={[{ data: queryData }]}
+      queriesData={[
+        { data: queryData, colnames: ['__timestamp'], coltypes: [2] },
+      ]}
       formData={{
         contributionMode: undefined,
         forecastEnabled,
@@ -87,3 +90,33 @@ export const Timeseries = ({ width, height }) => {
     />
   );
 };
+
+export const WithNegativeNumbers = ({ width, height }) => (
+  <SuperChart
+    chartType="echarts-timeseries"
+    width={width}
+    height={height}
+    queriesData={[
+      { data: negativeNumData, colnames: ['__timestamp'], coltypes: [2] },
+    ]}
+    formData={{
+      contributionMode: undefined,
+      colorScheme: 'supersetColors',
+      seriesType: select(
+        'Line type',
+        ['line', 'scatter', 'smooth', 'bar', 'start', 'middle', 'end'],
+        'line',
+      ),
+      yAxisFormat: '$,.2f',
+      stack: boolean('Stack', true),
+      showValue: true,
+      showLegend: true,
+      onlyTotal: boolean('Only Total', true),
+      orientation: select(
+        'Orientation',
+        ['vertical', 'horizontal'],
+        'vertical',
+      ),
+    }}
+  />
+);
diff --git a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/Timeseries/negativeNumData.ts b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/Timeseries/negativeNumData.ts
new file mode 100644
index 0000000000..8dc0f7e9b9
--- /dev/null
+++ b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/Timeseries/negativeNumData.ts
@@ -0,0 +1,111 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+export default [
+  {
+    __timestamp: 1619827200000,
+    Boston: -0.88,
+    NewYork: null,
+    Washington: -0.3,
+    JerseyCity: -3.05,
+    Denver: -8.25,
+    SF: -0.13,
+  },
+  {
+    __timestamp: 1622505600000,
+    Boston: -0.81,
+    NewYork: null,
+    Washington: -0.29,
+    JerseyCity: -3.54,
+    Denver: -13.4,
+    SF: -0.12,
+  },
+  {
+    __timestamp: 1625097600000,
+    Boston: 0.91,
+    NewYork: null,
+    Washington: 0.25,
+    JerseyCity: 7.17,
+    Denver: 7.69,
+    SF: 0.05,
+  },
+  {
+    __timestamp: 1627776000000,
+    Boston: -1.05,
+    NewYork: -1.04,
+    Washington: -0.19,
+    JerseyCity: -8.99,
+    Denver: -7.99,
+    SF: -0.01,
+  },
+  {
+    __timestamp: 1630454400000,
+    Boston: -0.92,
+    NewYork: -1.09,
+    Washington: -0.17,
+    JerseyCity: -8.75,
+    Denver: -7.55,
+    SF: -0.01,
+  },
+  {
+    __timestamp: 1633046400000,
+    Boston: 0.79,
+    NewYork: -0.85,
+    Washington: 0.13,
+    JerseyCity: 12.59,
+    Denver: 3.34,
+    SF: -0.05,
+  },
+  {
+    __timestamp: 1635724800000,
+    Boston: 0.72,
+    NewYork: 0.54,
+    Washington: 0.15,
+    JerseyCity: 11.03,
+    Denver: 7.24,
+    SF: -0.14,
+  },
+  {
+    __timestamp: 1638316800000,
+    Boston: 0.61,
+    NewYork: 0.73,
+    Washington: 0.15,
+    JerseyCity: 13.45,
+    Denver: 5.98,
+    SF: -0.22,
+  },
+  {
+    __timestamp: 1640995200000,
+    Boston: 0.51,
+    NewYork: 1.8,
+    Washington: 0.15,
+    JerseyCity: 12.96,
+    Denver: 3.22,
+    SF: -0.02,
+  },
+  {
+    __timestamp: 1643673600000,
+    Boston: -0.47,
+    NewYork: null,
+    Washington: -0.18,
+    JerseyCity: -14.27,
+    Denver: -6.24,
+    SF: -0.04,
+  },
+];
diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
index ca0e079609..d96fe1b9b0 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
@@ -165,6 +165,8 @@ export default function transformProps(
   });
   const showValueIndexes = extractShowValueIndexes(rawSeries, {
     stack,
+    onlyTotal,
+    isHorizontal,
   });
   const seriesContexts = extractForecastSeriesContexts(
     Object.values(rawSeries).map(series => series.name as string),
diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformers.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformers.ts
index 93565de46c..420662647d 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformers.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformers.ts
@@ -233,7 +233,10 @@ export function transformSeries(
         if (!formatter) return numericValue;
         if (!stack || isSelectedLegend) return formatter(numericValue);
         if (!onlyTotal) {
-          if (numericValue >= thresholdValues[dataIndex]) {
+          if (
+            numericValue >=
+            (thresholdValues[dataIndex] || Number.MIN_SAFE_INTEGER)
+          ) {
             return formatter(numericValue);
           }
           return '';
diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts b/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts
index 23710cd6d1..8d93ce3f34 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/utils/series.ts
@@ -77,6 +77,8 @@ export function extractShowValueIndexes(
   series: SeriesOption[],
   opts: {
     stack: StackType;
+    onlyTotal?: boolean;
+    isHorizontal?: boolean;
   },
 ): number[] {
   const showValueIndexes: number[] = [];
@@ -84,9 +86,20 @@ export function extractShowValueIndexes(
     series.forEach((entry, seriesIndex) => {
       const { data = [] } = entry;
       (data as [any, number][]).forEach((datum, dataIndex) => {
-        if (datum[1] !== null) {
+        if (!opts.onlyTotal && datum[opts.isHorizontal ? 0 : 1] !== null) {
           showValueIndexes[dataIndex] = seriesIndex;
         }
+        if (opts.onlyTotal) {
+          if (datum[opts.isHorizontal ? 0 : 1] > 0) {
+            showValueIndexes[dataIndex] = seriesIndex;
+          }
+          if (
+            !showValueIndexes[dataIndex] &&
+            datum[opts.isHorizontal ? 0 : 1] !== null
+          ) {
+            showValueIndexes[dataIndex] = seriesIndex;
+          }
+        }
       });
     });
   }
diff --git a/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts b/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts
index ae3871c821..3f20e0d5f8 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/test/utils/series.test.ts
@@ -25,6 +25,7 @@ import {
   getChartPadding,
   getLegendProps,
   sanitizeHtml,
+  extractShowValueIndexes,
 } from '../../src/utils/series';
 import { LegendOrientation, LegendType } from '../../src/types';
 import { defaultLegendPadding } from '../../src/defaults';
@@ -206,6 +207,124 @@ describe('extractGroupbyLabel', () => {
   });
 });
 
+describe('extractShowValueIndexes', () => {
+  it('should return the latest index for stack', () => {
+    expect(
+      extractShowValueIndexes(
+        [
+          {
+            id: 'abc',
+            name: 'abc',
+            data: [
+              ['2000-01-01', null],
+              ['2000-02-01', 0],
+              ['2000-03-01', 1],
+              ['2000-04-01', 0],
+              ['2000-05-01', null],
+              ['2000-06-01', 0],
+              ['2000-07-01', 2],
+              ['2000-08-01', 3],
+              ['2000-09-01', null],
+              ['2000-10-01', null],
+            ],
+          },
+          {
+            id: 'def',
+            name: 'def',
+            data: [
+              ['2000-01-01', null],
+              ['2000-02-01', 0],
+              ['2000-03-01', null],
+              ['2000-04-01', 0],
+              ['2000-05-01', null],
+              ['2000-06-01', 0],
+              ['2000-07-01', 2],
+              ['2000-08-01', 3],
+              ['2000-09-01', null],
+              ['2000-10-01', 0],
+            ],
+          },
+          {
+            id: 'def',
+            name: 'def',
+            data: [
+              ['2000-01-01', null],
+              ['2000-02-01', null],
+              ['2000-03-01', null],
+              ['2000-04-01', null],
+              ['2000-05-01', null],
+              ['2000-06-01', 3],
+              ['2000-07-01', null],
+              ['2000-08-01', null],
+              ['2000-09-01', null],
+              ['2000-10-01', null],
+            ],
+          },
+        ],
+        { stack: true, onlyTotal: false, isHorizontal: false },
+      ),
+    ).toEqual([undefined, 1, 0, 1, undefined, 2, 1, 1, undefined, 1]);
+  });
+
+  it('should handle the negative numbers for total only', () => {
+    expect(
+      extractShowValueIndexes(
+        [
+          {
+            id: 'abc',
+            name: 'abc',
+            data: [
+              ['2000-01-01', null],
+              ['2000-02-01', 0],
+              ['2000-03-01', -1],
+              ['2000-04-01', 0],
+              ['2000-05-01', null],
+              ['2000-06-01', 0],
+              ['2000-07-01', -2],
+              ['2000-08-01', -3],
+              ['2000-09-01', null],
+              ['2000-10-01', null],
+            ],
+          },
+          {
+            id: 'def',
+            name: 'def',
+            data: [
+              ['2000-01-01', null],
+              ['2000-02-01', 0],
+              ['2000-03-01', null],
+              ['2000-04-01', 0],
+              ['2000-05-01', null],
+              ['2000-06-01', 0],
+              ['2000-07-01', 2],
+              ['2000-08-01', -3],
+              ['2000-09-01', null],
+              ['2000-10-01', 0],
+            ],
+          },
+          {
+            id: 'def',
+            name: 'def',
+            data: [
+              ['2000-01-01', null],
+              ['2000-02-01', 0],
+              ['2000-03-01', null],
+              ['2000-04-01', 1],
+              ['2000-05-01', null],
+              ['2000-06-01', 0],
+              ['2000-07-01', -2],
+              ['2000-08-01', 3],
+              ['2000-09-01', null],
+              ['2000-10-01', 0],
+            ],
+          },
+        ],
+        { stack: true, onlyTotal: true, isHorizontal: false },
+      ),
+    ).toEqual([undefined, 1, 0, 2, undefined, 1, 1, 2, undefined, 1]);
+  });
+});
+
 describe('formatSeriesName', () => {
   const numberFormatter = getNumberFormatter();
   const timeFormatter = getTimeFormatter();
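
The behavior these tests pin down — in a stack, only the topmost series with a non-null value at each x position carries the value label — can be sketched in Python (a minimal sketch of the idea, not the actual TypeScript implementation in src/utils/series.ts):

```python
def extract_show_value_indexes(series, stack=True):
    """For each x position, return the index of the topmost (last) series
    with a non-null value, so only one value label is drawn per stack.
    Returns [] when not stacking, mirroring the tested behavior."""
    if not stack or not series:
        return []
    n = len(series[0]["data"])
    indexes = [None] * n  # None plays the role of `undefined`
    for i, entry in enumerate(series):
        for pos, (_, value) in enumerate(entry["data"]):
            if value is not None:
                indexes[pos] = i  # later series overwrite earlier ones
    return indexes
```

With two series where the second has nulls at some positions, the label falls back to the first series there, matching the expectations above.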


[superset] 20/29: fix(plugin-chart-echarts): show zero value in tooltip (#21296)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to branch 2.0-test
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 150caaf7876a39f9f898e7a03615b586c34d02f6
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Thu Sep 1 19:27:41 2022 +0200

    fix(plugin-chart-echarts): show zero value in tooltip (#21296)
    
    Co-authored-by: Ville Brofeldt <vi...@apple.com>
    (cherry picked from commit 1aeb8fd6b78d5b53501d277f54b46a02f7067163)
---
 .../plugin-chart-echarts/src/utils/forecast.ts     |   5 +-
 .../test/utils/forecast.test.ts                    | 223 +++++++++++++--------
 2 files changed, 137 insertions(+), 91 deletions(-)

diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/utils/forecast.ts b/superset-frontend/plugins/plugin-chart-echarts/src/utils/forecast.ts
index 94e4630bf4..485e9fb896 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/utils/forecast.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/utils/forecast.ts
@@ -16,6 +16,7 @@
  * specific language governing permissions and limitations
  * under the License.
  */
+import { isNumber } from 'lodash';
 import { DataRecord, DTTM_ALIAS, NumberFormatter } from '@superset-ui/core';
 import { OptionName } from 'echarts/types/src/util/types';
 import { TooltipMarker } from 'echarts/types/src/util/format';
@@ -60,7 +61,7 @@ export const extractForecastValuesFromTooltipParams = (
     const { marker, seriesId, value } = param;
     const context = extractForecastSeriesContext(seriesId);
     const numericValue = isHorizontal ? value[0] : value[1];
-    if (numericValue) {
+    if (isNumber(numericValue)) {
       if (!(context.name in values))
         values[context.name] = {
           marker: marker || '',
@@ -94,7 +95,7 @@ export const formatForecastTooltipSeries = ({
 }): string => {
   let row = `${marker}${sanitizeHtml(seriesName)}: `;
   let isObservation = false;
-  if (observation) {
+  if (isNumber(observation)) {
     isObservation = true;
     row += `${formatter(observation)}`;
   }
diff --git a/superset-frontend/plugins/plugin-chart-echarts/test/utils/forecast.test.ts b/superset-frontend/plugins/plugin-chart-echarts/test/utils/forecast.test.ts
index 819b2b85b1..f3d6f60267 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/test/utils/forecast.test.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/test/utils/forecast.test.ts
@@ -154,103 +154,148 @@ describe('rebaseForecastDatum', () => {
   });
 });
 
-describe('extractForecastValuesFromTooltipParams', () => {
-  it('should extract the proper data from tooltip params', () => {
-    expect(
-      extractForecastValuesFromTooltipParams([
-        {
-          marker: '<img>',
-          seriesId: 'abc',
-          value: [new Date(0), 10],
-        },
-        {
-          marker: '<img>',
-          seriesId: 'abc__yhat',
-          value: [new Date(0), 1],
-        },
-        {
-          marker: '<img>',
-          seriesId: 'abc__yhat_lower',
-          value: [new Date(0), 5],
-        },
-        {
-          marker: '<img>',
-          seriesId: 'abc__yhat_upper',
-          value: [new Date(0), 6],
-        },
-        {
-          marker: '<img>',
-          seriesId: 'qwerty',
-          value: [new Date(0), 2],
-        },
-      ]),
-    ).toEqual({
-      abc: {
+test('extractForecastValuesFromTooltipParams should extract the proper data from tooltip params', () => {
+  expect(
+    extractForecastValuesFromTooltipParams([
+      {
         marker: '<img>',
-        observation: 10,
-        forecastTrend: 1,
-        forecastLower: 5,
-        forecastUpper: 6,
+        seriesId: 'abc',
+        value: [new Date(0), 10],
       },
-      qwerty: {
+      {
         marker: '<img>',
-        observation: 2,
+        seriesId: 'abc__yhat',
+        value: [new Date(0), 1],
       },
-    });
-  });
-});
-
-const formatter = getNumberFormatter(NumberFormats.INTEGER);
-
-describe('formatForecastTooltipSeries', () => {
-  it('should generate a proper series tooltip', () => {
-    expect(
-      formatForecastTooltipSeries({
-        seriesName: 'abc',
+      {
         marker: '<img>',
-        observation: 10.1,
-        formatter,
-      }),
-    ).toEqual('<img>abc: 10');
-    expect(
-      formatForecastTooltipSeries({
-        seriesName: 'qwerty',
+        seriesId: 'abc__yhat_lower',
+        value: [new Date(0), 5],
+      },
+      {
         marker: '<img>',
-        observation: 10.1,
-        forecastTrend: 20.1,
-        forecastLower: 5.1,
-        forecastUpper: 7.1,
-        formatter,
-      }),
-    ).toEqual('<img>qwerty: 10, ŷ = 20 (5, 12)');
-    expect(
-      formatForecastTooltipSeries({
-        seriesName: 'qwerty',
+        seriesId: 'abc__yhat_upper',
+        value: [new Date(0), 6],
+      },
+      {
         marker: '<img>',
-        forecastTrend: 20,
-        forecastLower: 5,
-        forecastUpper: 7,
-        formatter,
-      }),
-    ).toEqual('<img>qwerty: ŷ = 20 (5, 12)');
-    expect(
-      formatForecastTooltipSeries({
-        seriesName: 'qwerty',
+        seriesId: 'qwerty',
+        value: [new Date(0), 2],
+      },
+    ]),
+  ).toEqual({
+    abc: {
+      marker: '<img>',
+      observation: 10,
+      forecastTrend: 1,
+      forecastLower: 5,
+      forecastUpper: 6,
+    },
+    qwerty: {
+      marker: '<img>',
+      observation: 2,
+    },
+  });
+});
+
+test('extractForecastValuesFromTooltipParams should extract valid values', () => {
+  expect(
+    extractForecastValuesFromTooltipParams([
+      {
         marker: '<img>',
-        observation: 10.1,
-        forecastLower: 6,
-        forecastUpper: 7,
-        formatter,
-      }),
-    ).toEqual('<img>qwerty: 10 (6, 13)');
-    expect(
-      formatForecastTooltipSeries({
-        seriesName: 'qwerty',
+        seriesId: 'foo',
+        value: [0, 10],
+      },
+      {
         marker: '<img>',
-        forecastLower: 7,
-        forecastUpper: 8,
-        formatter,
-      }),
-    ).toEqual('<img>qwerty: (7, 15)');
+        seriesId: 'bar',
+        value: [100, 0],
+      },
+    ]),
+  ).toEqual({
+    foo: {
+      marker: '<img>',
+      observation: 10,
+    },
+    bar: {
+      marker: '<img>',
+      observation: 0,
+    },
   });
 });
+
+const formatter = getNumberFormatter(NumberFormats.INTEGER);
+
+test('formatForecastTooltipSeries should apply format to value', () => {
+  expect(
+    formatForecastTooltipSeries({
+      seriesName: 'abc',
+      marker: '<img>',
+      observation: 10.1,
+      formatter,
+    }),
+  ).toEqual('<img>abc: 10');
+});
+
+test('formatForecastTooltipSeries should show falsy value', () => {
+  expect(
+    formatForecastTooltipSeries({
+      seriesName: 'abc',
+      marker: '<img>',
+      observation: 0,
+      formatter,
+    }),
+  ).toEqual('<img>abc: 0');
+});
+
+test('formatForecastTooltipSeries should format full forecast', () => {
+  expect(
+    formatForecastTooltipSeries({
+      seriesName: 'qwerty',
+      marker: '<img>',
+      observation: 10.1,
+      forecastTrend: 20.1,
+      forecastLower: 5.1,
+      forecastUpper: 7.1,
+      formatter,
+    }),
+  ).toEqual('<img>qwerty: 10, ŷ = 20 (5, 12)');
+});
+
+test('formatForecastTooltipSeries should format forecast without observation', () => {
+  expect(
+    formatForecastTooltipSeries({
+      seriesName: 'qwerty',
+      marker: '<img>',
+      forecastTrend: 20,
+      forecastLower: 5,
+      forecastUpper: 7,
+      formatter,
+    }),
+  ).toEqual('<img>qwerty: ŷ = 20 (5, 12)');
+});
+
+test('formatForecastTooltipSeries should format forecast without point estimate', () => {
+  expect(
+    formatForecastTooltipSeries({
+      seriesName: 'qwerty',
+      marker: '<img>',
+      observation: 10.1,
+      forecastLower: 6,
+      forecastUpper: 7,
+      formatter,
+    }),
+  ).toEqual('<img>qwerty: 10 (6, 13)');
+});
+
+test('formatForecastTooltipSeries should format forecast with only confidence band', () => {
+  expect(
+    formatForecastTooltipSeries({
+      seriesName: 'qwerty',
+      marker: '<img>',
+      forecastLower: 7,
+      forecastUpper: 8,
+      formatter,
+    }),
+  ).toEqual('<img>qwerty: (7, 15)');
+});
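
The core of the fix is swapping a truthiness test for a numeric-type test so that zero observations still reach the tooltip; the same pattern in Python (an illustrative sketch, not the plugin code):

```python
from numbers import Number

def tooltip_values(params, is_horizontal=False):
    """Collect per-series tooltip values. Testing `isinstance(value, Number)`
    instead of `if value:` is the point of the fix: a bare truthiness test
    silently drops 0, hiding zero values from the tooltip."""
    values = {}
    for param in params:
        value = param["value"][0] if is_horizontal else param["value"][1]
        if isinstance(value, Number):  # `if value:` would drop zeros
            values[param["seriesId"]] = value
    return values
```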


[superset] 26/29: fix: set correct favicon from config for login and FAB list views (#21498)


commit 0da708e3593f4dc3c802411e0a35fce8fd2baaee
Author: Mayur <ma...@gmail.com>
AuthorDate: Mon Sep 19 12:54:01 2022 +0530

    fix: set correct favicon from config for login and FAB list views (#21498)
    
    (cherry picked from commit b29e7e7d9e6f4c1f429eb1220f03640596579e9f)
---
 superset/templates/superset/base.html | 10 +++++++++-
 1 file changed, 9 insertions(+), 1 deletion(-)

diff --git a/superset/templates/superset/base.html b/superset/templates/superset/base.html
index e3c3d35dfe..b226d3aedd 100644
--- a/superset/templates/superset/base.html
+++ b/superset/templates/superset/base.html
@@ -21,7 +21,15 @@
 
 {% block head_css %}
   {{ super() }}
-  <link rel="icon" type="image/png" href="{{ assets_prefix }}/static/assets/images/favicon.png">
+  {% set favicons = appbuilder.app.config['FAVICONS'] %}
+  {% for favicon in favicons %}
+    <link
+      rel="{{favicon.rel if favicon.rel else "icon"}}"
+      type="{{favicon.type if favicon.type else "image/png"}}"
+      {% if favicon.sizes %}sizes={{favicon.sizes}}{% endif %}
+      href="{{ "" if favicon.href.startswith("http") else assets_prefix }}{{favicon.href}}"
+    >
+  {% endfor %}
   {{ css_bundle("theme") }}
 {% endblock %}
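
The template iterates over a FAVICONS list from the app config; a plausible superset_config.py entry (keys taken from what base.html reads — rel, type, sizes, href — with hypothetical values):

```python
# superset_config.py -- hypothetical example values; absolute http(s) hrefs
# skip the assets_prefix, relative ones get it prepended by the template
FAVICONS = [
    {"href": "/static/assets/images/favicon.png"},  # rel/type default to "icon"/"image/png"
    {
        "rel": "apple-touch-icon",
        "type": "image/png",
        "sizes": "180x180",
        "href": "https://example.com/touch-icon.png",
    },
]
```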
 


[superset] 02/29: fix: make max-requests and max-requests-jitter adjustable (#20733)


commit 5efee17defc387eac2ea49b2a492e58953783aec
Author: Multazim Deshmukh <57...@users.noreply.github.com>
AuthorDate: Sun Jul 17 18:29:16 2022 +0530

    fix: make max-requests and max-requests-jitter adjustable (#20733)
    
    Co-authored-by: Multazim Deshmukh <mu...@morningstar.com>
    (cherry picked from commit 883241070f5dd717d188b69dd681af127656283b)
---
 docker/run-server.sh | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/docker/run-server.sh b/docker/run-server.sh
index 064f47b9c2..1136852d02 100644
--- a/docker/run-server.sh
+++ b/docker/run-server.sh
@@ -28,6 +28,8 @@ gunicorn \
     --threads ${SERVER_THREADS_AMOUNT:-20} \
     --timeout ${GUNICORN_TIMEOUT:-60} \
     --keep-alive ${GUNICORN_KEEPALIVE:-2} \
+    --max-requests ${WORKER_MAX_REQUESTS:0} \
+    --max-requests-jitter ${WORKER_MAX_REQUESTS_JITTER:0} \
     --limit-request-line ${SERVER_LIMIT_REQUEST_LINE:-0} \
     --limit-request-field_size ${SERVER_LIMIT_REQUEST_FIELD_SIZE:-0} \
     "${FLASK_APP}"


[superset] 23/29: fix(plugin-chart-echarts): missing value format in mixed timeseries (#21044)


commit 5ff8eb1a4b80f4bb17882ae65ea5e628fa822dce
Author: JUST.in DO IT <ju...@airbnb.com>
AuthorDate: Thu Aug 11 12:33:59 2022 -0700

    fix(plugin-chart-echarts): missing value format in mixed timeseries (#21044)
---
 .../MixedTimeseries/Stories.tsx                    | 64 +++++++++++++++++++++-
 .../MixedTimeseries/negativeData.ts                | 45 +++++++++++++++
 .../src/MixedTimeseries/transformProps.ts          | 34 +++++++++++-
 3 files changed, 141 insertions(+), 2 deletions(-)

diff --git a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/MixedTimeseries/Stories.tsx b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/MixedTimeseries/Stories.tsx
index a6a13e4e56..1082ac58ab 100644
--- a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/MixedTimeseries/Stories.tsx
+++ b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/MixedTimeseries/Stories.tsx
@@ -31,6 +31,7 @@ import {
   MixedTimeseriesTransformProps,
 } from '@superset-ui/plugin-chart-echarts';
 import data from '../Timeseries/data';
+import negativeNumData from './negativeData';
 import { withResizableChartDemo } from '../../../../shared/components/ResizableChartDemo';
 
 new EchartsTimeseriesChartPlugin()
@@ -57,6 +58,8 @@ export const Timeseries = ({ width, height }) => {
           Boston: row.Boston,
         }))
         .filter(row => !!row.Boston),
+      colnames: ['__timestamp'],
+      coltypes: [2],
     },
     {
       data: data
@@ -82,8 +85,13 @@ export const Timeseries = ({ width, height }) => {
         logAxis: boolean('Log axis', false),
         xAxisTimeFormat: 'smart_date',
         tooltipTimeFormat: 'smart_date',
-        yAxisFormat: 'SMART_NUMBER',
+        yAxisFormat: select(
+          'y-axis format',
+          ['$,.2f', 'SMART_NUMBER'],
+          '$,.2f',
+        ),
         yAxisTitle: text('Y Axis title', ''),
+        yAxisIndexB: select('yAxisIndexB', [0, 1], 1),
         minorSplitLine: boolean('Query 1: Minor splitline', false),
         seriesType: select(
           'Query 1: Line type',
@@ -105,7 +113,61 @@ export const Timeseries = ({ width, height }) => {
         markerEnabledB: boolean('Query 2: Enable markers', false),
         markerSizeB: number('Query 2: Marker Size', 6),
         opacityB: number('Query 2: Opacity', 0.2),
+        showValue: true,
       }}
     />
   );
 };
+
+export const WithNegativeNumbers = ({ width, height }) => (
+  <SuperChart
+    chartType="mixed-timeseries"
+    width={width}
+    height={height}
+    queriesData={[
+      {
+        data: negativeNumData,
+        colnames: ['__timestamp'],
+        coltypes: [2],
+      },
+      {
+        data: negativeNumData.map(({ __timestamp, Boston }) => ({
+          __timestamp,
+          avgRate: Boston / 100,
+        })),
+      },
+    ]}
+    formData={{
+      contributionMode: undefined,
+      colorScheme: 'supersetColors',
+      seriesType: select(
+        'Line type',
+        ['line', 'scatter', 'smooth', 'bar', 'start', 'middle', 'end'],
+        'line',
+      ),
+      xAxisTimeFormat: 'smart_date',
+      yAxisFormat: select(
+        'y-axis format',
+        {
+          'Original value': '~g',
+          'Smart number': 'SMART_NUMBER',
+          '(12345.432 => $12,345.43)': '$,.2f',
+        },
+        '$,.2f',
+      ),
+      stack: true,
+      showValue: boolean('Query 1: Show Value', true),
+      showValueB: boolean('Query 2: Show Value', false),
+      showLegend: true,
+      markerEnabledB: true,
+      yAxisIndexB: select(
+        'Query 2: Y Axis',
+        {
+          Primary: 0,
+          Secondary: 1,
+        },
+        1,
+      ),
+    }}
+  />
+);
diff --git a/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/MixedTimeseries/negativeData.ts b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/MixedTimeseries/negativeData.ts
new file mode 100644
index 0000000000..ce5cb79d27
--- /dev/null
+++ b/superset-frontend/packages/superset-ui-demo/storybook/stories/plugins/plugin-chart-echarts/MixedTimeseries/negativeData.ts
@@ -0,0 +1,45 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+export default [
+  {
+    __timestamp: 1619827200000,
+    Boston: 10.8812312312,
+    Washington: -45.3089432023,
+    JerseyCity: -23.0509234029834,
+  },
+  {
+    __timestamp: 1622505600000,
+    Boston: 80.81029340234,
+    Washington: -10.299023489023,
+    JerseyCity: 53.54239402349,
+  },
+  {
+    __timestamp: 1625097600000,
+    Boston: 30.9129034924,
+    Washington: 100.25234902349,
+    JerseyCity: 27.17239402394,
+  },
+  {
+    __timestamp: 1627776000000,
+    Boston: 42.6129034924,
+    Washington: 90.23234902349,
+    JerseyCity: -32.23239402394,
+  },
+];
diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts
index 663131dd97..83b00f0dde 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts
@@ -47,6 +47,8 @@ import {
   getAxisType,
   getColtypesMapping,
   getLegendProps,
+  extractDataTotalValues,
+  extractShowValueIndexes,
 } from '../utils/series';
 import { extractAnnotationLabels } from '../utils/annotation';
 import {
@@ -136,6 +138,7 @@ export default function transformProps(
     yAxisTitlePosition,
     sliceId,
     timeGrainSqla,
+    percentageThreshold,
   }: EchartsMixedTimeseriesFormData = { ...DEFAULT_FORM_DATA, ...formData };
 
   const colorScale = CategoricalColorNamespace.getScale(colorScheme as string);
@@ -181,7 +184,28 @@ export default function transformProps(
   rawSeriesB.forEach(seriesOption =>
     mapSeriesIdToAxis(seriesOption, yAxisIndexB),
   );
-
+  const showValueIndexesA = extractShowValueIndexes(rawSeriesA, {
+    stack,
+  });
+  const showValueIndexesB = extractShowValueIndexes(rawSeriesB, {
+    stack,
+  });
+  const { totalStackedValues, thresholdValues } = extractDataTotalValues(
+    rebasedDataA,
+    {
+      stack,
+      percentageThreshold,
+      xAxisCol,
+    },
+  );
+  const {
+    totalStackedValues: totalStackedValuesB,
+    thresholdValues: thresholdValuesB,
+  } = extractDataTotalValues(rebasedDataB, {
+    stack: Boolean(stackB),
+    percentageThreshold,
+    xAxisCol,
+  });
   rawSeriesA.forEach(entry => {
     const transformedSeries = transformSeries(entry, colorScale, {
       area,
@@ -195,6 +219,10 @@ export default function transformProps(
       filterState,
       seriesKey: entry.name,
       sliceId,
+      formatter,
+      showValueIndexes: showValueIndexesA,
+      totalStackedValues,
+      thresholdValues,
     });
     if (transformedSeries) series.push(transformedSeries);
   });
@@ -214,6 +242,10 @@ export default function transformProps(
         ? `${entry.name} (1)`
         : entry.name,
       sliceId,
+      formatter: formatterSecondary,
+      showValueIndexes: showValueIndexesB,
+      totalStackedValues: totalStackedValuesB,
+      thresholdValues: thresholdValuesB,
     });
     if (transformedSeries) series.push(transformedSeries);
   });
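
A rough Python sketch of what extractDataTotalValues supplies here — per-row stacked totals plus a minimum-share threshold used when deciding whether to show a segment's value label (semantics inferred from the call sites above, not copied from series.ts):

```python
def extract_data_total_values(data, x_axis_col="__timestamp",
                              stack=True, percentage_threshold=0):
    """Per x-axis row, sum all numeric series values and derive the
    threshold a single segment must exceed to get a value label."""
    totals, thresholds = [], []
    if not stack:
        return totals, thresholds
    for row in data:
        total = sum(
            v for k, v in row.items()
            if k != x_axis_col and isinstance(v, (int, float))
        )
        totals.append(total)
        thresholds.append(total * percentage_threshold / 100)
    return totals, thresholds
```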


[superset] 08/29: chore(deps): unpin holidays dependency version (#21091)


commit e91222eb65f433f04a57b7567601ae8eaab07d7a
Author: Erik Cederstrand <er...@cederstrand.dk>
AuthorDate: Mon Aug 15 18:51:27 2022 +0200

    chore(deps): unpin holidays dependency version (#21091)
    
    The blocking issue has been fixed upstream
    
    Co-authored-by: Erik Cederstrand <er...@adamatics.com>
    (cherry picked from commit d817a1dc87c1c8444465f6fd6ff66c86bf21cc03)
---
 requirements/base.txt | 4 ++--
 setup.py              | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/requirements/base.txt b/requirements/base.txt
index d7d4d2b80e..a2f72d85bc 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -60,7 +60,7 @@ colorama==0.4.4
     # via
     #   apache-superset
     #   flask-appbuilder
-convertdate==2.3.2
+convertdate==2.4.0
     # via holidays
 cron-descriptor==1.2.24
     # via apache-superset
@@ -126,7 +126,7 @@ gunicorn==20.1.0
     # via apache-superset
 hashids==1.3.1
     # via apache-superset
-holidays==0.10.3
+holidays==0.14.2
     # via apache-superset
 humanize==3.11.0
     # via apache-superset
diff --git a/setup.py b/setup.py
index 26cc0b8a99..c8d72b5263 100644
--- a/setup.py
+++ b/setup.py
@@ -88,7 +88,7 @@ setup(
         "graphlib-backport",
         "gunicorn>=20.1.0",
         "hashids>=1.3.1, <2",
-        "holidays==0.10.3",  # PINNED! https://github.com/dr-prodigy/python-holidays/issues/406
+        "holidays==0.14.2",
         "humanize",
         "isodate",
         "markdown>=3.0",


[superset] 14/29: fix(sqllab): missing zero values while copy-to-clipboard (#21153)


commit 9337dec038cf03ea44ba4521fb16f7e32a672b88
Author: JUST.in DO IT <ju...@airbnb.com>
AuthorDate: Fri Aug 26 14:28:48 2022 -0700

    fix(sqllab): missing zero values while copy-to-clipboard (#21153)
    
    (cherry picked from commit 4e23d62d4f3714808af8b915caa5790900688526)
---
 superset-frontend/src/utils/common.js       |  2 +-
 superset-frontend/src/utils/common.test.jsx | 10 ++++++++++
 2 files changed, 11 insertions(+), 1 deletion(-)

diff --git a/superset-frontend/src/utils/common.js b/superset-frontend/src/utils/common.js
index 603ec7c549..400a7d05e0 100644
--- a/superset-frontend/src/utils/common.js
+++ b/superset-frontend/src/utils/common.js
@@ -97,7 +97,7 @@ export function prepareCopyToClipboardTabularData(data, columns) {
       // JavaScript does not maintain the order of a mixed set of keys (i.e integers and strings)
       // the below function orders the keys based on the column names.
       const key = columns[j].name || columns[j];
-      if (data[i][key]) {
+      if (key in data[i]) {
         row[j] = data[i][key];
       } else {
         row[j] = data[i][parseFloat(key)];
diff --git a/superset-frontend/src/utils/common.test.jsx b/superset-frontend/src/utils/common.test.jsx
index 6c73b1011c..571e493add 100644
--- a/superset-frontend/src/utils/common.test.jsx
+++ b/superset-frontend/src/utils/common.test.jsx
@@ -59,6 +59,16 @@ describe('utils/common', () => {
         'lorem\tipsum\t\ndolor\tsit\tamet\n',
       );
     });
+    it('includes 0 values', () => {
+      const array = [
+        { column1: 0, column2: 0 },
+        { column1: 1, column2: -1, 0: 0 },
+      ];
+      const column = ['column1', 'column2', '0'];
+      expect(prepareCopyToClipboardTabularData(array, column)).toEqual(
+        '0\t0\t\n1\t-1\t0\n',
+      );
+    });
   });
   describe('applyFormattingToTabularData', () => {
     it('does not mutate empty array', () => {
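
The one-line fix replaces a truthiness test with a key-membership test so that zero-valued cells survive copy-to-clipboard; a Python analog of the repaired loop:

```python
def prepare_tabular_rows(data, columns):
    """Build tab-separated rows. Using `col in row` rather than
    `if row[col]:` keeps falsy cell values like 0 and -0.0, which a
    truthiness test would silently blank out."""
    lines = []
    for row in data:
        cells = [str(row[col]) if col in row else "" for col in columns]
        lines.append("\t".join(cells))
    return "\n".join(lines) + "\n"
```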


[superset] 03/29: fix: getting default value in run-server.sh (#20736)


commit 43b8f18a2164054faa86f86ac236243109583389
Author: Yongjie Zhao <yo...@gmail.com>
AuthorDate: Mon Jul 18 23:00:27 2022 +0800

    fix: getting default value in run-server.sh (#20736)
    
    (cherry picked from commit 5990ea639e4f94b54d3109d14b1918a6f9770f14)
---
 docker/run-server.sh | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docker/run-server.sh b/docker/run-server.sh
index 1136852d02..58174d2bb5 100644
--- a/docker/run-server.sh
+++ b/docker/run-server.sh
@@ -28,8 +28,8 @@ gunicorn \
     --threads ${SERVER_THREADS_AMOUNT:-20} \
     --timeout ${GUNICORN_TIMEOUT:-60} \
     --keep-alive ${GUNICORN_KEEPALIVE:-2} \
-    --max-requests ${WORKER_MAX_REQUESTS:0} \
-    --max-requests-jitter ${WORKER_MAX_REQUESTS_JITTER:0} \
+    --max-requests ${WORKER_MAX_REQUESTS:-0} \
+    --max-requests-jitter ${WORKER_MAX_REQUESTS_JITTER:-0} \
     --limit-request-line ${SERVER_LIMIT_REQUEST_LINE:-0} \
     --limit-request-field_size ${SERVER_LIMIT_REQUEST_FIELD_SIZE:-0} \
     "${FLASK_APP}"


[superset] 22/29: fix: database permissions on update and delete (avoid orphaned perms) (#20081)


commit a9c04284c405b81749cbb40304f7f38b1769513e
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Tue Aug 2 18:28:46 2022 +0100

    fix: database permissions on update and delete (avoid orphaned perms) (#20081)
    
    * fix: database permissions on update and delete (avoid orphaned perms)
    
    * fix event transaction
    
    * fix test
    
    * fix lint
    
    * update datasource access permissions
    
    * add tests
    
    * fix import
    
    * fix tests
    
    * update slice and dataset perms also
    
    * fix lint
    
    * fix tests
    
    * fix lint
    
    * fix lint
    
    * add test for edge case, small refactor
    
    * add test for edge case, small refactor
    
    * improve code
    
    * fix lint
---
 .../installing-superset-from-scratch.mdx           |   2 +-
 superset/databases/commands/create.py              |   1 -
 superset/databases/commands/update.py              |  22 +-
 superset/models/core.py                            |   3 +-
 superset/security/manager.py                       | 279 ++++++++++++++++++++-
 tests/integration_tests/security_tests.py          | 180 +++++++++++++
 6 files changed, 481 insertions(+), 6 deletions(-)

diff --git a/docs/docs/installation/installing-superset-from-scratch.mdx b/docs/docs/installation/installing-superset-from-scratch.mdx
index 3a12c9db3a..5efdb3e8f1 100644
--- a/docs/docs/installation/installing-superset-from-scratch.mdx
+++ b/docs/docs/installation/installing-superset-from-scratch.mdx
@@ -64,7 +64,7 @@ We don't recommend using the system installed Python. Instead, first install the
 brew install readline pkg-config libffi openssl mysql postgres
 ```
 
-You should install a recent version of Python (the official docker image uses 3.8.12). We'd recommend using a Python version manager like [pyenv](https://github.com/pyenv/pyenv) (and also [pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv)).
+You should install a recent version of Python (the official docker image uses 3.8.13). We'd recommend using a Python version manager like [pyenv](https://github.com/pyenv/pyenv) (and also [pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv)).
 
 Let's also make sure we have the latest version of `pip` and `setuptools`:
 
diff --git a/superset/databases/commands/create.py b/superset/databases/commands/create.py
index e91ccec45c..e798f115a5 100644
--- a/superset/databases/commands/create.py
+++ b/superset/databases/commands/create.py
@@ -65,7 +65,6 @@ class CreateDatabaseCommand(BaseCommand):
                 security_manager.add_permission_view_menu(
                     "schema_access", security_manager.get_schema_perm(database, schema)
                 )
-            security_manager.add_permission_view_menu("database_access", database.perm)
             db.session.commit()
         except DAOCreateFailedError as ex:
             db.session.rollback()
diff --git a/superset/databases/commands/update.py b/superset/databases/commands/update.py
index 69b6c30e71..30e67b79ca 100644
--- a/superset/databases/commands/update.py
+++ b/superset/databases/commands/update.py
@@ -46,10 +46,13 @@ class UpdateDatabaseCommand(BaseCommand):
 
     def run(self) -> Model:
         self.validate()
+        if not self._model:
+            raise DatabaseNotFoundError()
+        old_database_name = self._model.database_name
+
         try:
             database = DatabaseDAO.update(self._model, self._properties, commit=False)
             database.set_sqlalchemy_uri(database.sqlalchemy_uri)
-            security_manager.add_permission_view_menu("database_access", database.perm)
             # adding a new database we always want to force refresh schema list
             # TODO Improve this simplistic implementation for catching DB conn fails
             try:
@@ -57,7 +60,24 @@ class UpdateDatabaseCommand(BaseCommand):
             except Exception as ex:
                 db.session.rollback()
                 raise DatabaseConnectionFailedError() from ex
+            # Update database schema permissions
+            new_schemas: List[str] = []
             for schema in schemas:
+                old_view_menu_name = security_manager.get_schema_perm(
+                    old_database_name, schema
+                )
+                new_view_menu_name = security_manager.get_schema_perm(
+                    database.database_name, schema
+                )
+                schema_pvm = security_manager.find_permission_view_menu(
+                    "schema_access", old_view_menu_name
+                )
+                # Update the schema permission if the database name changed
+                if schema_pvm and old_database_name != database.database_name:
+                    schema_pvm.view_menu.name = new_view_menu_name
+                else:
+                    new_schemas.append(schema)
+            for schema in new_schemas:
                 security_manager.add_permission_view_menu(
                     "schema_access", security_manager.get_schema_perm(database, schema)
                 )
diff --git a/superset/models/core.py b/superset/models/core.py
index d21ac56dad..997759c8b1 100755
--- a/superset/models/core.py
+++ b/superset/models/core.py
@@ -795,7 +795,8 @@ class Database(
 
 
 sqla.event.listen(Database, "after_insert", security_manager.set_perm)
-sqla.event.listen(Database, "after_update", security_manager.set_perm)
+sqla.event.listen(Database, "after_update", security_manager.database_after_update)
+sqla.event.listen(Database, "after_delete", security_manager.database_after_delete)
 
 
 class Log(Model):  # pylint: disable=too-few-public-methods
diff --git a/superset/security/manager.py b/superset/security/manager.py
index d5455e73fc..699c7472d5 100644
--- a/superset/security/manager.py
+++ b/superset/security/manager.py
@@ -40,9 +40,11 @@ from flask_appbuilder.security.sqla.manager import SecurityManager
 from flask_appbuilder.security.sqla.models import (
     assoc_permissionview_role,
     assoc_user_role,
+    Permission,
     PermissionView,
     Role,
     User,
+    ViewMenu,
 )
 from flask_appbuilder.security.views import (
     PermissionModelView,
@@ -54,7 +56,7 @@ from flask_appbuilder.security.views import (
 from flask_appbuilder.widgets import ListWidget
 from flask_login import AnonymousUserMixin, LoginManager
 from jwt.api_jwt import _jwt_global_obj
-from sqlalchemy import and_, or_
+from sqlalchemy import and_, inspect, or_
 from sqlalchemy.engine.base import Connection
 from sqlalchemy.orm import Session
 from sqlalchemy.orm.mapper import Mapper
@@ -269,6 +271,14 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
 
         return None
 
+    @staticmethod
+    def get_database_perm(database_id: int, database_name: str) -> str:
+        return f"[{database_name}].(id:{database_id})"
+
+    @staticmethod
+    def get_dataset_perm(dataset_id: int, dataset_name: str, database_name: str) -> str:
+        return f"[{database_name}].[{dataset_name}](id:{dataset_id})"
+
     def unpack_database_and_schema(  # pylint: disable=no-self-use
         self, schema_permission: str
     ) -> DatabaseAndSchema:
@@ -927,7 +937,271 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
 
         return pvm.permission.name in {"can_override_role_permissions", "can_approve"}
 
-    def set_perm(  # pylint: disable=unused-argument
+    def database_after_delete(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        target: "Database",
+    ) -> None:
+        self._delete_vm_database_access(
+            mapper, connection, target.id, target.database_name
+        )
+
+    def _delete_vm_database_access(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        database_id: int,
+        database_name: str,
+    ) -> None:
+        view_menu_table = self.viewmenu_model.__table__  # pylint: disable=no-member
+        permission_view_menu_table = (
+            self.permissionview_model.__table__  # pylint: disable=no-member
+        )
+        view_menu_name = self.get_database_perm(database_id, database_name)
+        # Clean database access permission
+        db_pvm = self.find_permission_view_menu("database_access", view_menu_name)
+        if not db_pvm:
+            logger.warning(
+                "Could not find previous database permission %s",
+                view_menu_name,
+            )
+            return
+        connection.execute(
+            permission_view_menu_table.delete().where(
+                permission_view_menu_table.c.id == db_pvm.id
+            )
+        )
+        self.on_permission_after_delete(mapper, connection, db_pvm)
+        connection.execute(
+            view_menu_table.delete().where(view_menu_table.c.id == db_pvm.view_menu_id)
+        )
+
+        # Clean database schema permissions
+        schema_pvms = (
+            self.get_session.query(self.permissionview_model)
+            .join(self.permission_model)
+            .join(self.viewmenu_model)
+            .filter(self.permission_model.name == "schema_access")
+            .filter(self.viewmenu_model.name.like(f"[{database_name}].[%]"))
+            .all()
+        )
+        for schema_pvm in schema_pvms:
+            connection.execute(
+                permission_view_menu_table.delete().where(
+                    permission_view_menu_table.c.id == schema_pvm.id
+                )
+            )
+            self.on_permission_after_delete(mapper, connection, schema_pvm)
+            connection.execute(
+                view_menu_table.delete().where(
+                    view_menu_table.c.id == schema_pvm.view_menu_id
+                )
+            )
+
+    def _update_vm_database_access(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        old_database_name: str,
+        target: "Database",
+    ) -> Optional[ViewMenu]:
+        view_menu_table = self.viewmenu_model.__table__  # pylint: disable=no-member
+        new_database_name = target.database_name
+        old_view_menu_name = self.get_database_perm(target.id, old_database_name)
+        new_view_menu_name = self.get_database_perm(target.id, new_database_name)
+        db_pvm = self.find_permission_view_menu("database_access", old_view_menu_name)
+        if not db_pvm:
+            logger.warning(
+                "Could not find previous database permission %s",
+                old_view_menu_name,
+            )
+            return None
+        new_updated_pvm = self.find_permission_view_menu(
+            "database_access", new_view_menu_name
+        )
+        if new_updated_pvm:
+            logger.info(
+                "New permission [%s] already exists, deleting the previous",
+                new_view_menu_name,
+            )
+            self._delete_vm_database_access(
+                mapper, connection, target.id, old_database_name
+            )
+            return None
+        connection.execute(
+            view_menu_table.update()
+            .where(view_menu_table.c.id == db_pvm.view_menu_id)
+            .values(name=new_view_menu_name)
+        )
+        new_db_view_menu = self.find_view_menu(new_view_menu_name)
+
+        self.on_view_menu_after_update(mapper, connection, new_db_view_menu)
+        return new_db_view_menu
+
+    def _update_vm_datasources_access(  # pylint: disable=too-many-locals
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        old_database_name: str,
+        target: "Database",
+    ) -> List[ViewMenu]:
+        """
+        Updates all datasource access permissions when a database name changes
+
+        :param connection: Current connection (called on SQLAlchemy event listener scope)
+        :param old_database_name: The old database name
+        :param target: The Database model instance carrying the new database name
+        :return: A list of changed view menus (permission resource names)
+        """
+        from superset.connectors.sqla.models import (  # pylint: disable=import-outside-toplevel
+            SqlaTable,
+        )
+        from superset.models.slice import (  # pylint: disable=import-outside-toplevel
+            Slice,
+        )
+
+        view_menu_table = self.viewmenu_model.__table__  # pylint: disable=no-member
+        sqlatable_table = SqlaTable.__table__  # pylint: disable=no-member
+        chart_table = Slice.__table__  # pylint: disable=no-member
+        new_database_name = target.database_name
+        datasets = (
+            self.get_session.query(SqlaTable)
+            .filter(SqlaTable.database_id == target.id)
+            .all()
+        )
+        updated_view_menus: List[ViewMenu] = []
+        for dataset in datasets:
+            old_dataset_vm_name = self.get_dataset_perm(
+                dataset.id, dataset.table_name, old_database_name
+            )
+            new_dataset_vm_name = self.get_dataset_perm(
+                dataset.id, dataset.table_name, new_database_name
+            )
+            new_dataset_view_menu = self.find_view_menu(new_dataset_vm_name)
+            if new_dataset_view_menu:
+                continue
+            connection.execute(
+                view_menu_table.update()
+                .where(view_menu_table.c.name == old_dataset_vm_name)
+                .values(name=new_dataset_vm_name)
+            )
+            # Update dataset (SqlaTable perm field)
+            connection.execute(
+                sqlatable_table.update()
+                .where(
+                    sqlatable_table.c.id == dataset.id,
+                    sqlatable_table.c.perm == old_dataset_vm_name,
+                )
+                .values(perm=new_dataset_vm_name)
+            )
+            # Update charts (Slice perm field)
+            connection.execute(
+                chart_table.update()
+                .where(chart_table.c.perm == old_dataset_vm_name)
+                .values(perm=new_dataset_vm_name)
+            )
+            self.on_view_menu_after_update(mapper, connection, new_dataset_view_menu)
+            updated_view_menus.append(self.find_view_menu(new_dataset_view_menu))
+        return updated_view_menus
+
+    def database_after_update(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        target: "Database",
+    ) -> None:
+        # Check if database name has changed
+        state = inspect(target)
+        history = state.get_history("database_name", True)
+        if not history.has_changes() or not history.deleted:
+            return
+
+        old_database_name = history.deleted[0]
+        # update database access permission
+        self._update_vm_database_access(mapper, connection, old_database_name, target)
+        # update datasource access
+        self._update_vm_datasources_access(
+            mapper, connection, old_database_name, target
+        )
+
+    def on_view_menu_after_update(
+        self, mapper: Mapper, connection: Connection, target: ViewMenu
+    ) -> None:
+        """
+        Hook that allows for further custom operations when a new ViewMenu
+        is updated
+
+        Since the update may be performed inside an after_update event, we
+        cannot update ViewMenus using a session, so any SQLAlchemy events
+        hooked to `ViewMenu` will not trigger an after_update.
+
+        :param mapper: The table mapper
+        :param connection: The DB-API connection
+        :param target: The mapped instance being persisted
+        """
+
+    def on_permission_after_delete(
+        self, mapper: Mapper, connection: Connection, target: Permission
+    ) -> None:
+        """
+        Hook that allows for further custom operations when a permission
+        is deleted by sqlalchemy events.
+
+        :param mapper: The table mapper
+        :param connection: The DB-API connection
+        :param target: The mapped instance being persisted
+        """
+
+    def on_permission_after_insert(
+        self, mapper: Mapper, connection: Connection, target: Permission
+    ) -> None:
+        """
+        Hook that allows for further custom operations when a new permission
+        is created by set_perm.
+
+        Since set_perm is executed by SQLAlchemy after_insert events, we cannot
+        create new permissions using a session, so any SQLAlchemy events hooked to
+        `Permission` will not trigger an after_insert.
+
+        :param mapper: The table mapper
+        :param connection: The DB-API connection
+        :param target: The mapped instance being persisted
+        """
+
+    def on_view_menu_after_insert(
+        self, mapper: Mapper, connection: Connection, target: ViewMenu
+    ) -> None:
+        """
+        Hook that allows for further custom operations when a new ViewMenu
+        is created by set_perm.
+
+        Since set_perm is executed by SQLAlchemy after_insert events, we cannot
+        create new view_menu's using a session, so any SQLAlchemy events hooked to
+        `ViewMenu` will not trigger an after_insert.
+
+        :param mapper: The table mapper
+        :param connection: The DB-API connection
+        :param target: The mapped instance being persisted
+        """
+
+    def on_permission_view_after_insert(
+        self, mapper: Mapper, connection: Connection, target: PermissionView
+    ) -> None:
+        """
+        Hook that allows for further custom operations when a new PermissionView
+        is created by set_perm.
+
+        Since set_perm is executed by SQLAlchemy after_insert events, we cannot
+        create new pvms using a session, so any SQLAlchemy events hooked to
+        `PermissionView` will not trigger an after_insert.
+
+        :param mapper: The table mapper
+        :param connection: The DB-API connection
+        :param target: The mapped instance being persisted
+        """
+
+    def set_perm(
         self, mapper: Mapper, connection: Connection, target: "BaseDatasource"
     ) -> None:
         """
@@ -951,6 +1225,7 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
             )
             target.perm = target_get_perm
 
+        # check schema perm for datasets
         if (
             hasattr(target, "schema_perm")
             and target.schema_perm != target.get_schema_perm()
diff --git a/tests/integration_tests/security_tests.py b/tests/integration_tests/security_tests.py
index 9de83f665a..25d946f9e5 100644
--- a/tests/integration_tests/security_tests.py
+++ b/tests/integration_tests/security_tests.py
@@ -332,6 +332,186 @@ class TestRolePermission(SupersetTestCase):
         session.delete(stored_db)
         session.commit()
 
+    def test_after_update_database__perm_database_access(self):
+        session = db.session
+        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
+        session.add(database)
+        session.commit()
+        stored_db = (
+            session.query(Database).filter_by(database_name="tmp_database").one()
+        )
+
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "database_access", stored_db.perm
+            )
+        )
+
+        stored_db.database_name = "tmp_database2"
+        session.commit()
+
+        # Assert that the old permission was updated
+        self.assertIsNone(
+            security_manager.find_permission_view_menu(
+                "database_access", f"[tmp_database].(id:{stored_db.id})"
+            )
+        )
+        # Assert that the db permission was updated
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "database_access", f"[tmp_database2].(id:{stored_db.id})"
+            )
+        )
+        session.delete(stored_db)
+        session.commit()
+
+    def test_after_update_database__perm_database_access_exists(self):
+        session = db.session
+        # Add a bogus existing permission before the change
+
+        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
+        session.add(database)
+        session.commit()
+        stored_db = (
+            session.query(Database).filter_by(database_name="tmp_database").one()
+        )
+        security_manager.add_permission_view_menu(
+            "database_access", f"[tmp_database2].(id:{stored_db.id})"
+        )
+
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "database_access", stored_db.perm
+            )
+        )
+
+        stored_db.database_name = "tmp_database2"
+        session.commit()
+
+        # Assert that the old permission was updated
+        self.assertIsNone(
+            security_manager.find_permission_view_menu(
+                "database_access", f"[tmp_database].(id:{stored_db.id})"
+            )
+        )
+        # Assert that the db permission was updated
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "database_access", f"[tmp_database2].(id:{stored_db.id})"
+            )
+        )
+        session.delete(stored_db)
+        session.commit()
+
+    def test_after_update_database__perm_datasource_access(self):
+        session = db.session
+        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
+        session.add(database)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=database,
+        )
+        session.add(table1)
+        table2 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table2",
+            database=database,
+        )
+        session.add(table2)
+        session.commit()
+        slice1 = Slice(
+            datasource_id=table1.id,
+            datasource_type=DatasourceType.TABLE,
+            datasource_name="tmp_table1",
+            slice_name="tmp_slice1",
+        )
+        session.add(slice1)
+        session.commit()
+        slice1 = session.query(Slice).filter_by(slice_name="tmp_slice1").one()
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        table2 = session.query(SqlaTable).filter_by(table_name="tmp_table2").one()
+
+        # assert initial perms
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "datasource_access", f"[tmp_database].[tmp_table1](id:{table1.id})"
+            )
+        )
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "datasource_access", f"[tmp_database].[tmp_table2](id:{table2.id})"
+            )
+        )
+        self.assertEqual(slice1.perm, f"[tmp_database].[tmp_table1](id:{table1.id})")
+        self.assertEqual(table1.perm, f"[tmp_database].[tmp_table1](id:{table1.id})")
+        self.assertEqual(table2.perm, f"[tmp_database].[tmp_table2](id:{table2.id})")
+
+        stored_db = (
+            session.query(Database).filter_by(database_name="tmp_database").one()
+        )
+        stored_db.database_name = "tmp_database2"
+        session.commit()
+
+        # Assert that the old permissions were updated
+        self.assertIsNone(
+            security_manager.find_permission_view_menu(
+                "datasource_access", f"[tmp_database].[tmp_table1](id:{table1.id})"
+            )
+        )
+        self.assertIsNone(
+            security_manager.find_permission_view_menu(
+                "datasource_access", f"[tmp_database].[tmp_table2](id:{table2.id})"
+            )
+        )
+
+        # Assert that the db permission was updated
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "datasource_access", f"[tmp_database2].[tmp_table1](id:{table1.id})"
+            )
+        )
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "datasource_access", f"[tmp_database2].[tmp_table2](id:{table2.id})"
+            )
+        )
+        self.assertEqual(slice1.perm, f"[tmp_database2].[tmp_table1](id:{table1.id})")
+        self.assertEqual(table1.perm, f"[tmp_database2].[tmp_table1](id:{table1.id})")
+        self.assertEqual(table2.perm, f"[tmp_database2].[tmp_table2](id:{table2.id})")
+
+        session.delete(slice1)
+        session.delete(table1)
+        session.delete(table2)
+        session.delete(stored_db)
+        session.commit()
+
+    def test_after_delete_database__perm_database_access(self):
+        session = db.session
+        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
+        session.add(database)
+        session.commit()
+        stored_db = (
+            session.query(Database).filter_by(database_name="tmp_database").one()
+        )
+
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "database_access", stored_db.perm
+            )
+        )
+        session.delete(stored_db)
+        session.commit()
+
+        # Assert that the old permission was deleted
+        self.assertIsNone(
+            security_manager.find_permission_view_menu(
+                "database_access", f"[tmp_database].(id:{stored_db.id})"
+            )
+        )
+
     def test_hybrid_perm_database(self):
         database = Database(database_name="tmp_database3", sqlalchemy_uri="sqlite://")
 


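The commit above rewrites database, schema, and dataset permission view-menu names whenever a database is renamed. A minimal standalone sketch of the name transformation: the two format strings mirror `get_database_perm` / `get_dataset_perm` from the diff, while `rename_database_perms` is a hypothetical helper for illustration only, not Superset API.

```python
# Format strings copied from get_database_perm / get_dataset_perm in the diff.
def get_database_perm(database_id: int, database_name: str) -> str:
    return f"[{database_name}].(id:{database_id})"

def get_dataset_perm(dataset_id: int, dataset_name: str, database_name: str) -> str:
    return f"[{database_name}].[{dataset_name}](id:{dataset_id})"

def rename_database_perms(perms: dict, old_name: str, new_name: str) -> dict:
    """Hypothetical helper: rewrite every view-menu name embedding the old
    database name, leaving unrelated permission names untouched."""
    old_prefix = f"[{old_name}]."
    new_prefix = f"[{new_name}]."
    return {
        (new_prefix + name[len(old_prefix):] if name.startswith(old_prefix) else name): roles
        for name, roles in perms.items()
    }

perms = {
    get_database_perm(1, "tmp_database"): {"Gamma"},
    get_dataset_perm(10, "tmp_table1", "tmp_database"): {"Gamma"},
}
renamed = rename_database_perms(perms, "tmp_database", "tmp_database2")
# renamed now keys on "[tmp_database2].(id:1)" and
# "[tmp_database2].[tmp_table1](id:10)"
```

The actual patch performs the same rewrite with raw `connection.execute` updates because it runs inside SQLAlchemy after_update listeners, where session-based writes are unavailable.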
[superset] 05/29: Big Number Viz: (#20946)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to branch 2.0-test
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 6a5b12ec8c703404c922ce5fd62662c02a6c7460
Author: Antonio Rivero Martinez <38...@users.noreply.github.com>
AuthorDate: Wed Aug 3 12:12:56 2022 -0300

    Big Number Viz: (#20946)
    
    - When the value is zero we still render the percent change and suffix if present
    
    (cherry picked from commit aa53c1031215ece0fec7dd798ab113a3e012d910)
---
 .../src/BigNumber/BigNumberWithTrendline/transformProps.ts          | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/BigNumber/BigNumberWithTrendline/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/BigNumber/BigNumberWithTrendline/transformProps.ts
index faf6271302..07ca77547b 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/BigNumber/BigNumberWithTrendline/transformProps.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/BigNumber/BigNumberWithTrendline/transformProps.ts
@@ -125,8 +125,10 @@ export default function transformProps(
       if (compareIndex < sortedData.length) {
         const compareValue = sortedData[compareIndex][1];
         // compare values must both be non-nulls
-        if (bigNumber !== null && compareValue !== null && compareValue !== 0) {
-          percentChange = (bigNumber - compareValue) / Math.abs(compareValue);
+        if (bigNumber !== null && compareValue !== null) {
+          percentChange = compareValue
+            ? (bigNumber - compareValue) / Math.abs(compareValue)
+            : 0;
           formattedSubheader = `${formatPercentChange(
             percentChange,
           )} ${compareSuffix}`;

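The fix above keeps rendering the percent change and suffix when the comparison value is zero, instead of dropping the subheader. The guard can be sketched in Python (a standalone illustration of the same logic; the plugin itself is TypeScript):

```python
from typing import Optional

def percent_change(big_number: Optional[float],
                   compare_value: Optional[float]) -> Optional[float]:
    # Both values must be non-null; with the fix, a zero comparison value
    # yields a 0% change instead of suppressing the subheader entirely.
    if big_number is None or compare_value is None:
        return None
    return (big_number - compare_value) / abs(compare_value) if compare_value else 0.0

print(percent_change(150.0, 100.0))  # → 0.5
print(percent_change(42.0, 0.0))     # → 0.0 (previously no subheader at all)
```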

[superset] 17/29: feat: adds TLS certificate validation option for SMTP (#21272)


commit 2b760d07757442c10320826af224265f564df146
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Thu Sep 1 10:51:34 2022 +0100

    feat: adds TLS certificate validation option for SMTP (#21272)
    
    (cherry picked from commit 9fd752057eb261b0e5db87636836fd30579ffce6)
---
 docs/docs/installation/alerts-reports.mdx |  1 +
 superset/config.py                        |  4 +++-
 superset/utils/core.py                    | 34 ++++++++++++++++++-------------
 tests/integration_tests/email_tests.py    | 29 +++++++++++++++++++++++++-
 4 files changed, 52 insertions(+), 16 deletions(-)

diff --git a/docs/docs/installation/alerts-reports.mdx b/docs/docs/installation/alerts-reports.mdx
index a7491ad03e..a86f14893e 100644
--- a/docs/docs/installation/alerts-reports.mdx
+++ b/docs/docs/installation/alerts-reports.mdx
@@ -126,6 +126,7 @@ SLACK_API_TOKEN = "xoxb-"
 # Email configuration
 SMTP_HOST = "smtp.sendgrid.net" #change to your host
 SMTP_STARTTLS = True
+SMTP_SSL_SERVER_AUTH = True # If you're using an SMTP server with a valid certificate
 SMTP_SSL = False
 SMTP_USER = "your_user"
 SMTP_PORT = 2525 # your port eg. 587
diff --git a/superset/config.py b/superset/config.py
index 8ab257e511..bae75fed6e 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -956,7 +956,9 @@ SMTP_USER = "superset"
 SMTP_PORT = 25
 SMTP_PASSWORD = "superset"
 SMTP_MAIL_FROM = "superset@superset.com"
-
+# If True creates a default SSL context with ssl.Purpose.CLIENT_AUTH using the
+# default system root CA certificates.
+SMTP_SSL_SERVER_AUTH = False
 ENABLE_CHUNK_ENCODING = False
 
 # Whether to bump the logging level to ERROR on the flask_appbuilder package
diff --git a/superset/utils/core.py b/superset/utils/core.py
index 6c90837959..652014b7de 100644
--- a/superset/utils/core.py
+++ b/superset/utils/core.py
@@ -27,6 +27,7 @@ import platform
 import re
 import signal
 import smtplib
+import ssl
 import tempfile
 import threading
 import traceback
@@ -980,23 +981,28 @@ def send_mime_email(
     smtp_password = config["SMTP_PASSWORD"]
     smtp_starttls = config["SMTP_STARTTLS"]
     smtp_ssl = config["SMTP_SSL"]
+    smpt_ssl_server_auth = config["SMTP_SSL_SERVER_AUTH"]
 
-    if not dryrun:
-        smtp = (
-            smtplib.SMTP_SSL(smtp_host, smtp_port)
-            if smtp_ssl
-            else smtplib.SMTP(smtp_host, smtp_port)
-        )
-        if smtp_starttls:
-            smtp.starttls()
-        if smtp_user and smtp_password:
-            smtp.login(smtp_user, smtp_password)
-        logger.debug("Sent an email to %s", str(e_to))
-        smtp.sendmail(e_from, e_to, mime_msg.as_string())
-        smtp.quit()
-    else:
+    if dryrun:
         logger.info("Dryrun enabled, email notification content is below:")
         logger.info(mime_msg.as_string())
+        return
+
+    # Default ssl context is SERVER_AUTH using the default system
+    # root CA certificates
+    ssl_context = ssl.create_default_context() if smpt_ssl_server_auth else None
+    smtp = (
+        smtplib.SMTP_SSL(smtp_host, smtp_port, context=ssl_context)
+        if smtp_ssl
+        else smtplib.SMTP(smtp_host, smtp_port)
+    )
+    if smtp_starttls:
+        smtp.starttls(context=ssl_context)
+    if smtp_user and smtp_password:
+        smtp.login(smtp_user, smtp_password)
+    logger.debug("Sent an email to %s", str(e_to))
+    smtp.sendmail(e_from, e_to, mime_msg.as_string())
+    smtp.quit()
 
 
 def get_email_address_list(address_string: str) -> List[str]:
diff --git a/tests/integration_tests/email_tests.py b/tests/integration_tests/email_tests.py
index d6c46a08d9..68e4aaf71e 100644
--- a/tests/integration_tests/email_tests.py
+++ b/tests/integration_tests/email_tests.py
@@ -17,6 +17,7 @@
 # under the License.
 """Unit tests for email service in Superset"""
 import logging
+import ssl
 import tempfile
 import unittest
 from email.mime.application import MIMEApplication
@@ -144,9 +145,35 @@ class TestEmailSmtp(SupersetTestCase):
         utils.send_mime_email("from", "to", MIMEMultipart(), app.config, dryrun=False)
         assert not mock_smtp.called
         mock_smtp_ssl.assert_called_with(
-            app.config["SMTP_HOST"], app.config["SMTP_PORT"]
+            app.config["SMTP_HOST"], app.config["SMTP_PORT"], context=None
         )
 
+    @mock.patch("smtplib.SMTP_SSL")
+    @mock.patch("smtplib.SMTP")
+    def test_send_mime_ssl_server_auth(self, mock_smtp, mock_smtp_ssl):
+        app.config["SMTP_SSL"] = True
+        app.config["SMTP_SSL_SERVER_AUTH"] = True
+        mock_smtp.return_value = mock.Mock()
+        mock_smtp_ssl.return_value = mock.Mock()
+        utils.send_mime_email("from", "to", MIMEMultipart(), app.config, dryrun=False)
+        assert not mock_smtp.called
+        mock_smtp_ssl.assert_called_with(
+            app.config["SMTP_HOST"], app.config["SMTP_PORT"], context=mock.ANY
+        )
+        called_context = mock_smtp_ssl.call_args.kwargs["context"]
+        self.assertEqual(called_context.verify_mode, ssl.CERT_REQUIRED)
+
+    @mock.patch("smtplib.SMTP")
+    def test_send_mime_tls_server_auth(self, mock_smtp):
+        app.config["SMTP_STARTTLS"] = True
+        app.config["SMTP_SSL_SERVER_AUTH"] = True
+        mock_smtp.return_value = mock.Mock()
+        mock_smtp.return_value.starttls.return_value = mock.Mock()
+        utils.send_mime_email("from", "to", MIMEMultipart(), app.config, dryrun=False)
+        mock_smtp.return_value.starttls.assert_called_with(context=mock.ANY)
+        called_context = mock_smtp.return_value.starttls.call_args.kwargs["context"]
+        self.assertEqual(called_context.verify_mode, ssl.CERT_REQUIRED)
+
     @mock.patch("smtplib.SMTP_SSL")
     @mock.patch("smtplib.SMTP")
     def test_send_mime_noauth(self, mock_smtp, mock_smtp_ssl):

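The change above passes an `ssl.create_default_context()` into both `SMTP_SSL` and `starttls` when `SMTP_SSL_SERVER_AUTH` is enabled, which is what the new tests assert via `verify_mode`. A minimal sketch of the context selection (`build_smtp_ssl_context` is an illustrative helper, not the config key or a Superset function):

```python
import ssl

def build_smtp_ssl_context(server_auth: bool):
    # Mirrors the diff: when SMTP_SSL_SERVER_AUTH is set, a default context
    # is created, which verifies the server certificate against the system
    # root CAs and checks the hostname; otherwise None is passed through so
    # smtplib performs no certificate validation.
    return ssl.create_default_context() if server_auth else None

ctx = build_smtp_ssl_context(True)
# ctx.verify_mode is ssl.CERT_REQUIRED and ctx.check_hostname is True
```

In the patched `send_mime_email`, this context (or `None`) is handed to `smtplib.SMTP_SSL(host, port, context=...)` or to `smtp.starttls(context=...)` depending on whether `SMTP_SSL` or `SMTP_STARTTLS` is configured.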

[superset] 18/29: feat(embedded): provides filter bar visibility setting on embedded dashboard (#21069) (#21070)


commit 9b8c20e6c0284e988de56687d419a780fd3257e3
Author: Jae Ik, Lee <ji...@gmail.com>
AuthorDate: Thu Sep 1 21:50:02 2022 +0900

    feat(embedded): provides filter bar visibility setting on embedded dashboard (#21069) (#21070)
    
    Co-authored-by: 이재익 [jileeon] <ji...@nexon.co.kr>
    (cherry picked from commit eb805682e2d9b8ff6c4bda446e665d1045afe55f)
---
 superset-embedded-sdk/README.md    |  7 ++++++-
 superset-embedded-sdk/src/const.ts |  4 ++++
 superset-embedded-sdk/src/index.ts | 21 ++++++++++++++++++---
 3 files changed, 28 insertions(+), 4 deletions(-)

diff --git a/superset-embedded-sdk/README.md b/superset-embedded-sdk/README.md
index 93b0aa4c09..7e05d94a6c 100644
--- a/superset-embedded-sdk/README.md
+++ b/superset-embedded-sdk/README.md
@@ -40,7 +40,12 @@ embedDashboard({
   supersetDomain: "https://superset.example.com",
   mountPoint: document.getElementById("my-superset-container"), // any html element that can contain an iframe
   fetchGuestToken: () => fetchGuestTokenFromBackend(),
-  dashboardUiConfig: { hideTitle: true }, // dashboard UI config: hideTitle, hideTab, hideChartControls (optional)
+  dashboardUiConfig: { // dashboard UI config: hideTitle, hideTab, hideChartControls, filters.visible, filters.expanded (optional)
+      hideTitle: true,
+      filters: {
+          expanded: true,
+      }
+  },
 });
 ```
 
diff --git a/superset-embedded-sdk/src/const.ts b/superset-embedded-sdk/src/const.ts
index e887974520..72eba8525d 100644
--- a/superset-embedded-sdk/src/const.ts
+++ b/superset-embedded-sdk/src/const.ts
@@ -18,3 +18,7 @@
  */
 
 export const IFRAME_COMMS_MESSAGE_TYPE = "__embedded_comms__";
+export const DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY: { [index: string]: any } = {
+  visible: "show_filters",
+  expanded: "expand_filters",
+}
diff --git a/superset-embedded-sdk/src/index.ts b/superset-embedded-sdk/src/index.ts
index 32b02641e0..85920950d1 100644
--- a/superset-embedded-sdk/src/index.ts
+++ b/superset-embedded-sdk/src/index.ts
@@ -17,7 +17,10 @@
  * under the License.
  */
 
-import { IFRAME_COMMS_MESSAGE_TYPE } from './const';
+import {
+  DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY,
+  IFRAME_COMMS_MESSAGE_TYPE
+} from './const';
 
 // We can swap this out for the actual switchboard package once it gets published
 import { Switchboard } from '@superset-ui/switchboard';
@@ -34,6 +37,11 @@ export type UiConfigType = {
   hideTitle?: boolean
   hideTab?: boolean
   hideChartControls?: boolean
+  filters?: {
+    [key: string]: boolean | undefined
+    visible?: boolean
+    expanded?: boolean
+  }
 }
 
 export type EmbedDashboardParams = {
@@ -45,7 +53,7 @@ export type EmbedDashboardParams = {
   mountPoint: HTMLElement
   /** A function to fetch a guest token from the Host App's backend server */
   fetchGuestToken: GuestTokenFetchFn
-  /** The dashboard UI config: hideTitle, hideTab, hideChartControls **/
+  /** The dashboard UI config: hideTitle, hideTab, hideChartControls, filters.visible, filters.expanded **/
   dashboardUiConfig?: UiConfigType
   /** Are we in debug mode? */
   debug?: boolean
@@ -99,6 +107,13 @@ export async function embedDashboard({
     return new Promise(resolve => {
       const iframe = document.createElement('iframe');
       const dashboardConfig = dashboardUiConfig ? `?uiConfig=${calculateConfig()}` : ""
+      const filterConfig = dashboardUiConfig?.filters || {}
+      const filterConfigKeys = Object.keys(filterConfig)
+      const filterConfigUrlParams = filterConfigKeys.length > 0
+        ? "&"
+        + filterConfigKeys
+          .map(key => DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY[key] + '=' + filterConfig[key]).join('&')
+        : ""
 
       // setup the iframe's sandbox configuration
       iframe.sandbox.add("allow-same-origin"); // needed for postMessage to work
@@ -131,7 +146,7 @@ export async function embedDashboard({
         resolve(new Switchboard({ port: ourPort, name: 'superset-embedded-sdk', debug }));
       });
 
-      iframe.src = `${supersetDomain}/embedded/${id}${dashboardConfig}`;
+      iframe.src = `${supersetDomain}/embedded/${id}${dashboardConfig}${filterConfigUrlParams}`;
       mountPoint.replaceChildren(iframe);
       log('placed the iframe')
     });
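
The change above serializes the optional `filters` config into `show_filters` / `expand_filters` query parameters on the embedded dashboard URL. A minimal standalone sketch of that serialization (illustrative only; names mirror the diff, not the published SDK surface):

```typescript
// Key mapping as added in const.ts: filters.visible -> show_filters,
// filters.expanded -> expand_filters.
const FILTER_PARAM_KEY: { [index: string]: string } = {
  visible: "show_filters",
  expanded: "expand_filters",
};

type FilterConfig = {
  [key: string]: boolean | undefined;
  visible?: boolean;
  expanded?: boolean;
};

// Builds the "&k=v&k=v" suffix appended after the ?uiConfig=... parameter,
// or an empty string when no filter config was supplied.
function filterConfigUrlParams(filters: FilterConfig = {}): string {
  const keys = Object.keys(filters);
  return keys.length > 0
    ? "&" + keys.map(key => `${FILTER_PARAM_KEY[key]}=${filters[key]}`).join("&")
    : "";
}

console.log(filterConfigUrlParams({ visible: false, expanded: true }));
// → "&show_filters=false&expand_filters=true"
```

Note the suffix always starts with `&`: it assumes `?uiConfig=...` is already present on the URL, which holds in the diff because `dashboardUiConfig` must be set for `filters` to exist.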


[superset] 04/29: Temporal X Axis values are not properly displayed if the time column has a custom label defined (#20819)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to branch 2.0-test
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 8c2ca2d8d8dedf22ac164a22c6fd68e50b1699fe
Author: Diego Medina <di...@gmail.com>
AuthorDate: Fri Jul 22 06:05:37 2022 -0300

    Temporal X Axis values are not properly displayed if the time column has a custom label defined (#20819)
    
    (cherry picked from commit 51869f32acd24af183f7c07bb515835bb10a7bc6)
---
 .../plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts      | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
index 89d5c1e03b..ca0e079609 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/transformProps.ts
@@ -170,7 +170,8 @@ export default function transformProps(
     Object.values(rawSeries).map(series => series.name as string),
   );
   const isAreaExpand = stack === AreaChartExtraControlsValue.Expand;
-  const xAxisDataType = dataTypes?.[xAxisCol];
+  const xAxisDataType = dataTypes?.[xAxisCol] ?? dataTypes?.[xAxisOrig];
+
   const xAxisType = getAxisType(xAxisDataType);
   const series: SeriesOption[] = [];
   const formatter = getNumberFormatter(
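
The one-line fix above falls back to the original column name when the lookup keyed by the custom label misses. In miniature (column names here are hypothetical, chosen only to illustrate the nullish-coalescing fallback):

```typescript
// dataTypes may be keyed by the raw time column name rather than the
// custom label shown on the axis, so the first lookup can be undefined.
const dataTypes: Record<string, string | undefined> = {
  __timestamp: "TIMESTAMP",
};
const xAxisCol = "Order Date";   // custom label defined on the time column
const xAxisOrig = "__timestamp"; // underlying time column name

// `??` falls through to the original key only when the labeled key is
// null/undefined, which preserves any legitimate value found first.
const xAxisDataType = dataTypes?.[xAxisCol] ?? dataTypes?.[xAxisOrig];
console.log(xAxisDataType); // → "TIMESTAMP"
```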


[superset] 25/29: fix(sqllab): Fix cursor alignment in SQL lab editor by avoiding Lucida Console font on Windows (#21380)


commit 3fb48dd1e95da466ec66ac6c0fc5651933480d85
Author: MichaelHintz <47...@users.noreply.github.com>
AuthorDate: Tue Sep 13 22:03:21 2022 +0200

    fix(sqllab): Fix cursor alignment in SQL lab editor by avoiding Lucida Console font on Windows (#21380)
    
    (cherry picked from commit 3098e657e5699b60e5c3e10df1249bc3f4ca1729)
---
 superset-frontend/src/SqlLab/main.less | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/superset-frontend/src/SqlLab/main.less b/superset-frontend/src/SqlLab/main.less
index 7822b91d3c..652c7a60b2 100644
--- a/superset-frontend/src/SqlLab/main.less
+++ b/superset-frontend/src/SqlLab/main.less
@@ -370,8 +370,8 @@ div.tablePopover {
   border: 1px solid @gray-light;
   font-feature-settings: @font-feature-settings;
   // Fira Code causes problem with Ace under Firefox
-  font-family: 'Menlo', 'Lucida Console', 'Courier New', 'Ubuntu Mono',
-    'Consolas', 'source-code-pro', monospace;
+  font-family: 'Menlo', 'Consolas', 'Courier New', 'Ubuntu Mono',
+    'source-code-pro', 'Lucida Console', monospace;
 
   &.ace_autocomplete {
     // Use !important because Ace Editor applies extra CSS at the last second


[superset] 11/29: fix(dashboard): Fix scroll behaviour in DashboardBuilderSidepane (#20969)


commit 47c3cd10bda0b140bd8119e9353d9a39ed188ea0
Author: EugeneTorap <ev...@gmail.com>
AuthorDate: Tue Aug 16 20:29:28 2022 +0300

    fix(dashboard): Fix scroll behaviour in DashboardBuilderSidepane (#20969)
    
    (cherry picked from commit 6f3a555e589cd8caee7ef6d5e667531b5e7ac43d)
---
 .../dashboard/components/BuilderComponentPane/index.tsx   | 15 ++-------------
 1 file changed, 2 insertions(+), 13 deletions(-)

diff --git a/superset-frontend/src/dashboard/components/BuilderComponentPane/index.tsx b/superset-frontend/src/dashboard/components/BuilderComponentPane/index.tsx
index 4e47e161e4..7a1019a0e2 100644
--- a/superset-frontend/src/dashboard/components/BuilderComponentPane/index.tsx
+++ b/superset-frontend/src/dashboard/components/BuilderComponentPane/index.tsx
@@ -41,8 +41,7 @@ export interface BCPProps {
 
 const SUPERSET_HEADER_HEIGHT = 59;
 const SIDEPANE_ADJUST_OFFSET = 4;
-const SIDEPANE_HEADER_HEIGHT = 64; // including margins
-const SIDEPANE_FILTERBAR_HEIGHT = 56;
+const TOP_PANEL_OFFSET = 210;
 
 const BuilderComponentPaneTabs = styled(Tabs)`
   line-height: inherit;
@@ -52,20 +51,10 @@ const BuilderComponentPaneTabs = styled(Tabs)`
 const DashboardBuilderSidepane = styled.div<{
   topOffset: number;
 }>`
-  height: 100%;
+  height: calc(100% - ${TOP_PANEL_OFFSET}px);
   position: fixed;
   right: 0;
   top: 0;
-
-  .ReactVirtualized__List {
-    padding-bottom: ${({ topOffset }) =>
-      `${
-        SIDEPANE_HEADER_HEIGHT +
-        SIDEPANE_FILTERBAR_HEIGHT +
-        SIDEPANE_ADJUST_OFFSET +
-        topOffset
-      }px`};
-  }
 `;
 
 const BuilderComponentPane: React.FC<BCPProps> = ({


[superset] 24/29: fix: Add french translation missing (#20061)


commit f106f6e66a30f85bc3220ba16ed2d627169ea2f1
Author: aehanno <10...@users.noreply.github.com>
AuthorDate: Tue Aug 30 16:23:05 2022 -0400

    fix: Add french translation missing (#20061)
    
    (cherry picked from commit 944808a0ce6f094071bff5b3b789e63157a8b8f7)
---
 superset/translations/fr/LC_MESSAGES/messages.json | 193 ++++++++++-
 superset/translations/fr/LC_MESSAGES/messages.po   | 356 ++++++++++++++++++---
 superset/translations/messages.pot                 |  10 +
 3 files changed, 496 insertions(+), 63 deletions(-)

diff --git a/superset/translations/fr/LC_MESSAGES/messages.json b/superset/translations/fr/LC_MESSAGES/messages.json
index 0f97274571..92f92d84b1 100644
--- a/superset/translations/fr/LC_MESSAGES/messages.json
+++ b/superset/translations/fr/LC_MESSAGES/messages.json
@@ -164,6 +164,7 @@
         "Quand vous utilisez 'Grouper par' vous êtes limité à une seule métrique"
       ],
       "Pivot Table": ["Table pivot"],
+      "Pivot Table v2": ["Table pivot v2"],
       "Please choose at least one 'Group by' field ": [
         "Merci de choisir au moins un champ dans 'Grouper par' "
       ],
@@ -437,6 +438,7 @@
       "Add Druid Column": ["Ajouter une colonne Druid"],
       "Edit Druid Column": ["Éditer une colonne Druid"],
       "Column": ["Colonne"],
+      "column": ["colonne"],
       "Type": ["Type"],
       "Datasource": ["Source de données"],
       "Groupable": ["Groupable"],
@@ -506,6 +508,7 @@
       "Fetch Values From": ["Récupérer les valeurs des prédicats"],
       "Changed By": ["Modifié par"],
       "Modified": ["Modifié"],
+      "Edit the dashboard": ["Modifier le tableau de bord"],
       "Refreshed metadata from cluster [{}]": [
         "Métadonnées du cluster [{}] rafraîchies"
       ],
@@ -847,6 +850,11 @@
         "Veuillez corriger une erreur de syntaxe dans la requête près de \"%(syntax_error)s\". Puis essayez de relancer la requête."
       ],
       "Original value": ["Valeur d'origine"],
+      "Duration in ms (66000 => 1m 6s)": ["Durée en ms (66000 => 1m 6s)"],
+      "Duration in ms (1.40008 => 1ms 400µs 80ns)": [
+        "Durée en ms (1.40008 => 1ms 400µs 80ns)"
+      ],
+      "Adaptive formatting": ["Formatage adapté"],
       "Second": ["Seconde"],
       "5 second": ["5 secondes"],
       "30 second": ["30 secondes"],
@@ -947,7 +955,9 @@
       "Annotation Layers": ["Couches d'annotations"],
       "Manage": ["Gestion"],
       "Dashboards": ["Tableaux de bord"],
+      "dashboards": ["tableaux de bord"],
       "Charts": ["Graphiques"],
+      "charts": ["graphiques"],
       "Plugins": ["Plugins"],
       "CSS Templates": ["Templates CSS"],
       "Row Level Security": ["Sécurité de niveau ligne"],
@@ -1286,6 +1296,7 @@
       "Add Annotation Layer": ["Ajouter une couche d'annotation"],
       "Edit Annotation Layer": ["Editer la couche d'annotation"],
       "Name": ["Nom"],
+      "name": ["nom"],
       "Table [%{table}s] could not be found, please double check your database connection, schema, and table name, error: {}": [
         "La table [%{table}s] n'a pu être trouvée, vérifiez à nouveau votre la connexion à votre base de données, le schéma et le nom de la table, error: {}"
       ],
@@ -1835,7 +1846,7 @@
       "Query search string": ["Chaîne de recherche"],
       "[From]-": ["[Depuis]-"],
       "[To]-": ["[à]-"],
-      "Filter by status": ["Filtrer par status"],
+      "Filter by status": ["Filtrer par statut"],
       "Success": ["Succès"],
       "Failed": ["Echec"],
       "Running": ["En cours"],
@@ -1855,7 +1866,7 @@
         "Une erreur s'est produite durant la sauvegarde du jeu de données"
       ],
       "Download to CSV": ["Télécharger en CSV"],
-      "Copy to Clipboard": ["Copier vers le presse-papier"],
+      "Copy to Clipboard": ["Copier vers le presse-papiers"],
       "Filter results": ["Filtrer les résultats"],
       "The number of results displayed is limited to %(rows)d by the configuration DISPLAY_MAX_ROWS. ": [
         "Le nombre de résultats affichés est limité à %(rows)d  par la configuration DISPLAY_MAX_ROWS. "
@@ -1933,6 +1944,9 @@
       "Run a query to display results here": [
         "Lancer la requête pour afficher les résultats ici"
       ],
+      "Run a query to display results": [
+        "Lancer une requête pour afficher les résultats"
+      ],
       "Preview: `%s`": ["Prévisualisation : `%s`"],
       "Results": ["Résultats"],
       "Query history": ["Historiques des requêtes"],
@@ -2242,7 +2256,7 @@
         "Erreur au chargement de ces résultats. Les requêtes s'interrompent au bout de %s secondes."
       ],
       "Timeout error": ["Erreur de timeout"],
-      "Click to favorite/unfavorite": ["Cliquez pour favori ou non"],
+      "Click to favorite/unfavorite": ["Ajouter/Retirer des favoris"],
       "Cell content": ["Contenu de cellule"],
       "The import was successful": ["Importé avec succès"],
       "OVERWRITE": ["ECRASE"],
@@ -2403,7 +2417,9 @@
       "Copy dashboard URL": ["Copier l'URL du tableau de bord"],
       "Share dashboard by email": ["Partager le tableau de bord par e-mail"],
       "Refresh dashboard": ["Rafraichir le tableau de bord"],
-      "Set auto-refresh interval": ["Définir l'interval d'auto-refresh"],
+      "Set auto-refresh interval": [
+        "Définir l'intervalle de rafraichissement automatique"
+      ],
       "Set filter mapping": ["Définir le mappage de filtre"],
       "Edit dashboard properties": [
         "Modifier les propriétés de ce tableau de bord"
@@ -2503,7 +2519,7 @@
       "Filter Sets (${filterSetFilterValues.length})": [
         "Filtres définis (${filterSetFilterValues.length})"
       ],
-      "Select parent filters": ["Selectionnee les filtres parents"],
+      "Select parent filters": ["Sélectionner les filtres parents"],
       "Check configuration": ["Vérifier la configuration"],
       "Cannot load filter": ["Impossible de charger le filtre"],
       "Editing filter set:": ["Modifier l'ensemble de filtre :"],
@@ -2542,7 +2558,7 @@
       "Value is required": ["Une valeur est obligatoire"],
       "Configuration": ["Configuration"],
       "Scoping": ["Portée"],
-      "Select filter": ["Selectionner un filtre"],
+      "Select filter": ["Sélectionner un filtre"],
       "Value": ["Valeur"],
       "Range filter": ["Filtre d'intervalle"],
       "Numerical range": ["Interval numérique"],
@@ -2863,7 +2879,7 @@
       "Configure Advanced Time Range ": [
         "Configurer Intervalle de temps avancé "
       ],
-      "START (INCLUSIVE)": ["DEBUT (INCLUSIVE)"],
+      "START (INCLUSIVE)": ["DÉBUT (INCLUSIVE)"],
       "Start date included in time range": [
         "Date de début incluse de l'intervalle de temps"
       ],
@@ -2878,7 +2894,7 @@
         "Configurer intervalle de temps : Dernier ..."
       ],
       "Configure custom time range": [
-        "Configurer un intervalle de temps personnalisée"
+        "Configurer un intervalle de temps personnalisé"
       ],
       "Relative quantity": ["Quantité relative"],
       "Relative period": ["Période relative"],
@@ -2946,7 +2962,27 @@
       "User must select a value for this filter": [
         "L'utilisateur doit sélectionner une valeur pour ce filtre"
       ],
-      "Filter configuration": ["Configuration du filtre"],
+      "Filter Configuration": ["Configuration du filtre"],
+      "Filter Settings": ["Paramètres du filtre"],
+      "Inverse selection": ["Inverser la sélection"],
+      "Dynamically search all filter values": [
+        "Charge dynamiquement les valeurs du filtre"
+      ],
+      "Filter value is required": ["La valeur du filtre est requise"],
+      "Default value must be set when \"Filter value is required\" is checked": [
+        "La valeur par défaut doit être définie si \"La valeur du filtre est requise\" est sélectionné"
+      ],
+      "User must select a value before applying the filter": [
+        "L'utilisateur doit sélectionner une valeur avant d'appliquer le filtre"
+      ],
+      "When using this option, default value can’t be set": [
+        "Quand cette option est utilisée, une valeur par défaut ne peut pas être définie"
+      ],
+      "Exclude selected values": ["Exclut les valeurs sélectionnées"],
+      "Can select multiple values": ["Peut sélectionner plusieurs valeurs"],
+      "Select first filter value by default": [
+        "Sélectionne la première valeur du filtre par défaut"
+      ],
       "Custom SQL ad-hoc filters are not available for the native Druid connector": [
         "Les filtres ad-hoc pour le SQL personnalisé ne sont par disponibles pour le connecteur Druid natif"
       ],
@@ -2981,7 +3017,6 @@
       "Simple ad-hoc metrics are not enabled for this dataset": [
         "Les métriques ad-hoc simples ne sont pas disponibles pour ce dataset"
       ],
-      "column": ["colonne"],
       "aggregate": ["agrégat"],
       "Custom SQL ad-hoc metrics are not available for the native Druid connector": [
         "Les métriques ad-hoc pour le SQL personnalisé ne sont pas disponibles pour le connecteur Druid natif"
@@ -3140,7 +3175,7 @@
       "An error occurred while fetching created by values: %s": [
         "Une erreur s'est produite en récupérant les valeurs créées : %s"
       ],
-      "Status": ["Status"],
+      "Status": ["Statut"],
       "${AlertState.success}": ["${AlertState.success}"],
       "${AlertState.working}": ["${AlertState.working}"],
       "${AlertState.error}": ["${AlertState.error}"],
@@ -3266,7 +3301,7 @@
         "Etes-vous sûr de vouloir supprimer les couches sélectionnées ?"
       ],
       "Are you sure you want to delete": ["Etes-vous sûr de vouloir supprimer"],
-      "Modified %s": ["%s modifié"],
+      "Modified %s": ["Modifié %s"],
       "The passwords for the databases below are needed in order to import them together with the charts. Please note that the \"Secure Extra\" and \"Certificate\" sections of the database configuration are not present in export files, and should be added manually after the import if they are needed.": [
         "Les mots de passe pour les bases de données ci-dessous sont nécessaires pour les importer en même temps que les graphiques. Notez que les sections \"Securité Supplémentaire\" et  \"Certificat\" de la configuration de la base de données ne sont pas présents dans les fichiers d'export et doivent être ajoutés manuellement après l'import si nécessaire."
       ],
@@ -3276,7 +3311,7 @@
       "There was an issue deleting the selected charts: %s": [
         "Il y a eu un problème lors de la suppression des graphiques sélectionnés : %s"
       ],
-      "Modified by": ["Modifié"],
+      "Modified by": ["Modifié par"],
       "Favorite": ["Favoris"],
       "Any": ["Tous"],
       "Yes": ["Oui"],
@@ -3290,6 +3325,7 @@
         "Une erreur s'est produite durant la récupération du graphique créé par les valeurs : %s"
       ],
       "Viz type": ["Type"],
+      "viz type": ["type de visualisation"],
       "Alphabetical": ["Alphabétique"],
       "Recently modified": ["Dernière modification"],
       "Least recently modified": ["Dernière modification"],
@@ -3615,9 +3651,10 @@
       "Query name": ["Nom de la requête"],
       "[Untitled]": ["[Sans titre]"],
       "Unknown": ["Erreur inconnue"],
-      "Edited": ["Édité"],
-      "Created": ["Créé le"],
+      "Edited": ["Édités"],
+      "Created": ["Créés le"],
       "Viewed": ["Consultés"],
+      "Viewed %s": ["Consulté %s"],
       "Mine": ["Personnel"],
       "Recently viewed charts, dashboards, and saved queries will appear here": [
         "Les graphiques, tableaux de bord et requêtes sauvegardées qui ont été récemment consultés apparaîtront ici"
@@ -3641,7 +3678,7 @@
       "${tableName}": ["${tableName}"],
       "query": ["requête"],
       "Share": ["Partage de requête"],
-      "Ran %s": ["A exécuté %s"],
+      "Ran %s": ["A été exécuté %s"],
       "There was an issue fetching your recent activity: %s": [
         "Une erreur s'est produite lors de la récupération de votre activité récente : %s"
       ],
@@ -3655,6 +3692,8 @@
         "Une erreur s'est produite lors de la récupération de vos requêtes sauvegardées : %s"
       ],
       "Recents": ["Récents"],
+      "recents": ["récents"],
+      "recent": ["date"],
       "Select start and end date": [
         "Selectionner la date de début et la date de fin"
       ],
@@ -3718,7 +3757,127 @@
       "Percentages": ["Pourcentages"],
       "Tabular": ["Tabulaire"],
       "Text": ["Zone de texte"],
-      "Trend": ["Tendance"]
+      "Trend": ["Tendance"],
+      "View All »": ["Tout voir »"],
+      "Add/Edit Filters": ["Ajouter/Modifier les filtres"],
+      "Add filters and dividers": ["Ajouter des filtres et diviseurs"],
+      "Add and edit filters": ["Ajouter et modifier les filtres"],
+      "Last": ["Dernier"],
+      "last day": ["hier"],
+      "last week": ["la semaine dernière"],
+      "last month": ["le mois dernier"],
+      "last quarter": ["le trimestre dernier"],
+      "last year": ["l'année dernière"],
+      "previous calendar week": ["semaine calendaire précédente"],
+      "previous calendar month": ["mois calendaire précédent"],
+      "previous calendar year": ["année calendaire précédente"],
+      "Days %s": ["Jours %s"],
+      "Before": ["Avant"],
+      "After": ["Après"],
+      "Custom": ["Personnalisée"],
+      "Seconds %s": ["Secondes %s"],
+      "Minutes %s": ["Minutes %s"],
+      "Hours %s": ["Heures %s"],
+      "Weeks %s": ["Semaines %s"],
+      "Months %s": ["Mois %s"],
+      "Quarters %s": ["Trimestres %s"],
+      "Years %s": ["Année %s"],
+      "Certified": ["Certifié"],
+      "Relative Date/Time": ["Date/Heure Relative"],
+      "Specific Date/Time": ["Date/Heure Spécifique"],
+      "Now": ["Maintenant"],
+      "Midnight": ["Minuit"],
+      "Apply filters": ["Appliquer les filtres"],
+      "Single Value": ["Valeur Unique"],
+      "Single value": ["Valeur unique"],
+      "Copy permalink to clipboard": ["Copier le lien dans le presse-papiers"],
+      "Share permalink by email": ["Partager le lien par mail"],
+      "Connect database": ["Connexion à la base de données"],
+      "Upload CSV to database": [
+        "Importer des fichiers CSV vers la base de données"
+      ],
+      "Upload columnar file to database": [
+        "Importer un fichier en colonnes vers la base de données"
+      ],
+      "Upload Excel file to database": [
+        "Importer des fichiers Excel vers la base de données"
+      ],
+      "No %(tableName)s yet": ["Il n'y a pas encore de %(tableName)s"],
+      "saved queries": ["requête sauvegardée"],
+      "See all %(tableName)s": ["Explorer - %(tableName)s"],
+      "Sort by %s": ["Trier par %s"],
+      "Filters out of scope (%d)": ["Filtres hors du périmètre (%d)"],
+      "There are no charts added to this dashboard": [
+        "Il n'y a pas de graphiques ajoutés dans ce tableau de bord"
+      ],
+      "Go to the edit mode to configure the dashboard and add charts": [
+        "Allez dans le mode édition pour configurer le tableau de bord et ajouter des graphiques"
+      ],
+      "No filters are currently added": ["Aucun filtre ajouté"],
+      "Drag and drop components and charts to the dashboard": [
+        "Glissez/Déposez des composants et des graphiques sur le tableau de bord"
+      ],
+      "You can create new charts or use existing ones from the panel on the right": [
+        "Vous pouvez créer de nouveaux graphiques ou utiliser ceux existants à partir du panneau de droite"
+      ],
+      "You can create a new chart or use existing ones from the panel on the right": [
+        "Vous pouvez créer un nouveau graphique ou utiliser ceux existants à partir du panneau de droite"
+      ],
+      "Enable 'Allow data upload' in any database's settings": [
+        "Activez l'option 'Autoriser le chargement de données' dans les paramètres de la base de données"
+      ],
+      "There are no components added to this tab": [
+        "Il n'y a pas de composant ajouté dans cet onglet"
+      ],
+      "You can add the components in the edit mode": [
+        "Vous pouvez ajouter les composants via le mode édition"
+      ],
+      "You can add the components in the": [
+        "Vous pouvez ajouter les composants via le"
+      ],
+      "edit mode": ["mode édition"],
+      "Time Column": ["Colonne de temps"],
+      "Time Grain": ["Granularité"],
+      "Pivot Options": ["Options de pivot"],
+      "Aggregation function": ["Fonction d'agrégation"],
+      "Aggregate function to apply when pivoting and computing the total rows and columns": [
+        "Fonction d'agrégation à appliquer lors du pivotement et du calcul du total des lignes et des colonnes"
+      ],
+      "Combine Metrics": ["Combiner les métriques"],
+      "Show totals": ["Afficher les totaux"],
+      "Transpose Pivot": ["Pivot de transposition"],
+      "Update chart": ["Mettre à jour"],
+      "Display total row/column": ["Affiche le total ligne/colonne"],
+      "Swap Groups and Columns": ["Permuter les groupes et les colonnes"],
+      "Samples": ["Exemples"],
+      "Number format": ["Format D3"],
+      "Date format": ["Format Date"],
+      "Values dependent on": ["Valeurs dépendent de"],
+      "No results were returned for this query": [
+        "Aucun résultat avec ces paramètres"
+      ],
+      "Values selected in other filters will affect the filter options to only show relevant values": [
+        "Les valeurs sélectionnées dans d'autres filtres affecteront les options de filtrage afin de n'afficher que les valeurs pertinentes"
+      ],
+      "Values are dependent on other filters": [
+        "Les valeurs dépendent d'autres filtres"
+      ],
+      "Display metrics side by side within each column, as opposed to each column being displayed side by side for each metric.": [
+        "Affichez les indicateurs côte à côte dans chaque colonne, au lieu d'afficher chaque colonne côte à côte pour chaque indicateur."
+      ],
+      "Click the button above to add a filter to the dashboard": [
+        "Cliquez sur le bouton ci-dessus pour ajouter un filtre au tableau de bord"
+      ],
+      "Totals": ["Totaux"],
+      "Filter only displays values relevant to selections made in other filters.": [
+        "Le filtre n'affiche que les valeurs pertinentes après les sélections effectuées dans d'autres filtres."
+      ],
+      "No data after filtering or data is NULL for the latest time record": [
+        "Pas de données après filtrage ou données manquantes pour la période sélectionnée"
+      ],
+      "Scope": ["Périmètre"],
+      "Dependent on": ["Dépend de"],
+      "No matching records found": ["Aucun résultat trouvé"]
     }
   }
 }
diff --git a/superset/translations/fr/LC_MESSAGES/messages.po b/superset/translations/fr/LC_MESSAGES/messages.po
index 0661382664..5472d7459c 100644
--- a/superset/translations/fr/LC_MESSAGES/messages.po
+++ b/superset/translations/fr/LC_MESSAGES/messages.po
@@ -676,7 +676,7 @@ msgstr "Intervalle de temps courant"
 #: superset-frontend/packages/superset-ui-chart-controls/src/utils/D3Formatting.ts:49
 #, fuzzy
 msgid "Adaptive formatting"
-msgstr "Format Datetime"
+msgstr "Formatage adapté"
 
 #: superset-frontend/src/components/ReportModal/index.tsx:267
 #: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FilterTitlePane.tsx:133
@@ -928,7 +928,7 @@ msgstr ""
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:84
 #, fuzzy
 msgid "After"
-msgstr "date"
+msgstr "Après"
 
 #: superset-frontend/packages/superset-ui-chart-controls/src/constants.ts:45
 #, fuzzy
@@ -2035,7 +2035,7 @@ msgstr "Faites attention."
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:76
 #, fuzzy
 msgid "Before"
-msgstr "Forcer à rafraîchir"
+msgstr "Avant"
 
 #: superset-frontend/plugins/legacy-preset-chart-big-number/src/BigNumberTotal/index.ts:38
 #: superset/viz.py:1254
@@ -2500,7 +2500,7 @@ msgstr "Détails de certification"
 #: superset-frontend/src/views/CRUD/dashboard/DashboardList.tsx:504
 #, fuzzy
 msgid "Certified"
-msgstr "Certifié par"
+msgstr "Certifié"
 
 #: superset-frontend/src/components/Datasource/DatasourceEditor.jsx:259
 msgid "Certified By"
@@ -3750,7 +3750,7 @@ msgstr "Actif"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:33
 #, fuzzy
 msgid "Custom"
-msgstr "Personnaliser"
+msgstr "Personnalisée"
 
 #: superset/views/dynamic_plugins.py:59
 msgid "Custom Plugin"
@@ -4241,7 +4241,7 @@ msgstr "Filtre de date"
 #: superset-frontend/plugins/plugin-chart-pivot-table/src/plugin/controlPanel.tsx:216
 #, fuzzy
 msgid "Date format"
-msgstr "Format Datetime"
+msgstr "Format Date"
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/components/CustomFrame.tsx:235
 msgid "Date/Time"
@@ -4270,7 +4270,7 @@ msgstr "Jour"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:66
 #, fuzzy, python-format
 msgid "Days %s"
-msgstr "A exécuté %s"
+msgstr "Jours %s"
 
 #: superset/connectors/sqla/models.py:1593
 msgid "Db engine did not return all queried columns"
@@ -4794,6 +4794,10 @@ msgstr "Distibution - histogramme"
 msgid "Divider"
 msgstr "Diviseur"
 
+#: superset-frontend/src/components/ListView/Filters/Select.tsx:77
+msgid "Filter"
+msgstr "Filtre"
+
 #: superset-frontend/plugins/legacy-preset-chart-nvd3/src/Pie/controlPanel.ts:86
 #: superset-frontend/plugins/plugin-chart-echarts/src/Pie/controlPanel.tsx:211
 msgid "Do you want a donut or a pie?"
@@ -5023,11 +5027,11 @@ msgstr ""
 
 #: superset-frontend/packages/superset-ui-chart-controls/src/utils/D3Formatting.ts:41
 msgid "Duration in ms (1.40008 => 1ms 400µs 80ns)"
-msgstr ""
+msgstr "Durée en ms (1.40008 => 1ms 400µs 80ns)"
 
 #: superset-frontend/packages/superset-ui-chart-controls/src/utils/D3Formatting.ts:40
 msgid "Duration in ms (66000 => 1m 6s)"
-msgstr ""
+msgstr "Durée en ms (66000 => 1m 6s)"
 
 #: superset-frontend/src/views/CRUD/data/query/QueryList.tsx:205
 #, python-format
@@ -5934,7 +5938,7 @@ msgstr "Filtrer par base de données"
 
 #: superset-frontend/src/SqlLab/components/QuerySearch/index.tsx:240
 msgid "Filter by status"
-msgstr "Filtrer par status"
+msgstr "Filtrer par statut"
 
 #: superset-frontend/src/SqlLab/components/QuerySearch/index.tsx:195
 msgid "Filter by user"
@@ -6035,7 +6039,7 @@ msgstr "Configuration et portée des filtres"
 #: superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterControls.tsx:159
 #, python-format
 msgid "Filters out of scope (%d)"
-msgstr ""
+msgstr "Filtres hors du périmètre (%d)"
 
 #: superset/connectors/sqla/views.py:348
 msgid ""
@@ -6340,6 +6344,7 @@ msgid "Histogram"
 msgstr "Histogramme"
 
 #: superset/initialization/__init__.py:222
+#: superset-frontend/src/views/CRUD/welcome/Welcome.tsx:278
 msgid "Home"
 msgstr "Accueil"
 
@@ -6371,7 +6376,7 @@ msgstr "Heure"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:65
 #, fuzzy, python-format
 msgid "Hours %s"
-msgstr "heure"
+msgstr "Heures %s"
 
 #: superset-frontend/src/components/Datasource/DatasourceEditor.jsx:779
 msgid "Hours offset"
@@ -7706,7 +7711,7 @@ msgstr ""
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:93
 msgid "Midnight"
-msgstr ""
+msgstr "Minuit"
 
 #: superset-frontend/plugins/plugin-chart-echarts/src/Gauge/controlPanel.tsx:88
 #: superset-frontend/src/explore/components/controls/BoundsControl.jsx:112
@@ -7783,7 +7788,7 @@ msgstr "Minute"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:64
 #, fuzzy, python-format
 msgid "Minutes %s"
-msgstr "minute"
+msgstr "Minutes %s"
 
 #: superset-frontend/src/views/CRUD/data/database/DatabaseModal/index.tsx:93
 #, fuzzy
@@ -7839,7 +7844,7 @@ msgstr "Mois"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:68
 #, fuzzy, python-format
 msgid "Months %s"
-msgstr "mois"
+msgstr "Mois %s"
 
 #: superset-frontend/src/explore/components/controls/DatasourceControl/index.jsx:238
 msgid "More dataset related options"
@@ -8080,7 +8085,7 @@ msgstr "Non"
 #: superset-frontend/src/views/CRUD/welcome/EmptyState.tsx:74
 #, python-format
 msgid "No %(tableName)s yet"
-msgstr ""
+msgstr "Il n'y a pas encore de %(tableName)s"
 
 #: superset-frontend/src/views/CRUD/alert/AlertList.tsx:376
 #, python-format
@@ -8133,7 +8138,7 @@ msgstr "Pas de données"
 
 #: superset-frontend/plugins/legacy-preset-chart-big-number/src/BigNumber/BigNumber.tsx:219
 msgid "No data after filtering or data is NULL for the latest time record"
-msgstr ""
+msgstr "Pas de données après filtrage ou données manquantes pour la période sélectionnée"
 
 #: superset/dashboards/commands/importers/v0.py:321
 msgid "No data in file"
@@ -8154,6 +8159,7 @@ msgstr "Aucun tableau de bord favori pour le moment, cliquer sur les étoiles !"
 #: superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/index.tsx:313
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:35
 #: superset-frontend/src/explore/controls.jsx:342
+#: superset-frontend/src/explore/components/controls/DateFilterControl/DateFilterLabel.tsx:320
 msgid "No filter"
 msgstr "Pas de filtre"
 
@@ -8279,7 +8285,7 @@ msgstr "Novembre"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:92
 #, fuzzy
 msgid "Now"
-msgstr "Ligne"
+msgstr "Maintenant"
 
 #: superset/datasets/filters.py:26
 msgid "Null or Empty"
@@ -9435,7 +9441,7 @@ msgstr "Trimestre"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:69
 #, fuzzy, python-format
 msgid "Quarters %s"
-msgstr "Trimestre"
+msgstr "Trimestres %s"
 
 #: superset-frontend/plugins/legacy-plugin-chart-calendar/src/controlPanel.ts:32
 #: superset-frontend/plugins/legacy-plugin-chart-chord/src/controlPanel.ts:26
@@ -9591,7 +9597,7 @@ msgstr "Spatial"
 #: superset-frontend/src/views/CRUD/welcome/SavedQueries.tsx:318
 #, python-format
 msgid "Ran %s"
-msgstr "A exécuté %s"
+msgstr "A été exécuté %s"
 
 #: superset-frontend/plugins/legacy-plugin-chart-country-map/src/index.js:35
 #: superset-frontend/plugins/legacy-plugin-chart-histogram/src/index.js:38
@@ -9810,7 +9816,7 @@ msgstr ""
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:91
 #, fuzzy
 msgid "Relative Date/Time"
-msgstr "Quantité relative"
+msgstr "Date/Heure Relative"
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/components/CustomFrame.tsx:157
 #: superset-frontend/src/explore/components/controls/DateFilterControl/components/CustomFrame.tsx:210
@@ -10260,6 +10266,14 @@ msgstr "Exécuter"
 msgid "Run a query to display results here"
 msgstr "Lancer la requête pour afficher les résultats ici"
 
+#: superset-frontend/src/explore/components/DataTablesPane/index.tsx:182
+msgid "Run a query to display results"
+msgstr "Lancer une requête pour afficher les résultats"
+
+#: superset-frontend/src/explore/components/DataTablesPane/index.tsx:181
+msgid "Run a query to display samples"
+msgstr "Lancer une requête pour afficher les exemples"
+
 #: superset-frontend/src/explore/components/ExploreAdditionalActionsMenu/index.jsx:101
 msgid "Run in SQL Lab"
 msgstr "Exécuter dans SQL Lab"
@@ -10677,7 +10691,7 @@ msgstr ""
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:63
 #, fuzzy, python-format
 msgid "Seconds %s"
-msgstr "30 secondes"
+msgstr "%s secondes"
 
 #: superset/views/database/mixins.py:197
 msgid "Secure Extra"
@@ -10701,7 +10715,7 @@ msgstr "Securité et accès"
 #: superset-frontend/src/views/CRUD/welcome/EmptyState.tsx:157
 #, fuzzy, python-format
 msgid "See all %(tableName)s"
-msgstr "Explorer - %(table)s"
+msgstr "Voir tous les %(tableName)s"
 
 #: superset-frontend/src/components/ErrorMessage/ErrorAlert.tsx:155
 msgid "See less"
@@ -11328,16 +11342,16 @@ msgstr "Filtres par métrique"
 #: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/FiltersConfigForm.tsx:1234
 #, fuzzy
 msgid "Single Value"
-msgstr "Valeur droite"
+msgstr "Valeur Unique"
 
 #: superset-frontend/src/filters/components/Range/controlPanel.ts:65
 #, fuzzy
 msgid "Single value"
-msgstr "Valeur droite"
+msgstr "Valeur unique"
 
 #: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/FiltersConfigForm.tsx:1251
 msgid "Single value type"
-msgstr ""
+msgstr "Type de valeur unique"
 
 #: superset-frontend/plugins/plugin-chart-echarts/src/Tree/controlPanel.tsx:261
 msgid "Size of edge symbols"
@@ -11507,7 +11521,22 @@ msgstr "Trier par"
 #: superset-frontend/src/dashboard/components/SliceAdder.jsx:256
 #, fuzzy, python-format
 msgid "Sort by %s"
-msgstr "Trier par"
+msgstr "Trier par %s"
+
+#: superset-frontend/src/dashboard/components/SliceAdder.jsx:66
+#, fuzzy, python-format
+msgid "viz type"
+msgstr "type de visualisation"
+
+#: superset-frontend/src/dashboard/components/SliceAdder.jsx:65
+#, fuzzy, python-format
+msgid "name"
+msgstr "nom"
+
+#: superset-frontend/src/dashboard/components/SliceAdder.jsx:66
+#, fuzzy, python-format
+msgid "recent"
+msgstr "récent"
 
 #: superset-frontend/plugins/legacy-plugin-chart-chord/src/controlPanel.ts:39
 #: superset-frontend/plugins/legacy-plugin-chart-force-directed/src/controlPanel.ts:38
@@ -11588,7 +11617,7 @@ msgstr "Spatial"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:90
 #, fuzzy
 msgid "Specific Date/Time"
-msgstr "Retour au datetime spécifique."
+msgstr "Date/Heure Spécifique"
 
 #: superset/views/database/forms.py:127 superset/views/database/forms.py:286
 #: superset/views/database/forms.py:414
@@ -11698,7 +11727,7 @@ msgstr ""
 #: superset-frontend/src/views/CRUD/dashboard/DashboardList.tsx:302
 #: superset-frontend/src/views/CRUD/dashboard/DashboardList.tsx:492
 msgid "Status"
-msgstr "Status"
+msgstr "Statut"
 
 #: superset-frontend/plugins/plugin-chart-echarts/src/Timeseries/Step/controlPanel.tsx:112
 #, fuzzy
@@ -13735,7 +13764,7 @@ msgstr ""
 
 #: superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx:439
 msgid "Totals"
-msgstr ""
+msgstr "Totaux"
 
 #: superset-frontend/src/SqlLab/components/ResultSet/index.tsx:820
 msgid "Track job"
@@ -14357,7 +14386,7 @@ msgstr ""
 #: superset-frontend/src/views/CRUD/welcome/DashboardTable.tsx:219
 #: superset-frontend/src/views/CRUD/welcome/SavedQueries.tsx:296
 msgid "View All »"
-msgstr ""
+msgstr "Tout voir »"
 
 #: superset-frontend/src/dashboard/components/SliceHeaderControls/index.tsx:287
 msgid "View chart in Explore"
@@ -14395,7 +14424,7 @@ msgstr "Consultés"
 #: superset-frontend/src/views/CRUD/welcome/ActivityTable.tsx:124
 #, fuzzy, python-format
 msgid "Viewed %s"
-msgstr "Consultés"
+msgstr "Consultés %s"
 
 #: superset-frontend/plugins/legacy-plugin-chart-map-box/src/controlPanel.ts:268
 #, fuzzy
@@ -14646,7 +14675,7 @@ msgstr "Semaine terminant le dimanche"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:67
 #, fuzzy, python-format
 msgid "Weeks %s"
-msgstr "semaine"
+msgstr "Semaines %s"
 
 #: superset-frontend/src/components/ErrorMessage/TimeoutErrorMessage.tsx:52
 #, python-format
@@ -15201,7 +15230,7 @@ msgstr "Année"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:70
 #, fuzzy, python-format
 msgid "Years %s"
-msgstr "année"
+msgstr "Années %s"
 
 #: superset-frontend/src/views/CRUD/chart/ChartList.tsx:443
 #: superset-frontend/src/views/CRUD/chart/ChartList.tsx:537
@@ -15624,7 +15653,7 @@ msgstr "graphique"
 #: superset-frontend/src/views/CRUD/welcome/EmptyState.tsx:26
 #, fuzzy
 msgid "charts"
-msgstr "graphique"
+msgstr "graphiques"
 
 #: superset-frontend/src/explore/components/controls/FilterControl/AdhocFilterEditPopoverSqlTabContent/index.jsx:101
 msgid "choose WHERE or HAVING..."
@@ -15666,7 +15695,7 @@ msgstr "tableau de bord"
 #: superset-frontend/src/views/CRUD/welcome/EmptyState.tsx:27
 #, fuzzy
 msgid "dashboards"
-msgstr "tableau de bord"
+msgstr "tableaux de bord"
 
 #: superset-frontend/src/views/CRUD/data/database/DatabaseList.tsx:89
 #: superset-frontend/src/views/CRUD/data/database/DatabaseList.tsx:471
@@ -15879,27 +15908,27 @@ msgstr "Label"
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:39
 #, fuzzy
 msgid "last day"
-msgstr "Samedi"
+msgstr "hier"
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:41
 #, fuzzy
 msgid "last month"
-msgstr "mois"
+msgstr "le mois dernier"
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:42
 #, fuzzy
 msgid "last quarter"
-msgstr "Trimestre"
+msgstr "le trimestre dernier"
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:40
 #, fuzzy
 msgid "last week"
-msgstr "semaine"
+msgstr "la semaine dernière"
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:43
 #, fuzzy
 msgid "last year"
-msgstr "Cluster"
+msgstr "l'année dernière"
 
 #: superset-frontend/src/SqlLab/components/TableElement/index.tsx:120
 msgid "latest partition:"
@@ -15991,15 +16020,15 @@ msgstr ""
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:53
 msgid "previous calendar month"
-msgstr ""
+msgstr "mois calendaire précédent"
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:50
 msgid "previous calendar week"
-msgstr ""
+msgstr "semaine calendaire précédente"
 
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:56
 msgid "previous calendar year"
-msgstr ""
+msgstr "année calendaire précédente"
 
 #: superset-frontend/src/views/CRUD/dashboard/DashboardCard.tsx:153
 msgid "published"
@@ -16020,7 +16049,7 @@ msgstr "reboot"
 #: superset-frontend/src/views/CRUD/welcome/EmptyState.tsx:28
 #, fuzzy
 msgid "recents"
-msgstr "Récents"
+msgstr "récents"
 
 #: superset-frontend/src/explore/components/controls/ConditionalFormattingControl/FormattingPopoverContent.tsx:43
 msgid "red"
@@ -16055,7 +16084,7 @@ msgstr "lignes récupérées"
 #: superset-frontend/src/views/CRUD/welcome/EmptyState.tsx:29
 #, fuzzy
 msgid "saved queries"
-msgstr "Requêtes sauvegardées"
+msgstr "requêtes sauvegardées"
 
 #: superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx:124
 msgid "search.num_records"
@@ -16117,3 +16146,238 @@ msgstr "année"
 #: superset-frontend/src/explore/components/controls/ConditionalFormattingControl/FormattingPopoverContent.tsx:42
 msgid "yellow"
 msgstr "jaune"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterBar/Header/index.tsx:104
+msgid "Add/Edit Filters"
+msgstr "Ajouter/Éditer les filtres"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FilterTitlePane.tsx:109
+msgid "Add and edit filters"
+msgstr "Ajouter et modifier les filtres"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigModal.tsx:500
+msgid "Add filters and dividers"
+msgstr "Ajouter des filtres et des diviseurs"
+
+#: superset-frontend/src/views/components/MenuRight.tsx:105
+msgid "Connect database"
+msgstr "Connecter une base de données"
+
+#: superset-frontend/src/views/components/MenuRight.tsx:105
+msgid "No %s yet"
+msgstr "Il n'y a pas encore de %s"
+
+#: superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.tsx:394
+msgid "There are no charts added to this dashboard"
+msgstr "Il n'y a pas de graphiques ajoutés dans ce tableau de bord"
+
+#: superset-frontend/src/dashboard/components/DashboardBuilder/DashboardBuilder.tsx:398
+msgid "Go to the edit mode to configure the dashboard and add charts"
+msgstr "Passez en mode édition pour configurer le tableau de bord et ajouter des graphiques"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/FiltersConfigForm.tsx:276
+msgid "Filter Settings"
+msgstr "Paramètres des filtres"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/FiltersConfigForm.tsx:276
+msgid "Filter Configuration"
+msgstr "Configuration du filtre"
+
+#: superset-frontend/src/filters/components/Select/controlPanel.ts:104
+msgid "Select first filter value by default"
+msgstr "Sélectionner la première valeur du filtre par défaut"
+
+#: superset-frontend/src/filters/components/GroupBy/controlPanel.ts:59
+#: superset-frontend/src/filters/components/Select/controlPanel.ts:77
+msgid "Can select multiple values"
+msgstr "Peut sélectionner plusieurs valeurs"
+
+#: superset-frontend/src/filters/components/Select/controlPanel.ts:136
+msgid "Dynamically search all filter values"
+msgstr "Recherche dynamiquement toutes les valeurs du filtre"
+
+#: superset-frontend/src/filters/components/GroupBy/controlPanel.ts:72
+#: superset-frontend/src/filters/components/Range/controlPanel.ts:55
+#: superset-frontend/src/filters/components/Select/controlPanel.ts:90
+#: superset-frontend/src/filters/components/Time/controlPanel.ts:53
+#: superset-frontend/src/filters/components/TimeColumn/controlPanel.ts:33
+#: superset-frontend/src/filters/components/TimeGrain/controlPanel.ts:33
+msgid "Filter value is required"
+msgstr "La valeur du filtre est requise"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterBar/index.tsx:388
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterBar/index.tsx:428
+msgid "No filters are currently added"
+msgstr "Aucun filtre ajouté"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterBar/index.tsx:392
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterBar/index.tsx:432
+msgid "Click the button above to add a filter to the dashboard"
+msgstr "Pour ajouter un filtre, cliquez sur le bouton ci-dessus"
+
+#: superset-frontend/src/dashboard/components/DashboardGrid.jsx:155
+msgid "Drag and drop components and charts to the dashboard"
+msgstr "Glissez/Déposez des composants et des graphiques sur le tableau de bord"
+
+#: superset-frontend/src/dashboard/components/DashboardGrid.jsx:155
+msgid "You can create a new chart or use existing ones from the panel on the right"
+msgstr "Vous pouvez créer un nouveau graphique ou utiliser ceux existants à partir du panneau de droite"
+
+#: superset-frontend/src/dashboard/components/DashboardGrid.jsx:156
+msgid "You can create new charts or use existing ones from the panel on the right"
+msgstr "Vous pouvez créer de nouveaux graphiques ou utiliser ceux existants à partir du panneau de droite"
+
+#: superset-frontend/src/views/components/MenuRight.tsx:127
+msgid "Upload Excel file to database"
+msgstr "Importer des fichiers Excel vers la base de données"
+
+#: superset-frontend/src/views/components/MenuRight.tsx:121
+msgid "Upload columnar file to database"
+msgstr "Importer un fichier en colonnes vers la base de données"
+
+#: superset-frontend/src/views/components/MenuRight.tsx:115
+msgid "Upload CSV to database"
+msgstr "Importer des fichiers CSV vers la base de données"
+
+#: superset-frontend/src/dashboard/components/SliceHeaderControls/index.tsx:319
+#: superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/index.jsx:269
+msgid "Share permalink by email"
+msgstr "Partager le lien par mail"
+
+#: superset-frontend/src/dashboard/components/SliceHeaderControls/index.tsx:318
+#: superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/index.jsx:268
+msgid "Copy permalink to clipboard"
+msgstr "Copier le lien dans le presse-papiers"
+
+#: superset-frontend/src/views/components/MenuRight.tsx:222
+#: superset-frontend/src/views/components/SubMenu.tsx:289
+msgid "Enable 'Allow data upload' in any database's settings"
+msgstr "Activez l'option 'Autoriser le chargement de données' dans les paramètres de la base de données"
+
+#: superset-frontend/src/dashboard/components/DashboardGrid.jsx:195
+#: superset-frontend/src/dashboard/components/gridComponents/Tab.jsx:179
+msgid "There are no components added to this tab"
+msgstr "Il n'y a pas de composant ajouté dans cet onglet"
+
+#: superset-frontend/src/dashboard/components/DashboardGrid.jsx:200
+msgid "You can add the components in the edit mode"
+msgstr "Vous pouvez ajouter les composants en mode édition"
+
+#: superset-frontend/src/dashboard/components/gridComponents/Tab.jsx:184
+msgid "You can add the components in the"
+msgstr "Vous pouvez ajouter les composants via le"
+
+#: superset-frontend/src/dashboard/components/gridComponents/Tab.jsx:190
+msgid "edit mode"
+msgstr "mode édition"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterBar/ActionButtons/index.tsx:111
+msgid "Apply filters"
+msgstr "Appliquer les filtres"
+
+#: superset-frontend/src/filters/components/Select/controlPanel.ts:110
+msgid "When using this option, default value can’t be set"
+msgstr "Quand cette option est utilisée, la valeur par défaut ne peut pas être définie"
+
+#: superset-frontend/src/dashboard/components/DashboardGrid.jsx:199
+msgid "Edit the dashboard"
+msgstr "Modifier le tableau de bord"
+
+
+#: superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/dndControls.tsx:161
+msgid "Time Column"
+msgstr "Colonne de temps"
+
+
+#: superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/dndControls.tsx:199
+msgid "Time Grain"
+msgstr "Granularité"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:46
+msgid "Pivot Options"
+msgstr "Options de pivot"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:86
+msgid "Combine Metrics"
+msgstr "Combiner les métriques"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:53
+msgid "Aggregation function"
+msgstr "Fonction d'agrégation"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:101
+msgid "Transpose Pivot"
+msgstr "Transposer le pivot"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:77
+msgid "Show totals"
+msgstr "Afficher les totaux"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:77
+msgid "Swap Groups and Columns"
+msgstr "Permuter les groupes et les colonnes"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:77
+msgid "Display metrics side by side within each column, as opposed to each column being displayed side by side for each metric."
+msgstr "Afficher les métriques côte à côte dans chaque colonne, plutôt que d'afficher chaque colonne côte à côte pour chaque métrique."
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:77
+msgid "Display total row/column"
+msgstr "Afficher la ligne/colonne des totaux"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:77
+msgid "Aggregate function to apply when pivoting and computing the total rows and columns"
+msgstr "Fonction d'agrégation à appliquer lors du pivotement et du calcul du total des lignes et des colonnes"
+
+#: superset-frontend/plugins/legacy-plugin-chart-pivot-table/src/controlPanel.ts:77
+msgid "update chart"
+msgstr "mettre à jour le graphique"
+
+#: superset-frontend/src/explore/components/DataTablesPane/index.tsx:500
+msgid "Samples"
+msgstr "Exemples"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/DependencyList.tsx:193
+msgid "Values are dependent on other filters"
+msgstr "Les valeurs dépendent d'autres filtres"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/DependencyList.tsx:197
+msgid "Values selected in other filters will affect the filter options to only show relevant values"
+msgstr "Les valeurs sélectionnées dans d'autres filtres affecteront les options de filtrage afin de n'afficher que les valeurs pertinentes"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/DependencyList.tsx:197
+msgid "Values dependent on"
+msgstr "Valeurs dépendantes de"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/DependencyList.tsx:197
+msgid "No results were returned for this query"
+msgstr "Aucun résultat n'a été renvoyé pour cette requête"
+
+#: superset-frontend/src/dashboard/components/Header/HeaderActionsDropdown/index.jsx:154
+#: superset-frontend/src/dashboard/components/SliceHeaderControls/index.tsx:164
+msgid "Data refreshed"
+msgstr "Données rafraîchies"
+
+#: superset-frontend/plugins/plugin-chart-table/src/TableChart.tsx:415
+msgid "Shift + Click to sort by multiple columns"
+msgstr "Maintenir Shift + Clic pour trier par plusieurs colonnes"
+
+#: flask_appbuilder/security/views.py:211
+msgid "User info"
+msgstr "Informations utilisateur"
+
+#: flask_appbuilder/security/views.py:250
+msgid "Reset my password"
+msgstr "Réinitialiser mon mot de passe"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterCard/DependenciesRow.tsx:85
+msgid "Filter only displays values relevant to selections made in other filters."
+msgstr "Le filtre n'affiche que les valeurs pertinentes après les sélections effectuées dans d'autres filtres."
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterCard/TypeRow.tsx:68
+msgid "Scope"
+msgstr "Périmètre"
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterCard/DependenciesRow.tsx:86
+msgid "Dependent on"
+msgstr "Dépend de"
diff --git a/superset/translations/messages.pot b/superset/translations/messages.pot
index 2233ec416a..98c91df9b6 100644
--- a/superset/translations/messages.pot
+++ b/superset/translations/messages.pot
@@ -5907,6 +5907,7 @@ msgid "Histogram"
 msgstr ""
 
 #: superset/initialization/__init__.py:222
+#: superset-frontend/src/views/CRUD/welcome/Welcome.tsx:278
 msgid "Home"
 msgstr ""
 
@@ -7605,6 +7606,7 @@ msgstr ""
 #: superset-frontend/packages/superset-ui-chart-controls/src/shared-controls/index.tsx:313
 #: superset-frontend/src/explore/components/controls/DateFilterControl/utils/constants.ts:35
 #: superset-frontend/src/explore/controls.jsx:342
+#: superset-frontend/src/explore/components/controls/DateFilterControl/DateFilterLabel.tsx:320
 msgid "No filter"
 msgstr ""
 
@@ -14919,3 +14921,11 @@ msgstr ""
 #: superset-frontend/src/explore/components/controls/ConditionalFormattingControl/FormattingPopoverContent.tsx:42
 msgid "yellow"
 msgstr ""
+
+#: superset-frontend/src/dashboard/components/nativeFilters/FilterBar/Header/index.tsx:104
+msgid "Add/Edit Filters"
+msgstr ""
+
+#: superset-frontend/src/views/components/MenuRight.tsx:105
+msgid "Connect database"
+msgstr ""

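Several of the corrections above exist to keep the `%s` / `%(tableName)s` placeholders intact in the French `msgstr` entries. A minimal sketch (plain Python, not Superset code — the `translations` dict and `translate` helper are illustrative only) of why that matters: gettext-style lookup happens before %-interpolation, so a placeholder dropped or renamed in `msgstr` fails at runtime.

```python
# Illustrative catalog: msgid -> msgstr, mirroring two entries fixed above.
translations = {
    "Minutes %s": "Minutes %s",
    "No %(tableName)s yet": "Il n'y a pas encore de %(tableName)s",
}

def translate(msgid: str) -> str:
    # Fall back to the msgid itself when no translation exists,
    # which is gettext's default behaviour.
    return translations.get(msgid, msgid)

# Interpolation happens on the *translated* string, so the msgstr must
# carry the exact same placeholders as the msgid; otherwise this raises.
print(translate("Minutes %s") % 30)
print(translate("No %(tableName)s yet") % {"tableName": "dashboards"})
```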

[superset] 28/29: fix: dataset name change and permission change (#21161)


elizabeth pushed a commit to branch 2.0-test
in repository https://gitbox.apache.org/repos/asf/superset.git

commit cb91fe421c499c8304390ce0131889f61e4e59c1
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Wed Aug 31 18:11:03 2022 +0100

    fix: dataset name change and permission change (#21161)
    
    * fix: dataset name change and permission change
---
 superset/connectors/sqla/models.py            |   7 +-
 superset/databases/commands/update.py         |  34 +
 superset/datasets/commands/create.py          |  11 +-
 superset/datasets/commands/delete.py          |  24 -
 superset/models/core.py                       |   2 +-
 superset/security/manager.py                  | 625 +++++++++++++----
 tests/integration_tests/datasets/api_tests.py |  16 +
 tests/integration_tests/security_tests.py     | 948 +++++++++++++++++++++-----
 8 files changed, 1302 insertions(+), 365 deletions(-)

diff --git a/superset/connectors/sqla/models.py b/superset/connectors/sqla/models.py
index f06b7fa0b0..98cef3149f 100644
--- a/superset/connectors/sqla/models.py
+++ b/superset/connectors/sqla/models.py
@@ -2224,11 +2224,11 @@ class SqlaTable(Model, BaseDatasource):  # pylint: disable=too-many-public-metho
 
         For more context: https://github.com/apache/superset/issues/14909
         """
-        security_manager.set_perm(mapper, connection, sqla_table)
+        security_manager.dataset_after_insert(mapper, connection, sqla_table)
         sqla_table.write_shadow_dataset()
 
     @staticmethod
-    def after_delete(  # pylint: disable=unused-argument
+    def after_delete(
         mapper: Mapper,
         connection: Connection,
         sqla_table: "SqlaTable",
@@ -2245,6 +2245,7 @@ class SqlaTable(Model, BaseDatasource):  # pylint: disable=too-many-public-metho
 
         For more context: https://github.com/apache/superset/issues/14909
         """
+        security_manager.dataset_after_delete(mapper, connection, sqla_table)
         session = inspect(sqla_table).session
         dataset = (
             session.query(NewDataset).filter_by(uuid=sqla_table.uuid).one_or_none()
@@ -2271,7 +2272,7 @@ class SqlaTable(Model, BaseDatasource):  # pylint: disable=too-many-public-metho
         For more context: https://github.com/apache/superset/issues/14909
         """
         # set permissions
-        security_manager.set_perm(mapper, connection, sqla_table)
+        security_manager.dataset_after_update(mapper, connection, sqla_table)
 
         inspector = inspect(sqla_table)
         session = inspector.session
diff --git a/superset/databases/commands/update.py b/superset/databases/commands/update.py
index 30e67b79ca..bac606a145 100644
--- a/superset/databases/commands/update.py
+++ b/superset/databases/commands/update.py
@@ -33,6 +33,7 @@ from superset.databases.commands.exceptions import (
 from superset.databases.dao import DatabaseDAO
 from superset.extensions import db, security_manager
 from superset.models.core import Database
+from superset.utils.core import DatasourceType
 
 logger = logging.getLogger(__name__)
 
@@ -60,8 +61,10 @@ class UpdateDatabaseCommand(BaseCommand):
             except Exception as ex:
                 db.session.rollback()
                 raise DatabaseConnectionFailedError() from ex
+
             # Update database schema permissions
             new_schemas: List[str] = []
+
             for schema in schemas:
                 old_view_menu_name = security_manager.get_schema_perm(
                     old_database_name, schema
@@ -75,6 +78,10 @@ class UpdateDatabaseCommand(BaseCommand):
                 # Update the schema permission if the database name changed
                 if schema_pvm and old_database_name != database.database_name:
                     schema_pvm.view_menu.name = new_view_menu_name
+
+                    self._propagate_schema_permissions(
+                        old_view_menu_name, new_view_menu_name
+                    )
                 else:
                     new_schemas.append(schema)
             for schema in new_schemas:
@@ -88,6 +95,33 @@ class UpdateDatabaseCommand(BaseCommand):
             raise DatabaseUpdateFailedError() from ex
         return database
 
+    @staticmethod
+    def _propagate_schema_permissions(
+        old_view_menu_name: str, new_view_menu_name: str
+    ) -> None:
+        from superset.connectors.sqla.models import (  # pylint: disable=import-outside-toplevel
+            SqlaTable,
+        )
+        from superset.models.slice import (  # pylint: disable=import-outside-toplevel
+            Slice,
+        )
+
+        # Update schema_perm on all datasets
+        datasets = (
+            db.session.query(SqlaTable)
+            .filter(SqlaTable.schema_perm == old_view_menu_name)
+            .all()
+        )
+        for dataset in datasets:
+            dataset.schema_perm = new_view_menu_name
+            charts = db.session.query(Slice).filter(
+                Slice.datasource_type == DatasourceType.TABLE,
+                Slice.datasource_id == dataset.id,
+            )
+            # Update schema_perm on all charts
+            for chart in charts:
+                chart.schema_perm = new_view_menu_name
+
     def validate(self) -> None:
         exceptions: List[ValidationError] = []
         # Validate/populate model exists
diff --git a/superset/datasets/commands/create.py b/superset/datasets/commands/create.py
index 4a89b1a818..25852c732b 100644
--- a/superset/datasets/commands/create.py
+++ b/superset/datasets/commands/create.py
@@ -32,7 +32,7 @@ from superset.datasets.commands.exceptions import (
     TableNotFoundValidationError,
 )
 from superset.datasets.dao import DatasetDAO
-from superset.extensions import db, security_manager
+from superset.extensions import db
 
 logger = logging.getLogger(__name__)
 
@@ -49,15 +49,6 @@ class CreateDatasetCommand(CreateMixin, BaseCommand):
             dataset = DatasetDAO.create(self._properties, commit=False)
             # Updates columns and metrics from the dataset
             dataset.fetch_metadata(commit=False)
-            # Add datasource access permission
-            security_manager.add_permission_view_menu(
-                "datasource_access", dataset.get_perm()
-            )
-            # Add schema access permission if exists
-            if dataset.schema:
-                security_manager.add_permission_view_menu(
-                    "schema_access", dataset.schema_perm
-                )
             db.session.commit()
         except (SQLAlchemyError, DAOCreateFailedError) as ex:
             logger.warning(ex, exc_info=True)
diff --git a/superset/datasets/commands/delete.py b/superset/datasets/commands/delete.py
index a9e5a0ab5a..0691b94b2d 100644
--- a/superset/datasets/commands/delete.py
+++ b/superset/datasets/commands/delete.py
@@ -47,30 +47,6 @@ class DeleteDatasetCommand(BaseCommand):
         self.validate()
         try:
             dataset = DatasetDAO.delete(self._model, commit=False)
-
-            view_menu = (
-                security_manager.find_view_menu(self._model.get_perm())
-                if self._model
-                else None
-            )
-
-            if view_menu:
-                permission_views = (
-                    db.session.query(security_manager.permissionview_model)
-                    .filter_by(view_menu=view_menu)
-                    .all()
-                )
-
-                for permission_view in permission_views:
-                    db.session.delete(permission_view)
-                if view_menu:
-                    db.session.delete(view_menu)
-            else:
-                if not view_menu:
-                    logger.error(
-                        "Could not find the data access permission for the dataset",
-                        exc_info=True,
-                    )
             db.session.commit()
         except (SQLAlchemyError, DAODeleteFailedError) as ex:
             logger.exception(ex)
diff --git a/superset/models/core.py b/superset/models/core.py
index 997759c8b1..8d47eef4fe 100755
--- a/superset/models/core.py
+++ b/superset/models/core.py
@@ -794,7 +794,7 @@ class Database(
         return sqla_url.get_dialect()()
 
 
-sqla.event.listen(Database, "after_insert", security_manager.set_perm)
+sqla.event.listen(Database, "after_insert", security_manager.database_after_insert)
 sqla.event.listen(Database, "after_update", security_manager.database_after_update)
 sqla.event.listen(Database, "after_delete", security_manager.database_after_delete)
 
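The `sqla.event.listen` calls above wire the security manager's methods to mapper-level events, so every callback must accept the `(mapper, connection, target)` signature. A self-contained sketch of that mechanism (assumptions: an in-memory SQLite engine and a toy `Database` model, not Superset's actual models):

```python
import sqlalchemy as sa
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()
events = []  # records (event_name, database_name) as listeners fire

class Database(Base):
    __tablename__ = "dbs"
    id = sa.Column(sa.Integer, primary_key=True)
    database_name = sa.Column(sa.String)

# Mapper-level listeners take (mapper, connection, target) -- the same
# signature database_after_insert / database_after_update must match.
def after_insert(mapper, connection, target):
    events.append(("insert", target.database_name))

def after_update(mapper, connection, target):
    events.append(("update", target.database_name))

sa.event.listen(Database, "after_insert", after_insert)
sa.event.listen(Database, "after_update", after_update)

engine = sa.create_engine("sqlite://")
Base.metadata.create_all(engine)
with Session(engine) as session:
    db = Database(database_name="examples")
    session.add(db)
    session.commit()               # flush fires after_insert
    db.database_name = "renamed"
    session.commit()               # flush fires after_update
```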
diff --git a/superset/security/manager.py b/superset/security/manager.py
index 699c7472d5..15383dfa10 100644
--- a/superset/security/manager.py
+++ b/superset/security/manager.py
@@ -78,12 +78,17 @@ from superset.security.guest_token import (
     GuestTokenUser,
     GuestUser,
 )
-from superset.utils.core import DatasourceName, RowLevelSecurityFilterType
+from superset.utils.core import (
+    DatasourceName,
+    DatasourceType,
+    RowLevelSecurityFilterType,
+)
 from superset.utils.urls import get_url_host
 
 if TYPE_CHECKING:
     from superset.common.query_context import QueryContext
     from superset.connectors.base.models import BaseDatasource
+    from superset.connectors.sqla.models import SqlaTable
     from superset.models.core import Database
     from superset.models.dashboard import Dashboard
     from superset.models.sql_lab import Query
@@ -937,16 +942,89 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
 
         return pvm.permission.name in {"can_override_role_permissions", "can_approve"}
 
+    def database_after_insert(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        target: "Database",
+    ) -> None:
+        """
+        Handles permissions when a database is created.
+        Triggered by a SQLAlchemy after_insert event.
+
+        We need to create:
+         - The database PVM
+
+        :param mapper: The SQLA mapper
+        :param connection: The SQLA connection
+        :param target: The changed database object
+        :return:
+        """
+        self._insert_pvm_on_sqla_event(
+            mapper, connection, "database_access", target.get_perm()
+        )
+
     def database_after_delete(
         self,
         mapper: Mapper,
         connection: Connection,
         target: "Database",
     ) -> None:
+        """
+        Handles permissions update when a database is deleted.
+        Triggered by a SQLAlchemy after_delete event.
+
+        We need to delete:
+         - The database PVM
+
+        :param mapper: The SQLA mapper
+        :param connection: The SQLA connection
+        :param target: The changed database object
+        :return:
+        """
         self._delete_vm_database_access(
             mapper, connection, target.id, target.database_name
         )
 
+    def database_after_update(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        target: "Database",
+    ) -> None:
+        """
+        Handles all permissions update when a database is changed.
+        Triggered by a SQLAlchemy after_update event.
+
+        We need to update:
+         - The database PVM
+         - All dataset PVMs that reference the db, and their local perm names
+         - All datasets' local schema perms that reference the db
+         - All charts' local perms related to said datasets
+         - All charts' local schema perms related to said datasets
+
+        :param mapper: The SQLA mapper
+        :param connection: The SQLA connection
+        :param target: The changed database object
+        :return:
+        """
+        # Check if database name has changed
+        state = inspect(target)
+        history = state.get_history("database_name", True)
+        if not history.has_changes() or not history.deleted:
+            return
+
+        old_database_name = history.deleted[0]
+        # update database access permission
+        self._update_vm_database_access(mapper, connection, old_database_name, target)
+        # update datasource access
+        self._update_vm_datasources_access(
+            mapper, connection, old_database_name, target
+        )
+        # Note: schema permissions are updated at the API level
+        # (database.commands.update), since we need to fetch all existing
+        # schemas from the database.
+
     def _delete_vm_database_access(
         self,
         mapper: Mapper,
@@ -954,29 +1032,11 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
         database_id: int,
         database_name: str,
     ) -> None:
-        view_menu_table = self.viewmenu_model.__table__  # pylint: disable=no-member
-        permission_view_menu_table = (
-            self.permissionview_model.__table__  # pylint: disable=no-member
-        )
         view_menu_name = self.get_database_perm(database_id, database_name)
         # Clean database access permission
-        db_pvm = self.find_permission_view_menu("database_access", view_menu_name)
-        if not db_pvm:
-            logger.warning(
-                "Could not find previous database permission %s",
-                view_menu_name,
-            )
-            return
-        connection.execute(
-            permission_view_menu_table.delete().where(
-                permission_view_menu_table.c.id == db_pvm.id
-            )
+        self._delete_pvm_on_sqla_event(
+            mapper, connection, "database_access", view_menu_name
         )
-        self.on_permission_after_delete(mapper, connection, db_pvm)
-        connection.execute(
-            view_menu_table.delete().where(view_menu_table.c.id == db_pvm.view_menu_id)
-        )
-
         # Clean database schema permissions
         schema_pvms = (
             self.get_session.query(self.permissionview_model)
@@ -987,17 +1047,7 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
             .all()
         )
         for schema_pvm in schema_pvms:
-            connection.execute(
-                permission_view_menu_table.delete().where(
-                    permission_view_menu_table.c.id == schema_pvm.id
-                )
-            )
-            self.on_permission_after_delete(mapper, connection, schema_pvm)
-            connection.execute(
-                view_menu_table.delete().where(
-                    view_menu_table.c.id == schema_pvm.view_menu_id
-                )
-            )
+            self._delete_pvm_on_sqla_event(mapper, connection, pvm=schema_pvm)
 
     def _update_vm_database_access(
         self,
@@ -1006,6 +1056,15 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
         old_database_name: str,
         target: "Database",
     ) -> Optional[ViewMenu]:
+        """
+        Helper method that updates the database access permission
+        when a database name changes.
+
+        :param connection: Current connection (called on SQLAlchemy event listener scope)
+        :param old_database_name: the old database name
+        :param target: The database object
+        :return: The updated view menu or None
+        """
         view_menu_table = self.viewmenu_model.__table__  # pylint: disable=no-member
         new_database_name = target.database_name
         old_view_menu_name = self.get_database_perm(target.id, old_database_name)
@@ -1016,6 +1075,9 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
                 "Could not find previous database permission %s",
                 old_view_menu_name,
             )
+            self._insert_pvm_on_sqla_event(
+                mapper, connection, "database_access", new_view_menu_name
+            )
             return None
         new_updated_pvm = self.find_permission_view_menu(
             "database_access", new_view_menu_name
@@ -1047,11 +1109,12 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
         target: "Database",
     ) -> List[ViewMenu]:
         """
-        Updates all datasource access permission when a database name changes
+        Helper method that updates all datasource access permissions
+        when a database name changes.
 
         :param connection: Current connection (called on SQLAlchemy event listener scope)
         :param old_database_name: the old database name
-        :param target: The new database name
+        :param target: The database object
         :return: A list of changed view menus (permission resource names)
         """
         from superset.connectors.sqla.models import (  # pylint: disable=import-outside-toplevel
@@ -1086,6 +1149,9 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
                 .where(view_menu_table.c.name == old_dataset_vm_name)
                 .values(name=new_dataset_vm_name)
             )
+            # Refresh the changed view menu after the update
+            new_dataset_view_menu = self.find_view_menu(new_dataset_vm_name)
+
             # Update dataset (SqlaTable perm field)
             connection.execute(
                 sqlatable_table.update()
@@ -1102,83 +1168,417 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
                 .values(perm=new_dataset_vm_name)
             )
             self.on_view_menu_after_update(mapper, connection, new_dataset_view_menu)
-            updated_view_menus.append(self.find_view_menu(new_dataset_view_menu))
+            updated_view_menus.append(new_dataset_view_menu)
         return updated_view_menus
 
-    def database_after_update(
+    def dataset_after_insert(
         self,
         mapper: Mapper,
         connection: Connection,
-        target: "Database",
+        target: "SqlaTable",
     ) -> None:
-        # Check if database name has changed
+        """
+        Handles permission creation when a dataset is inserted.
+        Triggered by a SQLAlchemy after_insert event.
+
+        We need to create:
+         - The dataset PVM and set local and schema perm
+
+        :param mapper: The SQLA mapper
+        :param connection: The SQLA connection
+        :param target: The changed dataset object
+        :return:
+        """
+        try:
+            dataset_perm = target.get_perm()
+        except DatasetInvalidPermissionEvaluationException:
+            logger.warning("Dataset has no database, refusing to set permission")
+            return
+        dataset_table = target.__table__
+
+        self._insert_pvm_on_sqla_event(
+            mapper, connection, "datasource_access", dataset_perm
+        )
+        if target.perm != dataset_perm:
+            target.perm = dataset_perm
+            connection.execute(
+                dataset_table.update()
+                .where(dataset_table.c.id == target.id)
+                .values(perm=dataset_perm)
+            )
+
+        if target.schema:
+            dataset_schema_perm = self.get_schema_perm(
+                target.database.database_name, target.schema
+            )
+            self._insert_pvm_on_sqla_event(
+                mapper, connection, "schema_access", dataset_schema_perm
+            )
+            target.schema_perm = dataset_schema_perm
+            connection.execute(
+                dataset_table.update()
+                .where(dataset_table.c.id == target.id)
+                .values(schema_perm=dataset_schema_perm)
+            )
+
+    def dataset_after_delete(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        target: "SqlaTable",
+    ) -> None:
+        """
+        Handles permissions update when a dataset is deleted.
+        Triggered by a SQLAlchemy after_delete event.
+
+        We need to delete:
+         - The dataset PVM
+
+        :param mapper: The SQLA mapper
+        :param connection: The SQLA connection
+        :param target: The changed dataset object
+        :return:
+        """
+        dataset_vm_name = self.get_dataset_perm(
+            target.id, target.table_name, target.database.database_name
+        )
+        self._delete_pvm_on_sqla_event(
+            mapper, connection, "datasource_access", dataset_vm_name
+        )
+
+    def dataset_after_update(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        target: "SqlaTable",
+    ) -> None:
+        """
+        Handles all permissions update when a dataset is changed.
+        Triggered by a SQLAlchemy after_update event.
+
+        We need to update:
+         - The dataset PVM and local perm
+         - All charts local perm related with said datasets
+         - All charts local schema perm related with said datasets
+
+        :param mapper: The SQLA mapper
+        :param connection: The SQLA connection
+        :param target: The changed dataset object
+        :return:
+        """
+        # Check if watched fields have changed
         state = inspect(target)
-        history = state.get_history("database_name", True)
-        if not history.has_changes() or not history.deleted:
+        history_database = state.get_history("database_id", True)
+        history_table_name = state.get_history("table_name", True)
+        history_schema = state.get_history("schema", True)
+
+        # When database name changes
+        if history_database.has_changes() and history_database.deleted:
+            new_dataset_vm_name = self.get_dataset_perm(
+                target.id, target.table_name, target.database.database_name
+            )
+            self._update_dataset_perm(
+                mapper, connection, target.perm, new_dataset_vm_name, target
+            )
+
+            # Updates schema permissions
+            new_dataset_schema_name = self.get_schema_perm(
+                target.database.database_name, target.schema
+            )
+            self._update_dataset_schema_perm(
+                mapper,
+                connection,
+                new_dataset_schema_name,
+                target,
+            )
+
+        # When table name changes
+        if history_table_name.has_changes() and history_table_name.deleted:
+            old_dataset_name = history_table_name.deleted[0]
+            new_dataset_vm_name = self.get_dataset_perm(
+                target.id, target.table_name, target.database.database_name
+            )
+            old_dataset_vm_name = self.get_dataset_perm(
+                target.id, old_dataset_name, target.database.database_name
+            )
+            self._update_dataset_perm(
+                mapper, connection, old_dataset_vm_name, new_dataset_vm_name, target
+            )
+
+        # When schema changes
+        if history_schema.has_changes() and history_schema.deleted:
+            new_dataset_schema_name = self.get_schema_perm(
+                target.database.database_name, target.schema
+            )
+            self._update_dataset_schema_perm(
+                mapper,
+                connection,
+                new_dataset_schema_name,
+                target,
+            )
+
+    def _update_dataset_schema_perm(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        new_schema_permission_name: Optional[str],
+        target: "SqlaTable",
+    ) -> None:
+        """
+        Helper method called by SQLAlchemy events on datasets to apply
+        a new schema permission name, propagating the change to datasets and charts.
+
+        If the new schema permission name does not already have a PVM,
+        creates a new one.
+
+        :param mapper: The SQLA event mapper
+        :param connection: The SQLA connection
+        :param new_schema_permission_name: The new schema permission name that changed
+        :param target: Dataset that was updated
+        :return:
+        """
+        from superset.connectors.sqla.models import (  # pylint: disable=import-outside-toplevel
+            SqlaTable,
+        )
+        from superset.models.slice import (  # pylint: disable=import-outside-toplevel
+            Slice,
+        )
+
+        sqlatable_table = SqlaTable.__table__  # pylint: disable=no-member
+        chart_table = Slice.__table__  # pylint: disable=no-member
+
+        # insert new schema PVM if it does not exist
+        self._insert_pvm_on_sqla_event(
+            mapper, connection, "schema_access", new_schema_permission_name
+        )
+
+        # Update dataset (SqlaTable schema_perm field)
+        connection.execute(
+            sqlatable_table.update()
+            .where(
+                sqlatable_table.c.id == target.id,
+            )
+            .values(schema_perm=new_schema_permission_name)
+        )
+
+        # Update charts (Slice schema_perm field)
+        connection.execute(
+            chart_table.update()
+            .where(
+                chart_table.c.datasource_id == target.id,
+                chart_table.c.datasource_type == DatasourceType.TABLE,
+            )
+            .values(schema_perm=new_schema_permission_name)
+        )
+
+    def _update_dataset_perm(  # pylint: disable=too-many-arguments
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        old_permission_name: Optional[str],
+        new_permission_name: Optional[str],
+        target: "SqlaTable",
+    ) -> None:
+        """
+        Helper method called by SQLAlchemy events on datasets to apply
+        a permission name change, propagating it to the VM, datasets and charts.
+
+        :param mapper: The SQLA event mapper
+        :param connection: The SQLA connection
+        :param old_permission_name: The old permission view menu name
+        :param new_permission_name: The new permission view menu name
+        :param target: Dataset that was updated
+        :return:
+        """
+        from superset.connectors.sqla.models import (  # pylint: disable=import-outside-toplevel
+            SqlaTable,
+        )
+        from superset.models.slice import (  # pylint: disable=import-outside-toplevel
+            Slice,
+        )
+
+        view_menu_table = self.viewmenu_model.__table__  # pylint: disable=no-member
+        sqlatable_table = SqlaTable.__table__  # pylint: disable=no-member
+        chart_table = Slice.__table__  # pylint: disable=no-member
+
+        new_dataset_view_menu = self.find_view_menu(new_permission_name)
+        if new_dataset_view_menu:
             return
+        # Update VM
+        connection.execute(
+            view_menu_table.update()
+            .where(view_menu_table.c.name == old_permission_name)
+            .values(name=new_permission_name)
+        )
+        # VM changed, so call hook
+        new_dataset_view_menu = self.find_view_menu(new_permission_name)
+        self.on_view_menu_after_update(mapper, connection, new_dataset_view_menu)
+        # Update dataset (SqlaTable perm field)
+        connection.execute(
+            sqlatable_table.update()
+            .where(
+                sqlatable_table.c.id == target.id,
+            )
+            .values(perm=new_permission_name)
+        )
+        # Update charts (Slice perm field)
+        connection.execute(
+            chart_table.update()
+            .where(
+                chart_table.c.datasource_type == DatasourceType.TABLE,
+                chart_table.c.datasource_id == target.id,
+            )
+            .values(perm=new_permission_name)
+        )
 
-        old_database_name = history.deleted[0]
-        # update database access permission
-        self._update_vm_database_access(mapper, connection, old_database_name, target)
-        # update datasource access
-        self._update_vm_datasources_access(
-            mapper, connection, old_database_name, target
+    def _delete_pvm_on_sqla_event(  # pylint: disable=too-many-arguments
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        permission_name: Optional[str] = None,
+        view_menu_name: Optional[str] = None,
+        pvm: Optional[PermissionView] = None,
+    ) -> None:
+        """
+        Helper method that is called by SQLAlchemy events.
+        Deletes a PVM.
+
+        :param mapper: The SQLA event mapper
+        :param connection: The SQLA connection
+        :param permission_name: e.g.: datasource_access, schema_access
+        :param view_menu_name: e.g. [db1].[public]
+        :param pvm: Can be called with the actual PVM already
+        :return:
+        """
+        view_menu_table = self.viewmenu_model.__table__  # pylint: disable=no-member
+        permission_view_menu_table = (
+            self.permissionview_model.__table__  # pylint: disable=no-member
         )
 
-    def on_view_menu_after_update(
-        self, mapper: Mapper, connection: Connection, target: ViewMenu
+        if not pvm:
+            pvm = self.find_permission_view_menu(permission_name, view_menu_name)
+        if not pvm:
+            return
+        # Delete Any Role to PVM association
+        connection.execute(
+            assoc_permissionview_role.delete().where(
+                assoc_permissionview_role.c.permission_view_id == pvm.id
+            )
+        )
+        # Delete the database access PVM
+        connection.execute(
+            permission_view_menu_table.delete().where(
+                permission_view_menu_table.c.id == pvm.id
+            )
+        )
+        self.on_permission_view_after_delete(mapper, connection, pvm)
+        connection.execute(
+            view_menu_table.delete().where(view_menu_table.c.id == pvm.view_menu_id)
+        )
+
+    def _insert_pvm_on_sqla_event(
+        self,
+        mapper: Mapper,
+        connection: Connection,
+        permission_name: str,
+        view_menu_name: Optional[str],
     ) -> None:
         """
-        Hook that allows for further custom operations when a new ViewMenu
-        is updated
+        Helper method that is called by SQLAlchemy events.
+        Inserts a new PVM (if it does not already exist).
 
-        Since the update may be performed on after_update event. We cannot
-        update ViewMenus using a session, so any SQLAlchemy events hooked to
-        `ViewMenu` will not trigger an after_update.
+        :param mapper: The SQLA event mapper
+        :param connection: The SQLA connection
+        :param permission_name: e.g.: datasource_access, schema_access
+        :param view_menu_name: e.g. [db1].[public]
+        :return:
+        """
+        permission_table = self.permission_model.__table__  # pylint: disable=no-member
+        view_menu_table = self.viewmenu_model.__table__  # pylint: disable=no-member
+        permission_view_table = (
+            self.permissionview_model.__table__  # pylint: disable=no-member
+        )
+        if not view_menu_name:
+            return
+        pvm = self.find_permission_view_menu(permission_name, view_menu_name)
+        if pvm:
+            return
+        permission = self.find_permission(permission_name)
+        view_menu = self.find_view_menu(view_menu_name)
+        if not permission:
+            connection.execute(permission_table.insert().values(name=permission_name))
+            permission = self.find_permission(permission_name)
+            self.on_permission_after_insert(mapper, connection, permission)
+        if not view_menu:
+            connection.execute(view_menu_table.insert().values(name=view_menu_name))
+            view_menu = self.find_view_menu(view_menu_name)
+            self.on_view_menu_after_insert(mapper, connection, view_menu)
+        connection.execute(
+            permission_view_table.insert().values(
+                permission_id=permission.id, view_menu_id=view_menu.id
+            )
+        )
+        permission = self.find_permission_view_menu(permission_name, view_menu_name)
+        self.on_permission_view_after_insert(mapper, connection, permission)
+
+    def on_role_after_update(
+        self, mapper: Mapper, connection: Connection, target: Role
+    ) -> None:
+        """
+        Hook that allows for further custom operations when a Role
+        is updated by SQLAlchemy events.
+
+        On SQLAlchemy after_update events, we cannot
+        update Roles using a session, so any SQLAlchemy events hooked to
+        `Role` will not trigger an after_update.
 
         :param mapper: The table mapper
         :param connection: The DB-API connection
-        :param target: The mapped instance being persisted
+        :param target: The mapped instance being changed
         """
 
-    def on_permission_after_delete(
-        self, mapper: Mapper, connection: Connection, target: Permission
+    def on_view_menu_after_insert(
+        self, mapper: Mapper, connection: Connection, target: ViewMenu
     ) -> None:
         """
-        Hook that allows for further custom operations when a permission
-        is deleted by sqlalchemy events.
+        Hook that allows for further custom operations when a new ViewMenu
+        is created by SQLAlchemy events.
+
+        On SQLAlchemy after_insert events, we cannot
+        create new view_menu's using a session, so any SQLAlchemy events hooked to
+        `ViewMenu` will not trigger an after_insert.
 
         :param mapper: The table mapper
         :param connection: The DB-API connection
         :param target: The mapped instance being persisted
         """
 
-    def on_permission_after_insert(
-        self, mapper: Mapper, connection: Connection, target: Permission
+    def on_view_menu_after_update(
+        self, mapper: Mapper, connection: Connection, target: ViewMenu
     ) -> None:
         """
-        Hook that allows for further custom operations when a new permission
-        is created by set_perm.
+        Hook that allows for further custom operations when a ViewMenu
+        is updated.
 
-        Since set_perm is executed by SQLAlchemy after_insert events, we cannot
-        create new permissions using a session, so any SQLAlchemy events hooked to
-        `Permission` will not trigger an after_insert.
+        Since the update may be performed on an after_update event, we cannot
+        update ViewMenus using a session, so any SQLAlchemy events hooked to
+        `ViewMenu` will not trigger an after_update.
 
         :param mapper: The table mapper
         :param connection: The DB-API connection
         :param target: The mapped instance being persisted
         """
 
-    def on_view_menu_after_insert(
-        self, mapper: Mapper, connection: Connection, target: ViewMenu
+    def on_permission_after_insert(
+        self, mapper: Mapper, connection: Connection, target: Permission
     ) -> None:
         """
-        Hook that allows for further custom operations when a new ViewMenu
+        Hook that allows for further custom operations when a new permission
         is created by set_perm.
 
         Since set_perm is executed by SQLAlchemy after_insert events, we cannot
-        create new view_menu's using a session, so any SQLAlchemy events hooked to
-        `ViewMenu` will not trigger an after_insert.
+        create new permissions using a session, so any SQLAlchemy events hooked to
+        `Permission` will not trigger an after_insert.
 
         :param mapper: The table mapper
         :param connection: The DB-API connection
@@ -1190,9 +1590,9 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
     ) -> None:
         """
         Hook that allows for further custom operations when a new PermissionView
-        is created by set_perm.
+        is created by SQLAlchemy events.
 
-        Since set_perm is executed by SQLAlchemy after_insert events, we cannot
+        On SQLAlchemy after_insert events, we cannot
         create new pvms using a session, so any SQLAlchemy events hooked to
         `PermissionView` will not trigger an after_insert.
 
@@ -1201,86 +1601,21 @@ class SupersetSecurityManager(  # pylint: disable=too-many-public-methods
         :param target: The mapped instance being persisted
         """
 
-    def set_perm(
-        self, mapper: Mapper, connection: Connection, target: "BaseDatasource"
+    def on_permission_view_after_delete(
+        self, mapper: Mapper, connection: Connection, target: PermissionView
     ) -> None:
         """
-        Set the datasource permissions.
+        Hook that allows for further custom operations when a PermissionView
+        is deleted by SQLAlchemy events.
+
+        On SQLAlchemy after_delete events, we cannot
+        delete pvms using a session, so any SQLAlchemy events hooked to
+        `PermissionView` will not trigger an after_delete.
 
         :param mapper: The table mapper
         :param connection: The DB-API connection
         :param target: The mapped instance being persisted
         """
-        try:
-            target_get_perm = target.get_perm()
-        except DatasetInvalidPermissionEvaluationException:
-            logger.warning("Dataset has no database refusing to set permission")
-            return
-        link_table = target.__table__
-        if target.perm != target_get_perm:
-            connection.execute(
-                link_table.update()
-                .where(link_table.c.id == target.id)
-                .values(perm=target_get_perm)
-            )
-            target.perm = target_get_perm
-
-        # check schema perm for datasets
-        if (
-            hasattr(target, "schema_perm")
-            and target.schema_perm != target.get_schema_perm()
-        ):
-            connection.execute(
-                link_table.update()
-                .where(link_table.c.id == target.id)
-                .values(schema_perm=target.get_schema_perm())
-            )
-            target.schema_perm = target.get_schema_perm()
-
-        pvm_names = []
-        if target.__tablename__ in {"dbs", "clusters"}:
-            pvm_names.append(("database_access", target_get_perm))
-        else:
-            pvm_names.append(("datasource_access", target_get_perm))
-            if target.schema:
-                pvm_names.append(("schema_access", target.get_schema_perm()))
-
-        # TODO(bogdan): modify slice permissions as well.
-        for permission_name, view_menu_name in pvm_names:
-            permission = self.find_permission(permission_name)
-            view_menu = self.find_view_menu(view_menu_name)
-            pv = None
-
-            if not permission:
-                permission_table = (
-                    self.permission_model.__table__  # pylint: disable=no-member
-                )
-                connection.execute(
-                    permission_table.insert().values(name=permission_name)
-                )
-                permission = self.find_permission(permission_name)
-            if not view_menu:
-                view_menu_table = (
-                    self.viewmenu_model.__table__  # pylint: disable=no-member
-                )
-                connection.execute(view_menu_table.insert().values(name=view_menu_name))
-                view_menu = self.find_view_menu(view_menu_name)
-
-            if permission and view_menu:
-                pv = (
-                    self.get_session.query(self.permissionview_model)
-                    .filter_by(permission=permission, view_menu=view_menu)
-                    .first()
-                )
-            if not pv and permission and view_menu:
-                permission_view_table = (
-                    self.permissionview_model.__table__  # pylint: disable=no-member
-                )
-                connection.execute(
-                    permission_view_table.insert().values(
-                        permission_id=permission.id, view_menu_id=view_menu.id
-                    )
-                )
 
     def raise_for_access(
         # pylint: disable=too-many-arguments,too-many-locals
diff --git a/tests/integration_tests/datasets/api_tests.py b/tests/integration_tests/datasets/api_tests.py
index b1767bddad..79fb9f1606 100644
--- a/tests/integration_tests/datasets/api_tests.py
+++ b/tests/integration_tests/datasets/api_tests.py
@@ -242,6 +242,18 @@ class TestDatasetApi(SupersetTestCase):
         """
         Dataset API: Test get dataset related databases gamma
         """
+        if backend() == "sqlite":
+            return
+
+        # Add main database access to gamma role
+        main_db = get_main_database()
+        main_db_pvm = security_manager.find_permission_view_menu(
+            "database_access", main_db.perm
+        )
+        gamma_role = security_manager.find_role("Gamma")
+        gamma_role.permissions.append(main_db_pvm)
+        db.session.commit()
+
         self.login(username="gamma")
         uri = "api/v1/dataset/related/database"
         rv = self.client.get(uri)
@@ -252,6 +264,10 @@ class TestDatasetApi(SupersetTestCase):
         main_db = get_main_database()
         assert filter(lambda x: x.text == main_db, response["result"]) != []
 
+        # revert gamma permission
+        gamma_role.permissions.remove(main_db_pvm)
+        db.session.commit()
+
     @pytest.mark.usefixtures("load_energy_table_with_slice")
     def test_get_dataset_item(self):
         """
diff --git a/tests/integration_tests/security_tests.py b/tests/integration_tests/security_tests.py
index 25d946f9e5..523a26adf0 100644
--- a/tests/integration_tests/security_tests.py
+++ b/tests/integration_tests/security_tests.py
@@ -32,6 +32,9 @@ from flask import current_app
 from superset.models.dashboard import Dashboard
 
 from superset import app, appbuilder, db, security_manager, viz, ConnectorRegistry
+from flask_appbuilder.security.sqla.models import Role
 from superset.connectors.sqla.models import SqlaTable
 from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
 from superset.exceptions import SupersetSecurityException
@@ -155,125 +158,93 @@ class TestRolePermission(SupersetTestCase):
         session.delete(security_manager.find_role(SCHEMA_ACCESS_ROLE))
         session.commit()
 
-    def test_set_perm_sqla_table(self):
+    def test_after_insert_dataset(self):
+        security_manager.on_view_menu_after_insert = Mock()
+        security_manager.on_permission_view_after_insert = Mock()
+
         session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+
         table = SqlaTable(
             schema="tmp_schema",
             table_name="tmp_perm_table",
-            database=get_example_database(),
+            database=tmp_db1,
         )
         session.add(table)
         session.commit()
 
-        stored_table = (
-            session.query(SqlaTable).filter_by(table_name="tmp_perm_table").one()
-        )
-        self.assertEqual(
-            stored_table.perm, f"[examples].[tmp_perm_table](id:{stored_table.id})"
-        )
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "datasource_access", stored_table.perm
-            )
-        )
-        self.assertEqual(stored_table.schema_perm, "[examples].[tmp_schema]")
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "schema_access", stored_table.schema_perm
-            )
-        )
+        table = session.query(SqlaTable).filter_by(table_name="tmp_perm_table").one()
+        self.assertEqual(table.perm, f"[tmp_db1].[tmp_perm_table](id:{table.id})")
 
-        # table name change
-        stored_table.table_name = "tmp_perm_table_v2"
-        session.commit()
-        stored_table = (
-            session.query(SqlaTable).filter_by(table_name="tmp_perm_table_v2").one()
-        )
-        self.assertEqual(
-            stored_table.perm, f"[examples].[tmp_perm_table_v2](id:{stored_table.id})"
-        )
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "datasource_access", stored_table.perm
-            )
+        pvm_dataset = security_manager.find_permission_view_menu(
+            "datasource_access", table.perm
         )
-        # no changes in schema
-        self.assertEqual(stored_table.schema_perm, "[examples].[tmp_schema]")
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "schema_access", stored_table.schema_perm
-            )
+        pvm_schema = security_manager.find_permission_view_menu(
+            "schema_access", table.schema_perm
         )
 
-        # schema name change
-        stored_table.schema = "tmp_schema_v2"
-        session.commit()
-        stored_table = (
-            session.query(SqlaTable).filter_by(table_name="tmp_perm_table_v2").one()
-        )
-        self.assertEqual(
-            stored_table.perm, f"[examples].[tmp_perm_table_v2](id:{stored_table.id})"
+        # Assert dataset permission is created and local perms are ok
+        self.assertIsNotNone(pvm_dataset)
+        self.assertEqual(table.perm, f"[tmp_db1].[tmp_perm_table](id:{table.id})")
+        self.assertEqual(table.schema_perm, "[tmp_db1].[tmp_schema]")
+        self.assertIsNotNone(pvm_schema)
+
+        # assert on permission hooks
+        view_menu_dataset = security_manager.find_view_menu(
+            f"[tmp_db1].[tmp_perm_table](id:{table.id})"
         )
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "datasource_access", stored_table.perm
-            )
+        view_menu_schema = security_manager.find_view_menu("[tmp_db1].[tmp_schema]")
+        security_manager.on_view_menu_after_insert.assert_has_calls(
+            [
+                call(ANY, ANY, view_menu_dataset),
+                call(ANY, ANY, view_menu_schema),
+            ]
         )
-        # no changes in schema
-        self.assertEqual(stored_table.schema_perm, "[examples].[tmp_schema_v2]")
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "schema_access", stored_table.schema_perm
-            )
+        security_manager.on_permission_view_after_insert.assert_has_calls(
+            [
+                call(ANY, ANY, pvm_dataset),
+                call(ANY, ANY, pvm_schema),
+            ]
         )
 
-        # database change
-        new_db = Database(sqlalchemy_uri="sqlite://", database_name="tmp_db")
-        session.add(new_db)
-        stored_table.database = (
-            session.query(Database).filter_by(database_name="tmp_db").one()
-        )
+        # Cleanup
+        session.delete(table)
+        session.delete(tmp_db1)
         session.commit()
-        stored_table = (
-            session.query(SqlaTable).filter_by(table_name="tmp_perm_table_v2").one()
-        )
-        self.assertEqual(
-            stored_table.perm, f"[tmp_db].[tmp_perm_table_v2](id:{stored_table.id})"
-        )
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "datasource_access", stored_table.perm
-            )
-        )
-        # no changes in schema
-        self.assertEqual(stored_table.schema_perm, "[tmp_db].[tmp_schema_v2]")
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "schema_access", stored_table.schema_perm
-            )
-        )
 
-        # no schema
-        stored_table.schema = None
+    def test_after_insert_dataset_rollback(self):
+        session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
         session.commit()
-        stored_table = (
-            session.query(SqlaTable).filter_by(table_name="tmp_perm_table_v2").one()
+
+        table = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table",
+            database=tmp_db1,
         )
-        self.assertEqual(
-            stored_table.perm, f"[tmp_db].[tmp_perm_table_v2](id:{stored_table.id})"
+        session.add(table)
+        session.flush()
+
+        pvm_dataset = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table](id:{table.id})"
         )
-        self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "datasource_access", stored_table.perm
-            )
+        self.assertIsNotNone(pvm_dataset)
+        table_id = table.id
+        session.rollback()
+
+        table = session.query(SqlaTable).filter_by(table_name="tmp_table").one_or_none()
+        self.assertIsNone(table)
+        pvm_dataset = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table](id:{table_id})"
         )
-        self.assertIsNone(stored_table.schema_perm)
+        self.assertIsNone(pvm_dataset)
 
-        session.delete(new_db)
-        session.delete(stored_table)
+        session.delete(tmp_db1)
         session.commit()
 
-    def test_set_perm_sqla_table_none(self):
+    def test_after_insert_dataset_table_none(self):
         session = db.session
         table = SqlaTable(
             schema="tmp_schema",
@@ -299,126 +270,197 @@ class TestRolePermission(SupersetTestCase):
                 "datasource_access", f"[None].[tmp_perm_table](id:{stored_table.id})"
             )
         )
+
+        # Cleanup
         session.delete(table)
         session.commit()
 
-    def test_set_perm_database(self):
+    def test_after_insert_database(self):
+        security_manager.on_permission_view_after_insert = Mock()
+
         session = db.session
-        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
-        session.add(database)
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+
+        tmp_db1 = session.query(Database).filter_by(database_name="tmp_db1").one()
+        self.assertEqual(tmp_db1.perm, f"[tmp_db1].(id:{tmp_db1.id})")
+        tmp_db1_pvm = security_manager.find_permission_view_menu(
+            "database_access", tmp_db1.perm
+        )
+        self.assertIsNotNone(tmp_db1_pvm)
+
+        # Assert the hook is called
+        security_manager.on_permission_view_after_insert.assert_has_calls(
+            [
+                call(ANY, ANY, tmp_db1_pvm),
+            ]
+        )
+        session.delete(tmp_db1)
+        session.commit()
+
+    def test_after_insert_database_rollback(self):
+        session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+        session.flush()
 
-        stored_db = (
-            session.query(Database).filter_by(database_name="tmp_database").one()
+        pvm_database = security_manager.find_permission_view_menu(
+            "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
         )
-        self.assertEqual(stored_db.perm, f"[tmp_database].(id:{stored_db.id})")
+        self.assertIsNotNone(pvm_database)
+        session.rollback()
+
+        pvm_database = security_manager.find_permission_view_menu(
+        "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
+        )
+        self.assertIsNone(pvm_database)
+
+    def test_after_update_database__perm_database_access(self):
+        security_manager.on_view_menu_after_update = Mock()
+
+        session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+        session.commit()
+        tmp_db1 = session.query(Database).filter_by(database_name="tmp_db1").one()
+
         self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "database_access", stored_db.perm
-            )
+            security_manager.find_permission_view_menu("database_access", tmp_db1.perm)
         )
 
-        stored_db.database_name = "tmp_database2"
+        tmp_db1.database_name = "tmp_db2"
         session.commit()
-        stored_db = (
-            session.query(Database).filter_by(database_name="tmp_database2").one()
+
+        # Assert that the old permission was updated
+        self.assertIsNone(
+            security_manager.find_permission_view_menu(
+                "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
+            )
         )
-        self.assertEqual(stored_db.perm, f"[tmp_database2].(id:{stored_db.id})")
+        # Assert that the db permission was updated
         self.assertIsNotNone(
             security_manager.find_permission_view_menu(
-                "database_access", stored_db.perm
+                "database_access", f"[tmp_db2].(id:{tmp_db1.id})"
             )
         )
 
-        session.delete(stored_db)
+        # Assert the hook is called
+        tmp_db1_view_menu = security_manager.find_view_menu(
+            f"[tmp_db2].(id:{tmp_db1.id})"
+        )
+        security_manager.on_view_menu_after_update.assert_has_calls(
+            [
+                call(ANY, ANY, tmp_db1_view_menu),
+            ]
+        )
+
+        session.delete(tmp_db1)
         session.commit()
 
-    def test_after_update_database__perm_database_access(self):
+    def test_after_update_database_rollback(self):
         session = db.session
-        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
-        session.add(database)
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
         session.commit()
-        stored_db = (
-            session.query(Database).filter_by(database_name="tmp_database").one()
-        )
+        tmp_db1 = session.query(Database).filter_by(database_name="tmp_db1").one()
 
         self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "database_access", stored_db.perm
-            )
+            security_manager.find_permission_view_menu("database_access", tmp_db1.perm)
         )
 
-        stored_db.database_name = "tmp_database2"
-        session.commit()
+        tmp_db1.database_name = "tmp_db2"
+        session.flush()
 
         # Assert that the old permission was updated
         self.assertIsNone(
             security_manager.find_permission_view_menu(
-                "database_access", f"[tmp_database].(id:{stored_db.id})"
+                "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
             )
         )
         # Assert that the db permission was updated
         self.assertIsNotNone(
             security_manager.find_permission_view_menu(
-                "database_access", f"[tmp_database2].(id:{stored_db.id})"
+                "database_access", f"[tmp_db2].(id:{tmp_db1.id})"
             )
         )
-        session.delete(stored_db)
+
+        session.rollback()
+        self.assertIsNotNone(
+            security_manager.find_permission_view_menu(
+                "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
+            )
+        )
+        # Assert that the new permission was removed by the rollback
+        self.assertIsNone(
+            security_manager.find_permission_view_menu(
+                "database_access", f"[tmp_db2].(id:{tmp_db1.id})"
+            )
+        )
+
+        session.delete(tmp_db1)
         session.commit()
 
     def test_after_update_database__perm_database_access_exists(self):
+        security_manager.on_permission_view_after_delete = Mock()
+
         session = db.session
         # Add a bogus existing permission before the change
 
-        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
-        session.add(database)
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
         session.commit()
-        stored_db = (
-            session.query(Database).filter_by(database_name="tmp_database").one()
-        )
+        tmp_db1 = session.query(Database).filter_by(database_name="tmp_db1").one()
         security_manager.add_permission_view_menu(
-            "database_access", f"[tmp_database2].(id:{stored_db.id})"
+            "database_access", f"[tmp_db2].(id:{tmp_db1.id})"
         )
 
         self.assertIsNotNone(
-            security_manager.find_permission_view_menu(
-                "database_access", stored_db.perm
-            )
+            security_manager.find_permission_view_menu("database_access", tmp_db1.perm)
         )
 
-        stored_db.database_name = "tmp_database2"
+        tmp_db1.database_name = "tmp_db2"
         session.commit()
 
         # Assert that the old permission was updated
         self.assertIsNone(
             security_manager.find_permission_view_menu(
-                "database_access", f"[tmp_database].(id:{stored_db.id})"
+                "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
             )
         )
         # Assert that the db permission was updated
         self.assertIsNotNone(
             security_manager.find_permission_view_menu(
-                "database_access", f"[tmp_database2].(id:{stored_db.id})"
+                "database_access", f"[tmp_db2].(id:{tmp_db1.id})"
             )
         )
-        session.delete(stored_db)
+
+        security_manager.on_permission_view_after_delete.assert_has_calls(
+            [
+                call(ANY, ANY, ANY),
+            ]
+        )
+
+        session.delete(tmp_db1)
         session.commit()
 
     def test_after_update_database__perm_datasource_access(self):
+        security_manager.on_view_menu_after_update = Mock()
+
         session = db.session
-        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
-        session.add(database)
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
         session.commit()
 
         table1 = SqlaTable(
             schema="tmp_schema",
             table_name="tmp_table1",
-            database=database,
+            database=tmp_db1,
         )
         session.add(table1)
         table2 = SqlaTable(
             schema="tmp_schema",
             table_name="tmp_table2",
-            database=database,
+            database=tmp_db1,
         )
         session.add(table2)
         session.commit()
@@ -437,81 +479,633 @@ class TestRolePermission(SupersetTestCase):
         # assert initial perms
         self.assertIsNotNone(
             security_manager.find_permission_view_menu(
-                "datasource_access", f"[tmp_database].[tmp_table1](id:{table1.id})"
+                "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
             )
         )
         self.assertIsNotNone(
             security_manager.find_permission_view_menu(
-                "datasource_access", f"[tmp_database].[tmp_table2](id:{table2.id})"
+                "datasource_access", f"[tmp_db1].[tmp_table2](id:{table2.id})"
             )
         )
-        self.assertEqual(slice1.perm, f"[tmp_database].[tmp_table1](id:{table1.id})")
-        self.assertEqual(table1.perm, f"[tmp_database].[tmp_table1](id:{table1.id})")
-        self.assertEqual(table2.perm, f"[tmp_database].[tmp_table2](id:{table2.id})")
+        self.assertEqual(slice1.perm, f"[tmp_db1].[tmp_table1](id:{table1.id})")
+        self.assertEqual(table1.perm, f"[tmp_db1].[tmp_table1](id:{table1.id})")
+        self.assertEqual(table2.perm, f"[tmp_db1].[tmp_table2](id:{table2.id})")
 
-        stored_db = (
-            session.query(Database).filter_by(database_name="tmp_database").one()
-        )
-        stored_db.database_name = "tmp_database2"
+        # Refresh and update the database name
+        tmp_db1 = session.query(Database).filter_by(database_name="tmp_db1").one()
+        tmp_db1.database_name = "tmp_db2"
         session.commit()
 
         # Assert that the old permissions were updated
         self.assertIsNone(
             security_manager.find_permission_view_menu(
-                "datasource_access", f"[tmp_database].[tmp_table1](id:{table1.id})"
+                "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
             )
         )
         self.assertIsNone(
             security_manager.find_permission_view_menu(
-                "datasource_access", f"[tmp_database].[tmp_table2](id:{table2.id})"
+                "datasource_access", f"[tmp_db1].[tmp_table2](id:{table2.id})"
             )
         )
 
         # Assert that the db permission was updated
         self.assertIsNotNone(
             security_manager.find_permission_view_menu(
-                "datasource_access", f"[tmp_database2].[tmp_table1](id:{table1.id})"
+                "datasource_access", f"[tmp_db2].[tmp_table1](id:{table1.id})"
             )
         )
         self.assertIsNotNone(
             security_manager.find_permission_view_menu(
-                "datasource_access", f"[tmp_database2].[tmp_table2](id:{table2.id})"
+                "datasource_access", f"[tmp_db2].[tmp_table2](id:{table2.id})"
             )
         )
-        self.assertEqual(slice1.perm, f"[tmp_database2].[tmp_table1](id:{table1.id})")
-        self.assertEqual(table1.perm, f"[tmp_database2].[tmp_table1](id:{table1.id})")
-        self.assertEqual(table2.perm, f"[tmp_database2].[tmp_table2](id:{table2.id})")
+        self.assertEqual(slice1.perm, f"[tmp_db2].[tmp_table1](id:{table1.id})")
+        self.assertEqual(table1.perm, f"[tmp_db2].[tmp_table1](id:{table1.id})")
+        self.assertEqual(table2.perm, f"[tmp_db2].[tmp_table2](id:{table2.id})")
+
+        # Assert hooks are called
+        tmp_db1_view_menu = security_manager.find_view_menu(
+            f"[tmp_db2].(id:{tmp_db1.id})"
+        )
+        table1_view_menu = security_manager.find_view_menu(
+            f"[tmp_db2].[tmp_table1](id:{table1.id})"
+        )
+        table2_view_menu = security_manager.find_view_menu(
+            f"[tmp_db2].[tmp_table2](id:{table2.id})"
+        )
+        security_manager.on_view_menu_after_update.assert_has_calls(
+            [
+                call(ANY, ANY, tmp_db1_view_menu),
+                call(ANY, ANY, table1_view_menu),
+                call(ANY, ANY, table2_view_menu),
+            ]
+        )
 
         session.delete(slice1)
         session.delete(table1)
         session.delete(table2)
-        session.delete(stored_db)
+        session.delete(tmp_db1)
         session.commit()
 
-    def test_after_delete_database__perm_database_access(self):
+    def test_after_delete_database(self):
         session = db.session
-        database = Database(database_name="tmp_database", sqlalchemy_uri="sqlite://")
-        session.add(database)
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
         session.commit()
-        stored_db = (
-            session.query(Database).filter_by(database_name="tmp_database").one()
+        tmp_db1 = session.query(Database).filter_by(database_name="tmp_db1").one()
+
+        database_pvm = security_manager.find_permission_view_menu(
+            "database_access", tmp_db1.perm
         )
+        self.assertIsNotNone(database_pvm)
+        role1 = Role(name="tmp_role1")
+        role1.permissions.append(database_pvm)
+        session.add(role1)
+        session.commit()
 
-        self.assertIsNotNone(
+        session.delete(tmp_db1)
+        session.commit()
+
+        # Assert that PVM is removed from Role
+        role1 = security_manager.find_role("tmp_role1")
+        self.assertEqual(role1.permissions, [])
+
+        # Assert that the old permission was updated
+        self.assertIsNone(
             security_manager.find_permission_view_menu(
-                "database_access", stored_db.perm
+                "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
             )
         )
-        session.delete(stored_db)
+
+        # Cleanup
+        session.delete(role1)
+        session.commit()
+
+    def test_after_delete_database_rollback(self):
+        session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+        session.commit()
+        tmp_db1 = session.query(Database).filter_by(database_name="tmp_db1").one()
+
+        database_pvm = security_manager.find_permission_view_menu(
+            "database_access", tmp_db1.perm
+        )
+        self.assertIsNotNone(database_pvm)
+        role1 = Role(name="tmp_role1")
+        role1.permissions.append(database_pvm)
+        session.add(role1)
         session.commit()
 
-        # Assert that the old permission was updated
+        session.delete(tmp_db1)
+        session.flush()
+
+        role1 = security_manager.find_role("tmp_role1")
+        self.assertEqual(role1.permissions, [])
+
         self.assertIsNone(
             security_manager.find_permission_view_menu(
-                "database_access", f"[tmp_database].(id:{stored_db.id})"
+                "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
             )
         )
 
+        session.rollback()
+
+        # Test a rollback reverts everything
+        database_pvm = security_manager.find_permission_view_menu(
+            "database_access", f"[tmp_db1].(id:{tmp_db1.id})"
+        )
+
+        role1 = security_manager.find_role("tmp_role1")
+        self.assertEqual(role1.permissions, [database_pvm])
+
+        # Cleanup
+        session.delete(role1)
+        session.delete(tmp_db1)
+        session.commit()
+
+    def test_after_delete_dataset(self):
+        security_manager.on_permission_view_after_delete = Mock()
+
+        session = db.session
+        tmp_db = Database(database_name="tmp_db", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=tmp_db,
+        )
+        session.add(table1)
+        session.commit()
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        role1 = Role(name="tmp_role1")
+        role1.permissions.append(table1_pvm)
+        session.add(role1)
+        session.commit()
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+
+        # Test delete
+        session.delete(table1)
+        session.commit()
+
+        role1 = security_manager.find_role("tmp_role1")
+        self.assertEqual(role1.permissions, [])
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNone(table1_pvm)
+        table1_view_menu = security_manager.find_view_menu(
+            f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNone(table1_view_menu)
+
+        # Assert the hook is called
+        security_manager.on_permission_view_after_delete.assert_has_calls(
+            [
+                call(ANY, ANY, ANY),
+            ]
+        )
+
+        # cleanup
+        session.delete(role1)
+        session.delete(tmp_db)
+        session.commit()
+
+    def test_after_delete_dataset_rollback(self):
+        session = db.session
+        tmp_db = Database(database_name="tmp_db", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=tmp_db,
+        )
+        session.add(table1)
+        session.commit()
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        role1 = Role(name="tmp_role1")
+        role1.permissions.append(table1_pvm)
+        session.add(role1)
+        session.commit()
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+
+        # Test delete: permissions are correctly deleted
+        session.delete(table1)
+        session.flush()
+
+        role1 = security_manager.find_role("tmp_role1")
+        self.assertEqual(role1.permissions, [])
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNone(table1_pvm)
+
+        # Test rollback: permissions exist again, everything is correctly rolled back
+        session.rollback()
+        role1 = security_manager.find_role("tmp_role1")
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+        self.assertEqual(role1.permissions, [table1_pvm])
+
+        # cleanup
+        session.delete(table1)
+        session.delete(role1)
+        session.delete(tmp_db)
+        session.commit()
+
+    def test_after_update_dataset__name_changes(self):
+        security_manager.on_view_menu_after_update = Mock()
+
+        session = db.session
+        tmp_db = Database(database_name="tmp_db", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=tmp_db,
+        )
+        session.add(table1)
+        session.commit()
+
+        slice1 = Slice(
+            datasource_id=table1.id,
+            datasource_type=DatasourceType.TABLE,
+            datasource_name="tmp_table1",
+            slice_name="tmp_slice1",
+        )
+        session.add(slice1)
+        session.commit()
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        # Test update
+        table1.table_name = "tmp_table1_changed"
+        session.commit()
+
+        # Test old permission does not exist
+        old_table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNone(old_table1_pvm)
+
+        # Test new permission exists
+        new_table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1_changed](id:{table1.id})"
+        )
+        self.assertIsNotNone(new_table1_pvm)
+
+        # test dataset permission changed
+        changed_table1 = (
+            session.query(SqlaTable).filter_by(table_name="tmp_table1_changed").one()
+        )
+        self.assertEqual(
+            changed_table1.perm, f"[tmp_db].[tmp_table1_changed](id:{table1.id})"
+        )
+
+        # Test Chart permission changed
+        slice1 = session.query(Slice).filter_by(slice_name="tmp_slice1").one()
+        self.assertEqual(slice1.perm, f"[tmp_db].[tmp_table1_changed](id:{table1.id})")
+
+        # Assert hook is called
+        view_menu_dataset = security_manager.find_view_menu(
+            f"[tmp_db].[tmp_table1_changed](id:{table1.id})"
+        )
+        security_manager.on_view_menu_after_update.assert_has_calls(
+            [
+                call(ANY, ANY, view_menu_dataset),
+            ]
+        )
+        # cleanup
+        session.delete(slice1)
+        session.delete(table1)
+        session.delete(tmp_db)
+        session.commit()
+
+    def test_after_update_dataset_rollback(self):
+        session = db.session
+        tmp_db = Database(database_name="tmp_db", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=tmp_db,
+        )
+        session.add(table1)
+        session.commit()
+
+        slice1 = Slice(
+            datasource_id=table1.id,
+            datasource_type=DatasourceType.TABLE,
+            datasource_name="tmp_table1",
+            slice_name="tmp_slice1",
+        )
+        session.add(slice1)
+        session.commit()
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        # Test update
+        table1.table_name = "tmp_table1_changed"
+        session.flush()
+
+        # Test old permission does not exist
+        old_table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNone(old_table1_pvm)
+
+        # Test new permission exists
+        new_table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1_changed](id:{table1.id})"
+        )
+        self.assertIsNotNone(new_table1_pvm)
+
+        # Test rollback
+        session.rollback()
+
+        old_table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(old_table1_pvm)
+
+        # cleanup
+        session.delete(slice1)
+        session.delete(table1)
+        session.delete(tmp_db)
+        session.commit()
+
+    def test_after_update_dataset__db_changes(self):
+        session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        tmp_db2 = Database(database_name="tmp_db2", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+        session.add(tmp_db2)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=tmp_db1,
+        )
+        session.add(table1)
+        session.commit()
+
+        slice1 = Slice(
+            datasource_id=table1.id,
+            datasource_type=DatasourceType.TABLE,
+            datasource_name="tmp_table1",
+            slice_name="tmp_slice1",
+        )
+        session.add(slice1)
+        session.commit()
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        # Test update
+        table1.database = tmp_db2
+        session.commit()
+
+        # Test old permission does not exist
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNone(table1_pvm)
+
+        # Test new permission exists
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db2].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        # Test dataset permission and schema permission changed
+        changed_table1 = (
+            session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        )
+        self.assertEqual(changed_table1.perm, f"[tmp_db2].[tmp_table1](id:{table1.id})")
+        self.assertEqual(changed_table1.schema_perm, f"[tmp_db2].[tmp_schema]")
+
+        # Test Chart permission changed
+        slice1 = session.query(Slice).filter_by(slice_name="tmp_slice1").one()
+        self.assertEqual(slice1.perm, f"[tmp_db2].[tmp_table1](id:{table1.id})")
+        self.assertEqual(slice1.schema_perm, f"[tmp_db2].[tmp_schema]")
+
+        # cleanup
+        session.delete(slice1)
+        session.delete(table1)
+        session.delete(tmp_db1)
+        session.delete(tmp_db2)
+        session.commit()
+
+    def test_after_update_dataset__schema_changes(self):
+        session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=tmp_db1,
+        )
+        session.add(table1)
+        session.commit()
+
+        slice1 = Slice(
+            datasource_id=table1.id,
+            datasource_type=DatasourceType.TABLE,
+            datasource_name="tmp_table1",
+            slice_name="tmp_slice1",
+        )
+        session.add(slice1)
+        session.commit()
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        # Test update
+        table1.schema = "tmp_schema_changed"
+        session.commit()
+
+        # Test permission still exists
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        # Test dataset schema permission changed
+        changed_table1 = (
+            session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        )
+        self.assertEqual(changed_table1.perm, f"[tmp_db1].[tmp_table1](id:{table1.id})")
+        self.assertEqual(changed_table1.schema_perm, f"[tmp_db1].[tmp_schema_changed]")
+
+        # Test Chart schema permission changed
+        slice1 = session.query(Slice).filter_by(slice_name="tmp_slice1").one()
+        self.assertEqual(slice1.perm, f"[tmp_db1].[tmp_table1](id:{table1.id})")
+        self.assertEqual(slice1.schema_perm, f"[tmp_db1].[tmp_schema_changed]")
+
+        # cleanup
+        session.delete(slice1)
+        session.delete(table1)
+        session.delete(tmp_db1)
+        session.commit()
+
+    def test_after_update_dataset__schema_none(self):
+        session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=tmp_db1,
+        )
+        session.add(table1)
+        session.commit()
+
+        slice1 = Slice(
+            datasource_id=table1.id,
+            datasource_type=DatasourceType.TABLE,
+            datasource_name="tmp_table1",
+            slice_name="tmp_slice1",
+        )
+        session.add(slice1)
+        session.commit()
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        # Test update
+        table1.schema = None
+        session.commit()
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+
+        self.assertEqual(table1.perm, f"[tmp_db1].[tmp_table1](id:{table1.id})")
+        self.assertIsNone(table1.schema_perm)
+
+        # cleanup
+        session.delete(slice1)
+        session.delete(table1)
+        session.delete(tmp_db1)
+        session.commit()
+
+    def test_after_update_dataset__name_db_changes(self):
+        session = db.session
+        tmp_db1 = Database(database_name="tmp_db1", sqlalchemy_uri="sqlite://")
+        tmp_db2 = Database(database_name="tmp_db2", sqlalchemy_uri="sqlite://")
+        session.add(tmp_db1)
+        session.add(tmp_db2)
+        session.commit()
+
+        table1 = SqlaTable(
+            schema="tmp_schema",
+            table_name="tmp_table1",
+            database=tmp_db1,
+        )
+        session.add(table1)
+        session.commit()
+
+        slice1 = Slice(
+            datasource_id=table1.id,
+            datasource_type=DatasourceType.TABLE,
+            datasource_name="tmp_table1",
+            slice_name="tmp_slice1",
+        )
+        session.add(slice1)
+        session.commit()
+
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        # refresh
+        table1 = session.query(SqlaTable).filter_by(table_name="tmp_table1").one()
+        # Test update
+        table1.table_name = "tmp_table1_changed"
+        table1.database = tmp_db2
+        session.commit()
+
+        # Test old permission does not exist
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db1].[tmp_table1](id:{table1.id})"
+        )
+        self.assertIsNone(table1_pvm)
+
+        # Test new permission exists
+        table1_pvm = security_manager.find_permission_view_menu(
+            "datasource_access", f"[tmp_db2].[tmp_table1_changed](id:{table1.id})"
+        )
+        self.assertIsNotNone(table1_pvm)
+
+        # Test dataset permission and schema permission changed
+        changed_table1 = (
+            session.query(SqlaTable).filter_by(table_name="tmp_table1_changed").one()
+        )
+        self.assertEqual(
+            changed_table1.perm, f"[tmp_db2].[tmp_table1_changed](id:{table1.id})"
+        )
+        self.assertEqual(changed_table1.schema_perm, f"[tmp_db2].[tmp_schema]")
+
+        # Test Chart permission changed
+        slice1 = session.query(Slice).filter_by(slice_name="tmp_slice1").one()
+        self.assertEqual(slice1.perm, f"[tmp_db2].[tmp_table1_changed](id:{table1.id})")
+        self.assertEqual(slice1.schema_perm, f"[tmp_db2].[tmp_schema]")
+
+        # cleanup
+        session.delete(slice1)
+        session.delete(table1)
+        session.delete(tmp_db1)
+        session.delete(tmp_db2)
+        session.commit()
+
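The assertions in the tests above all revolve around Superset's two permission-name formats. A minimal sketch of the strings being checked (helper names are illustrative, not Superset APIs):

```python
from typing import Optional

def dataset_perm(database_name: str, table_name: str, dataset_id: int) -> str:
    # e.g. "[tmp_db1].[tmp_table1](id:42)" -- the string looked up via
    # security_manager.find_permission_view_menu("datasource_access", ...)
    return f"[{database_name}].[{table_name}](id:{dataset_id})"

def schema_perm(database_name: str, schema: Optional[str]) -> Optional[str]:
    # e.g. "[tmp_db1].[tmp_schema]"; None when the dataset has no schema,
    # matching test_after_update_dataset__schema_none above
    return f"[{database_name}].[{schema}]" if schema else None
```

This is why renaming a table or moving it to another database must rewrite the perm, while only a schema change rewrites schema_perm.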
     def test_hybrid_perm_database(self):
         database = Database(database_name="tmp_database3", sqlalchemy_uri="sqlite://")
 
@@ -562,23 +1156,14 @@ class TestRolePermission(SupersetTestCase):
         table.schema = "tmp_perm_schema"
         table.table_name = "tmp_perm_table_v2"
         session.commit()
-        # TODO(bogdan): modify slice permissions on the table update.
-        self.assertNotEqual(slice.perm, table.perm)
-        self.assertEqual(slice.perm, f"[tmp_database].[tmp_perm_table](id:{table.id})")
-        self.assertEqual(
-            table.perm, f"[tmp_database].[tmp_perm_table_v2](id:{table.id})"
-        )
-        # TODO(bogdan): modify slice schema permissions on the table update.
-        self.assertNotEqual(slice.schema_perm, table.schema_perm)
-        self.assertIsNone(slice.schema_perm)
-
-        # updating slice refreshes the permissions
-        slice.slice_name = "slice_name_v2"
-        session.commit()
+        table = session.query(SqlaTable).filter_by(table_name="tmp_perm_table_v2").one()
         self.assertEqual(slice.perm, table.perm)
         self.assertEqual(
             slice.perm, f"[tmp_database].[tmp_perm_table_v2](id:{table.id})"
         )
+        self.assertEqual(
+            table.perm, f"[tmp_database].[tmp_perm_table_v2](id:{table.id})"
+        )
         self.assertEqual(slice.schema_perm, table.schema_perm)
         self.assertEqual(slice.schema_perm, "[tmp_database].[tmp_perm_schema]")
 
@@ -588,8 +1173,7 @@ class TestRolePermission(SupersetTestCase):
 
         session.commit()
 
-        # TODO test slice permission
-
+    @patch("superset.utils.core.g")
     @patch("superset.security.manager.g")
     def test_schemas_accessible_by_user_admin(self, mock_g):
         mock_g.user = security_manager.find_user("admin")


[superset] 16/29: fix(celery cache warmup): add auth and use warm_up_cache endpoint (#21076)

Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

elizabeth pushed a commit to branch 2.0-test
in repository https://gitbox.apache.org/repos/asf/superset.git

commit 4298690e169063d5421b4d193d4fda96bc62e472
Author: ʈᵃᵢ <td...@gmail.com>
AuthorDate: Tue Aug 30 09:24:24 2022 -0700

    fix(celery cache warmup): add auth and use warm_up_cache endpoint (#21076)
    
    (cherry picked from commit 04dd8d414db6a3cddcd073ad74acb2a4b7a53b0b)
---
 docker/pythonpath_dev/superset_config.py        |  10 ++
 superset/tasks/cache.py                         |  98 ++++++++--------
 tests/integration_tests/strategy_tests.py       | 141 +++---------------------
 tests/integration_tests/superset_test_config.py |   2 +
 4 files changed, 70 insertions(+), 181 deletions(-)

diff --git a/docker/pythonpath_dev/superset_config.py b/docker/pythonpath_dev/superset_config.py
index 794239d23f..84c1dc58ab 100644
--- a/docker/pythonpath_dev/superset_config.py
+++ b/docker/pythonpath_dev/superset_config.py
@@ -69,6 +69,16 @@ REDIS_RESULTS_DB = get_env_variable("REDIS_RESULTS_DB", "1")
 
 RESULTS_BACKEND = FileSystemCache("/app/superset_home/sqllab")
 
+CACHE_CONFIG = {
+    "CACHE_TYPE": "redis",
+    "CACHE_DEFAULT_TIMEOUT": 300,
+    "CACHE_KEY_PREFIX": "superset_",
+    "CACHE_REDIS_HOST": REDIS_HOST,
+    "CACHE_REDIS_PORT": REDIS_PORT,
+    "CACHE_REDIS_DB": REDIS_RESULTS_DB,
+}
+DATA_CACHE_CONFIG = CACHE_CONFIG
+
 
 class CeleryConfig(object):
     BROKER_URL = f"redis://{REDIS_HOST}:{REDIS_PORT}/{REDIS_CELERY_DB}"
diff --git a/superset/tasks/cache.py b/superset/tasks/cache.py
index ee73df5fde..137ec068e8 100644
--- a/superset/tasks/cache.py
+++ b/superset/tasks/cache.py
@@ -14,73 +14,36 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-import json
 import logging
 from typing import Any, Dict, List, Optional, Union
 from urllib import request
 from urllib.error import URLError
 
+from celery.beat import SchedulingError
 from celery.utils.log import get_task_logger
 from sqlalchemy import and_, func
 
-from superset import app, db
+from superset import app, db, security_manager
 from superset.extensions import celery_app
 from superset.models.core import Log
 from superset.models.dashboard import Dashboard
 from superset.models.slice import Slice
 from superset.models.tags import Tag, TaggedObject
 from superset.utils.date_parser import parse_human_datetime
-from superset.views.utils import build_extra_filters
+from superset.utils.machine_auth import MachineAuthProvider
 
 logger = get_task_logger(__name__)
 logger.setLevel(logging.INFO)
 
 
-def get_form_data(
-    chart_id: int, dashboard: Optional[Dashboard] = None
-) -> Dict[str, Any]:
-    """
-    Build `form_data` for chart GET request from dashboard's `default_filters`.
-
-    When a dashboard has `default_filters` they need to be added  as extra
-    filters in the GET request for charts.
-
-    """
-    form_data: Dict[str, Any] = {"slice_id": chart_id}
-
-    if dashboard is None or not dashboard.json_metadata:
-        return form_data
-
-    json_metadata = json.loads(dashboard.json_metadata)
-    default_filters = json.loads(json_metadata.get("default_filters", "null"))
-    if not default_filters:
-        return form_data
-
-    filter_scopes = json_metadata.get("filter_scopes", {})
-    layout = json.loads(dashboard.position_json or "{}")
-    if (
-        isinstance(layout, dict)
-        and isinstance(filter_scopes, dict)
-        and isinstance(default_filters, dict)
-    ):
-        extra_filters = build_extra_filters(
-            layout, filter_scopes, default_filters, chart_id
-        )
-        if extra_filters:
-            form_data["extra_filters"] = extra_filters
-
-    return form_data
-
-
-def get_url(chart: Slice, extra_filters: Optional[Dict[str, Any]] = None) -> str:
+def get_url(chart: Slice, dashboard: Optional[Dashboard] = None) -> str:
     """Return external URL for warming up a given chart/table cache."""
     with app.test_request_context():
-        baseurl = (
-            "{SUPERSET_WEBSERVER_PROTOCOL}://"
-            "{SUPERSET_WEBSERVER_ADDRESS}:"
-            "{SUPERSET_WEBSERVER_PORT}".format(**app.config)
-        )
-        return f"{baseurl}{chart.get_explore_url(overrides=extra_filters)}"
+        baseurl = "{WEBDRIVER_BASEURL}".format(**app.config)
+        url = f"{baseurl}superset/warm_up_cache/?slice_id={chart.id}"
+        if dashboard:
+            url += f"&dashboard_id={dashboard.id}"
+        return url
 
 
 class Strategy:  # pylint: disable=too-few-public-methods
@@ -179,8 +142,7 @@ class TopNDashboardsStrategy(Strategy):  # pylint: disable=too-few-public-method
         dashboards = session.query(Dashboard).filter(Dashboard.id.in_(dash_ids)).all()
         for dashboard in dashboards:
             for chart in dashboard.slices:
-                form_data_with_filters = get_form_data(chart.id, dashboard)
-                urls.append(get_url(chart, form_data_with_filters))
+                urls.append(get_url(chart, dashboard))
 
         return urls
 
@@ -253,6 +215,30 @@ class DashboardTagsStrategy(Strategy):  # pylint: disable=too-few-public-methods
 strategies = [DummyStrategy, TopNDashboardsStrategy, DashboardTagsStrategy]
 
 
+@celery_app.task(name="fetch_url")
+def fetch_url(url: str, headers: Dict[str, str]) -> Dict[str, str]:
+    """
+    Celery job to fetch url
+    """
+    result = {}
+    try:
+        logger.info("Fetching %s", url)
+        req = request.Request(url, headers=headers)
+        response = request.urlopen(  # pylint: disable=consider-using-with
+            req, timeout=600
+        )
+        logger.info("Fetched %s, status code: %s", url, response.code)
+        if response.code == 200:
+            result = {"success": url, "response": response.read().decode("utf-8")}
+        else:
+            result = {"error": url, "status_code": response.code}
+            logger.error("Error fetching %s, status code: %s", url, response.code)
+    except URLError as err:
+        logger.exception("Error warming up cache!")
+        result = {"error": url, "exception": str(err)}
+    return result
+
+
 @celery_app.task(name="cache-warmup")
 def cache_warmup(
     strategy_name: str, *args: Any, **kwargs: Any
@@ -282,14 +268,18 @@ def cache_warmup(
         logger.exception(message)
         return message
 
-    results: Dict[str, List[str]] = {"success": [], "errors": []}
+    user = security_manager.get_user_by_username(app.config["THUMBNAIL_SELENIUM_USER"])
+    cookies = MachineAuthProvider.get_auth_cookies(user)
+    headers = {"Cookie": f"session={cookies.get('session', '')}"}
+
+    results: Dict[str, List[str]] = {"scheduled": [], "errors": []}
     for url in strategy.get_urls():
         try:
-            logger.info("Fetching %s", url)
-            request.urlopen(url)  # pylint: disable=consider-using-with
-            results["success"].append(url)
-        except URLError:
-            logger.exception("Error warming up cache!")
+            logger.info("Scheduling %s", url)
+            fetch_url.delay(url, headers)
+            results["scheduled"].append(url)
+        except SchedulingError:
+            logger.exception("Error scheduling fetch_url: %s", url)
             results["errors"].append(url)
 
     return results
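The hunks above replace a synchronous `urlopen` loop with per-URL `fetch_url` Celery tasks that carry the warm-up user's session cookie. A sketch of the two pure pieces of that flow, URL and header construction (helper names are illustrative, not Superset APIs):

```python
from typing import Dict, Optional

def build_warmup_url(base_url: str, slice_id: int,
                     dashboard_id: Optional[int] = None) -> str:
    # Mirrors get_url(): target the warm_up_cache endpoint instead of /explore
    url = f"{base_url}superset/warm_up_cache/?slice_id={slice_id}"
    if dashboard_id is not None:
        url += f"&dashboard_id={dashboard_id}"
    return url

def build_auth_headers(cookies: Dict[str, str]) -> Dict[str, str]:
    # Mirrors cache_warmup(): forward the session cookie so the endpoint
    # sees an authenticated user rather than an anonymous request
    return {"Cookie": f"session={cookies.get('session', '')}"}
```

Each URL is then handed to `fetch_url.delay(url, headers)`, so a slow chart no longer blocks the rest of the warm-up run.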
diff --git a/tests/integration_tests/strategy_tests.py b/tests/integration_tests/strategy_tests.py
index aec73b1efe..f31489bb04 100644
--- a/tests/integration_tests/strategy_tests.py
+++ b/tests/integration_tests/strategy_tests.py
@@ -38,9 +38,9 @@ from superset.models.core import Log
 from superset.models.tags import get_tag, ObjectTypes, TaggedObject, TagTypes
 from superset.tasks.cache import (
     DashboardTagsStrategy,
-    get_form_data,
     TopNDashboardsStrategy,
 )
+from superset.utils.urls import get_url_host
 
 from .base_tests import SupersetTestCase
 from .dashboard_utils import create_dashboard, create_slice, create_table_metadata
@@ -49,7 +49,6 @@ from .fixtures.unicode_dashboard import (
     load_unicode_data,
 )
 
-URL_PREFIX = "http://0.0.0.0:8081"
 
 mock_positions = {
     "DASHBOARD_VERSION_KEY": "v2",
@@ -69,128 +68,6 @@ mock_positions = {
 
 
 class TestCacheWarmUp(SupersetTestCase):
-    def test_get_form_data_chart_only(self):
-        chart_id = 1
-        result = get_form_data(chart_id, None)
-        expected = {"slice_id": chart_id}
-        self.assertEqual(result, expected)
-
-    def test_get_form_data_no_dashboard_metadata(self):
-        chart_id = 1
-        dashboard = MagicMock()
-        dashboard.json_metadata = None
-        dashboard.position_json = json.dumps(mock_positions)
-        result = get_form_data(chart_id, dashboard)
-        expected = {"slice_id": chart_id}
-        self.assertEqual(result, expected)
-
-    def test_get_form_data_immune_slice(self):
-        chart_id = 1
-        filter_box_id = 2
-        dashboard = MagicMock()
-        dashboard.position_json = json.dumps(mock_positions)
-        dashboard.json_metadata = json.dumps(
-            {
-                "filter_scopes": {
-                    str(filter_box_id): {
-                        "name": {"scope": ["ROOT_ID"], "immune": [chart_id]}
-                    }
-                },
-                "default_filters": json.dumps(
-                    {str(filter_box_id): {"name": ["Alice", "Bob"]}}
-                ),
-            }
-        )
-        result = get_form_data(chart_id, dashboard)
-        expected = {"slice_id": chart_id}
-        self.assertEqual(result, expected)
-
-    def test_get_form_data_no_default_filters(self):
-        chart_id = 1
-        dashboard = MagicMock()
-        dashboard.json_metadata = json.dumps({})
-        dashboard.position_json = json.dumps(mock_positions)
-        result = get_form_data(chart_id, dashboard)
-        expected = {"slice_id": chart_id}
-        self.assertEqual(result, expected)
-
-    def test_get_form_data_immune_fields(self):
-        chart_id = 1
-        filter_box_id = 2
-        dashboard = MagicMock()
-        dashboard.position_json = json.dumps(mock_positions)
-        dashboard.json_metadata = json.dumps(
-            {
-                "default_filters": json.dumps(
-                    {
-                        str(filter_box_id): {
-                            "name": ["Alice", "Bob"],
-                            "__time_range": "100 years ago : today",
-                        }
-                    }
-                ),
-                "filter_scopes": {
-                    str(filter_box_id): {
-                        "__time_range": {"scope": ["ROOT_ID"], "immune": [chart_id]}
-                    }
-                },
-            }
-        )
-        result = get_form_data(chart_id, dashboard)
-        expected = {
-            "slice_id": chart_id,
-            "extra_filters": [{"col": "name", "op": "in", "val": ["Alice", "Bob"]}],
-        }
-        self.assertEqual(result, expected)
-
-    def test_get_form_data_no_extra_filters(self):
-        chart_id = 1
-        filter_box_id = 2
-        dashboard = MagicMock()
-        dashboard.position_json = json.dumps(mock_positions)
-        dashboard.json_metadata = json.dumps(
-            {
-                "default_filters": json.dumps(
-                    {str(filter_box_id): {"__time_range": "100 years ago : today"}}
-                ),
-                "filter_scopes": {
-                    str(filter_box_id): {
-                        "__time_range": {"scope": ["ROOT_ID"], "immune": [chart_id]}
-                    }
-                },
-            }
-        )
-        result = get_form_data(chart_id, dashboard)
-        expected = {"slice_id": chart_id}
-        self.assertEqual(result, expected)
-
-    def test_get_form_data(self):
-        chart_id = 1
-        filter_box_id = 2
-        dashboard = MagicMock()
-        dashboard.position_json = json.dumps(mock_positions)
-        dashboard.json_metadata = json.dumps(
-            {
-                "default_filters": json.dumps(
-                    {
-                        str(filter_box_id): {
-                            "name": ["Alice", "Bob"],
-                            "__time_range": "100 years ago : today",
-                        }
-                    }
-                )
-            }
-        )
-        result = get_form_data(chart_id, dashboard)
-        expected = {
-            "slice_id": chart_id,
-            "extra_filters": [
-                {"col": "name", "op": "in", "val": ["Alice", "Bob"]},
-                {"col": "__time_range", "op": "==", "val": "100 years ago : today"},
-            ],
-        }
-        self.assertEqual(result, expected)
-
     @pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
     def test_top_n_dashboards_strategy(self):
         # create a top visited dashboard
@@ -202,7 +79,12 @@ class TestCacheWarmUp(SupersetTestCase):
 
         strategy = TopNDashboardsStrategy(1)
         result = sorted(strategy.get_urls())
-        expected = sorted([f"{URL_PREFIX}{slc.url}" for slc in dash.slices])
+        expected = sorted(
+            [
+                f"{get_url_host()}superset/warm_up_cache/?slice_id={slc.id}&dashboard_id={dash.id}"
+                for slc in dash.slices
+            ]
+        )
         self.assertEqual(result, expected)
 
     def reset_tag(self, tag):
@@ -228,7 +110,12 @@ class TestCacheWarmUp(SupersetTestCase):
         # tag dashboard 'births' with `tag1`
         tag1 = get_tag("tag1", db.session, TagTypes.custom)
         dash = self.get_dash_by_slug("births")
-        tag1_urls = sorted([f"{URL_PREFIX}{slc.url}" for slc in dash.slices])
+        tag1_urls = sorted(
+            [
+                f"{get_url_host()}superset/warm_up_cache/?slice_id={slc.id}"
+                for slc in dash.slices
+            ]
+        )
         tagged_object = TaggedObject(
             tag_id=tag1.id, object_id=dash.id, object_type=ObjectTypes.dashboard
         )
@@ -248,7 +135,7 @@ class TestCacheWarmUp(SupersetTestCase):
         # tag first slice
         dash = self.get_dash_by_slug("unicode-test")
         slc = dash.slices[0]
-        tag2_urls = [f"{URL_PREFIX}{slc.url}"]
+        tag2_urls = [f"{get_url_host()}superset/warm_up_cache/?slice_id={slc.id}"]
         object_id = slc.id
         tagged_object = TaggedObject(
             tag_id=tag2.id, object_id=object_id, object_type=ObjectTypes.chart
diff --git a/tests/integration_tests/superset_test_config.py b/tests/integration_tests/superset_test_config.py
index 2907f4ceb8..10d81a2cf1 100644
--- a/tests/integration_tests/superset_test_config.py
+++ b/tests/integration_tests/superset_test_config.py
@@ -65,6 +65,8 @@ FEATURE_FLAGS = {
     "DASHBOARD_NATIVE_FILTERS": True,
 }
 
+WEBDRIVER_BASEURL = "http://0.0.0.0:8081/"
+
 
 def GET_FEATURE_FLAGS_FUNC(ff):
     ff_copy = copy(ff)


[superset] 19/29: fix(explore): Time column label not formatted when GENERIC_X_AXES enabled (#21294)


commit 551f3064795b9635dbeaed9c12b018350817d737
Author: Kamil Gabryjelski <ka...@gmail.com>
AuthorDate: Thu Sep 1 18:07:49 2022 +0200

    fix(explore): Time column label not formatted when GENERIC_X_AXES enabled (#21294)
    
    (cherry picked from commit c3a00d43d055224d4a31ea9315934a59b556eea7)
---
 .../plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts  | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts
index 62ed57268f..663131dd97 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/MixedTimeseries/transformProps.ts
@@ -155,7 +155,7 @@ export default function transformProps(
   });
 
   const dataTypes = getColtypesMapping(queriesData[0]);
-  const xAxisDataType = dataTypes?.[xAxisCol];
+  const xAxisDataType = dataTypes?.[xAxisCol] ?? dataTypes?.[xAxisOrig];
   const xAxisType = getAxisType(xAxisDataType);
   const series: SeriesOption[] = [];
   const formatter = getNumberFormatter(contributionMode ? ',.0%' : yAxisFormat);
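The one-line fix above falls back to the original column name when the custom axis label is missing from the column-type map. The same fallback lookup in Python terms (sample values are illustrative):

```python
# Equivalent of `dataTypes?.[xAxisCol] ?? dataTypes?.[xAxisOrig]`:
# try the (possibly custom-labelled) axis column first, then fall back
# to the original column name.
col_types = {"__timestamp": "TIMESTAMP"}  # from getColtypesMapping
x_axis_col = "Custom label"               # custom label, absent from the map
x_axis_orig = "__timestamp"               # underlying column name

x_axis_type = col_types.get(x_axis_col, col_types.get(x_axis_orig))
```

Without the fallback the type resolved to undefined and the axis lost its temporal formatting.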


[superset] 13/29: fix(native filters): groupby filter issue (#21084)


commit 05d7c3d74d649c27334a23312c67dcd58a33664b
Author: stevetracvc <70...@users.noreply.github.com>
AuthorDate: Mon Aug 22 08:15:33 2022 -0600

    fix(native filters): groupby filter issue (#21084)
    
    (cherry picked from commit d79b0bfc744885f6e6f0b5e9a4128c63c1dea58d)
---
 .../src/filters/components/GroupBy/GroupByFilterPlugin.tsx              | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/superset-frontend/src/filters/components/GroupBy/GroupByFilterPlugin.tsx b/superset-frontend/src/filters/components/GroupBy/GroupByFilterPlugin.tsx
index 9742e27e59..091232a5ec 100644
--- a/superset-frontend/src/filters/components/GroupBy/GroupByFilterPlugin.tsx
+++ b/superset-frontend/src/filters/components/GroupBy/GroupByFilterPlugin.tsx
@@ -71,7 +71,7 @@ export default function PluginFilterGroupBy(props: PluginFilterGroupByProps) {
   }, [JSON.stringify(defaultValue), multiSelect]);
 
   const groupbys = ensureIsArray(formData.groupby).map(getColumnLabel);
-  const groupby = groupbys[0].length ? groupbys[0] : null;
+  const groupby = groupbys[0]?.length ? groupbys[0] : null;
 
   const withData = groupby
     ? data.filter(row => groupby.includes(row.column_name as string))
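The one-character fix above (`groupbys[0]?.length`) guards against `formData.groupby` being empty, where `groupbys[0]` is `undefined` and `.length` would throw. The same guard sketched in Python (function name is illustrative):

```python
def first_groupby(groupbys):
    # Mirrors `groupbys[0]?.length ? groupbys[0] : null`: an empty list or a
    # falsy first entry yields None instead of raising IndexError
    return groupbys[0] if groupbys and groupbys[0] else None
```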


[superset] 15/29: fix(database-list): hidden upload file button if no permission (#21216)


commit 32736680dac054f6b464ff51e9e29b5a974eb41d
Author: Stephen Liu <75...@qq.com>
AuthorDate: Tue Aug 30 10:34:28 2022 +0800

    fix(database-list): hidden upload file button if no permission (#21216)
    
    (cherry picked from commit 0c43190e04edc182f8787cc88d9a6fcf7f86a9f7)
---
 .../views/CRUD/data/database/DatabaseList.test.jsx | 28 +++++++++++++++++++++-
 .../src/views/CRUD/data/database/DatabaseList.tsx  | 11 +++++----
 2 files changed, 34 insertions(+), 5 deletions(-)

diff --git a/superset-frontend/src/views/CRUD/data/database/DatabaseList.test.jsx b/superset-frontend/src/views/CRUD/data/database/DatabaseList.test.jsx
index 5fe6ead7fd..1cc66ef9e3 100644
--- a/superset-frontend/src/views/CRUD/data/database/DatabaseList.test.jsx
+++ b/superset-frontend/src/views/CRUD/data/database/DatabaseList.test.jsx
@@ -96,7 +96,7 @@ fetchMock.get(
 const useSelectorMock = jest.spyOn(reactRedux, 'useSelector');
 const userSelectorMock = jest.spyOn(reactRedux, 'useSelector');
 
-describe('DatabaseList', () => {
+describe('Admin DatabaseList', () => {
   useSelectorMock.mockReturnValue({
     CSV_EXTENSIONS: ['csv'],
     EXCEL_EXTENSIONS: ['xls', 'xlsx'],
@@ -212,4 +212,30 @@ describe('DatabaseList', () => {
       `"http://localhost/api/v1/database/?q=(filters:!((col:expose_in_sqllab,opr:eq,value:!t),(col:allow_run_async,opr:eq,value:!f),(col:database_name,opr:ct,value:fooo)),order_column:changed_on_delta_humanized,order_direction:desc,page:0,page_size:25)"`,
     );
   });
+
+  it('should not render dropdown menu button if user is not admin', () => {
+    userSelectorMock.mockReturnValue({
+      createdOn: '2021-05-27T18:12:38.952304',
+      email: 'alpha@gmail.com',
+      firstName: 'alpha',
+      isActive: true,
+      lastName: 'alpha',
+      permissions: {},
+      roles: {
+        Alpha: [
+          ['can_sqllab', 'Superset'],
+          ['can_write', 'Dashboard'],
+          ['can_write', 'Chart'],
+        ],
+      },
+      userId: 2,
+      username: 'alpha',
+    });
+    const newWrapper = mount(
+      <Provider store={store}>
+        <DatabaseList />
+      </Provider>,
+    );
+    expect(newWrapper.find('.dropdown-menu-links')).not.toExist();
+  });
 });
diff --git a/superset-frontend/src/views/CRUD/data/database/DatabaseList.tsx b/superset-frontend/src/views/CRUD/data/database/DatabaseList.tsx
index b9c5b4a846..dc9dc1a279 100644
--- a/superset-frontend/src/views/CRUD/data/database/DatabaseList.tsx
+++ b/superset-frontend/src/views/CRUD/data/database/DatabaseList.tsx
@@ -37,6 +37,7 @@ import { commonMenuData } from 'src/views/CRUD/data/common';
 import handleResourceExport from 'src/utils/export';
 import { ExtentionConfigs } from 'src/views/components/types';
 import { UserWithPermissionsAndRoles } from 'src/types/bootstrapTypes';
+import type { MenuObjectProps } from 'src/views/components/Menu';
 import DatabaseModal from './DatabaseModal';
 
 import { DatabaseObject } from './types';
@@ -230,11 +231,13 @@ function DatabaseList({ addDangerToast, addSuccessToast }: DatabaseListProps) {
 
   useEffect(() => hasFileUploadEnabled(), [databaseModalOpen]);
 
-  const filteredDropDown = uploadDropdownMenu.map(link => {
+  const filteredDropDown = uploadDropdownMenu.reduce((prev, cur) => {
     // eslint-disable-next-line no-param-reassign
-    link.childs = link.childs.filter(item => item.perm);
-    return link;
-  });
+    cur.childs = cur.childs.filter(item => item.perm);
+    if (!cur.childs.length) return prev;
+    prev.push(cur);
+    return prev;
+  }, [] as MenuObjectProps[]);
 
   const menuData: SubMenuProps = {
     activeChild: 'Databases',
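The `map` → `reduce` change above matters because `map` always returns one output per input: a menu entry whose children were all filtered out by permission still rendered an empty dropdown. A sketch of the same filtering in Python (the menu shape is illustrative):

```python
def filter_upload_menu(menu):
    # Drop entries with no permitted children, like the reduce() in
    # DatabaseList.tsx, so no empty dropdown is rendered
    filtered = []
    for entry in menu:
        childs = [c for c in entry["childs"] if c.get("perm")]
        if childs:
            filtered.append({**entry, "childs": childs})
    return filtered
```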


[superset] 09/29: fix(plugin-chart-echarts): gauge chart enhancements and fixes (#21007)


commit 884e2f1ca7f5f852766a480d89840e0bc455de07
Author: Stephen Liu <75...@qq.com>
AuthorDate: Tue Aug 16 21:54:17 2022 +0800

    fix(plugin-chart-echarts): gauge chart enhancements and fixes (#21007)
    
    * fix(plugin-chart-echarts): gauge chart enhancements and fixes
    
    * fix lint
    
    (cherry picked from commit b303d1e156185d134927246004a4804931cd6bca)
---
 .../src/Gauge/controlPanel.tsx                     |  6 +-
 .../src/Gauge/transformProps.ts                    | 79 +++++++++++++++-------
 .../plugin-chart-echarts/src/Gauge/types.ts        |  8 +--
 3 files changed, 60 insertions(+), 33 deletions(-)

diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/controlPanel.tsx b/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/controlPanel.tsx
index bb727888e5..28d358da99 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/controlPanel.tsx
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/controlPanel.tsx
@@ -17,7 +17,7 @@
  * under the License.
  */
 import React from 'react';
-import { t, validateNonEmpty, validateInteger } from '@superset-ui/core';
+import { t } from '@superset-ui/core';
 import {
   sharedControls,
   ControlPanelConfig,
@@ -81,8 +81,7 @@ const config: ControlPanelConfig = {
             config: {
               type: 'TextControl',
               isInt: true,
-              default: String(DEFAULT_FORM_DATA.minVal),
-              validators: [validateNonEmpty, validateInteger],
+              default: DEFAULT_FORM_DATA.minVal,
               renderTrigger: true,
               label: t('Min'),
               description: t('Minimum value on the gauge axis'),
@@ -94,7 +93,6 @@ const config: ControlPanelConfig = {
               type: 'TextControl',
               isInt: true,
               default: DEFAULT_FORM_DATA.maxVal,
-              validators: [validateNonEmpty, validateInteger],
               renderTrigger: true,
               label: t('Max'),
               description: t('Maximum value on the gauge axis'),
diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/transformProps.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/transformProps.ts
index 899417a639..996164222f 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/transformProps.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/transformProps.ts
@@ -28,6 +28,7 @@ import {
 } from '@superset-ui/core';
 import { EChartsCoreOption, GaugeSeriesOption } from 'echarts';
 import { GaugeDataItemOption } from 'echarts/types/src/chart/gauge/GaugeSeries';
+import { CallbackDataParams } from 'echarts/types/src/util/types';
 import range from 'lodash/range';
 import { parseNumbersList } from '../utils/controls';
 import {
@@ -80,6 +81,12 @@ const calculateAxisLineWidth = (
   overlap: boolean,
 ): number => (overlap ? fontSize : data.length * fontSize);
 
+const calculateMin = (data: GaugeDataItemOption[]) =>
+  2 * Math.min(...data.map(d => d.value as number).concat([0]));
+
+const calculateMax = (data: GaugeDataItemOption[]) =>
+  2 * Math.max(...data.map(d => d.value as number).concat([0]));
+
 export default function transformProps(
   chartProps: EchartsGaugeChartProps,
 ): GaugeChartTransformedProps {
@@ -115,12 +122,7 @@ export default function transformProps(
   const data = (queriesData[0]?.data || []) as DataRecord[];
   const numberFormatter = getNumberFormatter(numberFormat);
   const colorFn = CategoricalColorNamespace.getScale(colorScheme as string);
-  const normalizer = maxVal;
   const axisLineWidth = calculateAxisLineWidth(data, fontSize, overlap);
-  const axisLabels = range(minVal, maxVal, (maxVal - minVal) / splitNumber);
-  const axisLabelLength = Math.max(
-    ...axisLabels.map(label => numberFormatter(label).length).concat([1]),
-  );
   const groupbyLabels = groupby.map(getColumnLabel);
   const formatValue = (value: number) =>
     valueFormatter.replace('{value}', numberFormatter(value));
@@ -130,12 +132,6 @@ export default function transformProps(
     FONT_SIZE_MULTIPLIERS.titleOffsetFromTitle * fontSize;
   const detailOffsetFromTitle =
     FONT_SIZE_MULTIPLIERS.detailOffsetFromTitle * fontSize;
-  const intervalBoundsAndColors = setIntervalBoundsAndColors(
-    intervals,
-    intervalColorIndices,
-    colorFn,
-    normalizer,
-  );
   const columnsLabelMap = new Map<string, DataRecordValue[]>();
 
   const transformedData: GaugeDataItemOption[] = data.map(
@@ -196,6 +192,33 @@ export default function transformProps(
 
   const { setDataMask = () => {} } = hooks;
 
+  const min = minVal ?? calculateMin(transformedData);
+  const max = maxVal ?? calculateMax(transformedData);
+  const axisLabels = range(min, max, (max - min) / splitNumber);
+  const axisLabelLength = Math.max(
+    ...axisLabels.map(label => numberFormatter(label).length).concat([1]),
+  );
+  const normalizer = max;
+  const intervalBoundsAndColors = setIntervalBoundsAndColors(
+    intervals,
+    intervalColorIndices,
+    colorFn,
+    normalizer,
+  );
+  const splitLineDistance =
+    axisLineWidth + splitLineLength + OFFSETS.ticksFromLine;
+  const axisLabelDistance =
+    FONT_SIZE_MULTIPLIERS.axisLabelDistance *
+      fontSize *
+      FONT_SIZE_MULTIPLIERS.axisLabelLength *
+      axisLabelLength +
+    (showSplitLine ? splitLineLength : 0) +
+    (showAxisTick ? axisTickLength : 0) +
+    OFFSETS.ticksFromLine -
+    axisLineWidth;
+  const axisTickDistance =
+    axisLineWidth + axisTickLength + OFFSETS.ticksFromLine;
+
   const progress = {
     show: showProgress,
     overlap,
@@ -204,7 +227,7 @@ export default function transformProps(
   };
   const splitLine = {
     show: showSplitLine,
-    distance: -axisLineWidth - splitLineLength - OFFSETS.ticksFromLine,
+    distance: -splitLineDistance,
     length: splitLineLength,
     lineStyle: {
       width: FONT_SIZE_MULTIPLIERS.splitLineWidth * fontSize,
@@ -219,22 +242,14 @@ export default function transformProps(
     },
   };
   const axisLabel = {
-    distance:
-      axisLineWidth -
-      FONT_SIZE_MULTIPLIERS.axisLabelDistance *
-        fontSize *
-        FONT_SIZE_MULTIPLIERS.axisLabelLength *
-        axisLabelLength -
-      (showSplitLine ? splitLineLength : 0) -
-      (showAxisTick ? axisTickLength : 0) -
-      OFFSETS.ticksFromLine,
+    distance: -axisLabelDistance,
     fontSize,
     formatter: numberFormatter,
     color: gaugeSeriesOptions.axisLabel?.color,
   };
   const axisTick = {
     show: showAxisTick,
-    distance: -axisLineWidth - axisTickLength - OFFSETS.ticksFromLine,
+    distance: -axisTickDistance,
     length: axisTickLength,
     lineStyle: gaugeSeriesOptions.axisTick?.lineStyle as AxisTickLineStyle,
   };
@@ -243,8 +258,14 @@ export default function transformProps(
     formatter: (value: number) => formatValue(value),
     color: gaugeSeriesOptions.detail?.color,
   };
-  let pointer;
+  const tooltip = {
+    formatter: (params: CallbackDataParams) => {
+      const { name, value } = params;
+      return `${name} : ${formatValue(value as number)}`;
+    },
+  };
 
+  let pointer;
   if (intervalBoundsAndColors.length) {
     splitLine.lineStyle.color =
       INTERVAL_GAUGE_SERIES_OPTION.splitLine?.lineStyle?.color;
@@ -269,8 +290,8 @@ export default function transformProps(
       type: 'gauge',
       startAngle,
       endAngle,
-      min: minVal,
-      max: maxVal,
+      min,
+      max,
       progress,
       animation,
       axisLine: axisLine as GaugeSeriesOption['axisLine'],
@@ -280,11 +301,19 @@ export default function transformProps(
       axisTick,
       pointer,
       detail,
+      tooltip,
+      radius:
+        Math.min(width, height) / 2 - axisLabelDistance - axisTickDistance,
+      center: ['50%', '55%'],
       data: transformedData,
     },
   ];
 
   const echartOptions: EChartsCoreOption = {
+    tooltip: {
+      appendToBody: true,
+      trigger: 'item',
+    },
     series,
   };
 
diff --git a/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/types.ts b/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/types.ts
index f6a1b09ad6..d72bb283f9 100644
--- a/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/types.ts
+++ b/superset-frontend/plugins/plugin-chart-echarts/src/Gauge/types.ts
@@ -34,8 +34,8 @@ export type EchartsGaugeFormData = QueryFormData & {
   groupby: QueryFormColumn[];
   metric?: string;
   rowLimit: number;
-  minVal: number;
-  maxVal: number;
+  minVal: number | null;
+  maxVal: number | null;
   fontSize: number;
   numberFormat: string;
   animation: boolean;
@@ -58,8 +58,8 @@ export const DEFAULT_FORM_DATA: Partial<EchartsGaugeFormData> = {
   ...DEFAULT_LEGEND_FORM_DATA,
   groupby: [],
   rowLimit: 10,
-  minVal: 0,
-  maxVal: 100,
+  minVal: null,
+  maxVal: null,
   fontSize: 15,
   numberFormat: 'SMART_NUMBER',
   animation: true,
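
Editorial note: the gauge change above makes `minVal`/`maxVal` nullable and, when unset, derives the axis bounds from the data: twice the data minimum/maximum, with zero always included in the candidate set. A small Python sketch of that fallback (function names mirror the TypeScript helpers; the wrapper is hypothetical):

```python
# Sketch of the gauge axis-bound fallback introduced above: explicit Min/Max
# win; otherwise bounds are derived from the data, doubled, and forced to
# bracket zero.

def calculate_min(values):
    return 2 * min(list(values) + [0])

def calculate_max(values):
    return 2 * max(list(values) + [0])

def gauge_bounds(values, min_val=None, max_val=None):
    lo = min_val if min_val is not None else calculate_min(values)
    hi = max_val if max_val is not None else calculate_max(values)
    return lo, hi

print(gauge_bounds([30, 70]))          # data-derived: (0, 140)
print(gauge_bounds([-10, 40]))         # negative data widens the low end: (-20, 80)
print(gauge_bounds([30, 70], 0, 100))  # explicit bounds win: (0, 100)
```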


[superset] 21/29: fix: cached common bootstrap Revert (#21018) (#21419)


commit 3d3ea39eb9d3205573c8bd08c67c2e4ae5fc1e58
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Sat Sep 10 01:02:24 2022 +0100

    fix: cached common bootstrap Revert (#21018) (#21419)
    
    (cherry picked from commit 094400c308b7e16fbabc9c4287054c298ff95899)
---
 superset/views/base.py                | 8 +-------
 tests/integration_tests/core_tests.py | 4 +---
 2 files changed, 2 insertions(+), 10 deletions(-)

diff --git a/superset/views/base.py b/superset/views/base.py
index f5a8c30184..9460b8d1ae 100644
--- a/superset/views/base.py
+++ b/superset/views/base.py
@@ -71,7 +71,6 @@ from superset.exceptions import (
     SupersetException,
     SupersetSecurityException,
 )
-from superset.extensions import cache_manager
 from superset.models.helpers import ImportExportMixin
 from superset.models.reports import ReportRecipientType
 from superset.superset_typing import FlaskResponse
@@ -344,13 +343,8 @@ def menu_data() -> Dict[str, Any]:
     }
 
 
-@cache_manager.cache.memoize(timeout=60)
 def common_bootstrap_payload() -> Dict[str, Any]:
-    """Common data always sent to the client
-
-    The function is memoized as the return value only changes based
-    on configuration and feature flag values.
-    """
+    """Common data always sent to the client"""
     messages = get_flashed_messages(with_categories=True)
     locale = str(get_locale())
 
diff --git a/tests/integration_tests/core_tests.py b/tests/integration_tests/core_tests.py
index 6aa1eac0ec..58943246c5 100644
--- a/tests/integration_tests/core_tests.py
+++ b/tests/integration_tests/core_tests.py
@@ -62,7 +62,7 @@ from superset.connectors.sqla.models import SqlaTable
 from superset.db_engine_specs.base import BaseEngineSpec
 from superset.db_engine_specs.mssql import MssqlEngineSpec
 from superset.exceptions import SupersetException
-from superset.extensions import async_query_manager, cache_manager
+from superset.extensions import async_query_manager
 from superset.models import core as models
 from superset.models.annotations import Annotation, AnnotationLayer
 from superset.models.dashboard import Dashboard
@@ -1400,8 +1400,6 @@ class TestCore(SupersetTestCase):
         """
         Functions in feature flags don't break bootstrap data serialization.
         """
-        # feature flags are cached
-        cache_manager.cache.clear()
         self.login()
 
         encoded = json.dumps(
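
Editorial note: the revert above removes `@cache_manager.cache.memoize` from `common_bootstrap_payload` because the payload embeds per-request state (flashed messages, locale), so caching the return value can serve one user's state to another. A toy Python sketch of the failure mode, using `functools.lru_cache` as a stand-in for the real cache, not Flask-Caching itself:

```python
# Why the memoize decorator above had to be reverted: memoizing a function
# whose return value embeds per-request state freezes that state. The
# "flashed messages" list here is a toy stand-in for Flask's.

import functools

_flashes = []  # stand-in for per-request flashed messages

@functools.lru_cache(maxsize=None)
def cached_payload():
    return {"flash_messages": list(_flashes)}

def fresh_payload():
    return {"flash_messages": list(_flashes)}

_flashes.append("Welcome, user A!")
cached_payload()                      # first call populates the cache
_flashes[:] = ["Welcome, user B!"]    # a "different request" arrives
print(cached_payload())  # stale: still user A's message
print(fresh_payload())   # correct: user B's message
```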


[superset] 27/29: fix(explore): Prevent unnecessary series limit subquery (#21154)


commit 26963d371ca8fb5026ef778340e6f7eea3b26960
Author: Cody Leff <co...@preset.io>
AuthorDate: Fri Aug 26 10:16:13 2022 -0700

    fix(explore): Prevent unnecessary series limit subquery (#21154)
    
    * Prevent series limit when no series limit columns specified.
    
    * Add timeseries check for legacy charts.
    
    * Apply fix to helpers.py.
    
    * Skip Cypress color consistency tests.
---
 superset/connectors/sqla/models.py | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/superset/connectors/sqla/models.py b/superset/connectors/sqla/models.py
index 7427709465..f06b7fa0b0 100644
--- a/superset/connectors/sqla/models.py
+++ b/superset/connectors/sqla/models.py
@@ -1405,7 +1405,9 @@ class SqlaTable(Model, BaseDatasource):  # pylint: disable=too-many-public-metho
                         col=selected, template_processor=template_processor
                     )
                 groupby_all_columns[outer.name] = outer
-                if not series_column_names or outer.name in series_column_names:
+                if (
+                    is_timeseries and not series_column_names
+                ) or outer.name in series_column_names:
                     groupby_series_columns[outer.name] = outer
                 select_exprs.append(outer)
         elif columns:
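
Editorial note: the one-line predicate change above tightens when a groupby column counts as a series column. Previously, any query with no explicitly named series columns treated every groupby column as one, which forced a series-limit subquery even for non-timeseries charts; now that fallback applies only when `is_timeseries` is true. A Python sketch of the before/after predicate (function names are illustrative):

```python
# Sketch of the predicate change above, extracted into standalone functions.

def is_series_column_old(name, series_column_names):
    # Empty set of series columns meant "everything is a series column".
    return not series_column_names or name in series_column_names

def is_series_column_new(name, series_column_names, is_timeseries):
    # The empty-set fallback now applies only to timeseries queries.
    return (is_timeseries and not series_column_names) or name in series_column_names

# Non-timeseries query with no explicit series columns:
print(is_series_column_old("country", set()))         # True  -> subquery forced
print(is_series_column_new("country", set(), False))  # False -> no subquery
```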


[superset] 01/29: fix: logger message (#20714)


commit 56137ebbe5197c68511fdc6edd705cc79592e050
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Fri Jul 15 07:51:59 2022 -0700

    fix: logger message (#20714)
    
    (cherry picked from commit c70d102b73704b301d1d2902680cfbf1c0dda605)
---
 superset/db_engine_specs/__init__.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/superset/db_engine_specs/__init__.py b/superset/db_engine_specs/__init__.py
index 4474f2a748..dac7001995 100644
--- a/superset/db_engine_specs/__init__.py
+++ b/superset/db_engine_specs/__init__.py
@@ -71,7 +71,7 @@ def load_engine_specs() -> List[Type[BaseEngineSpec]]:
         try:
             engine_spec = ep.load()
         except Exception:  # pylint: disable=broad-except
-            logger.warning("Unable to load Superset DB engine spec: %s", engine_spec)
+            logger.warning("Unable to load Superset DB engine spec: %s", ep.name)
             continue
         engine_specs.append(engine_spec)
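
Editorial note: the fix above matters because when `ep.load()` raises, `engine_spec` was never assigned on this iteration: inside the `except` it is either unbound (first failure) or a stale value from an earlier iteration, so the warning named the wrong spec. Logging `ep.name` is always safe. A self-contained Python sketch with a fake entry-point class (the real code uses setuptools entry points):

```python
# Sketch of the bug fixed above: logging a variable that the failing call
# never assigned. Entry points are simulated with a tiny stand-in class.

import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)

class FakeEntryPoint:
    def __init__(self, name, loader):
        self.name = name
        self._loader = loader

    def load(self):
        return self._loader()

def _raise(exc):
    raise exc

def load_engine_specs(entry_points):
    engine_specs = []
    for ep in entry_points:
        try:
            engine_spec = ep.load()
        except Exception:
            # The buggy version logged `engine_spec` here: unbound on the
            # first failure, or stale from a previous iteration.
            logger.warning("Unable to load Superset DB engine spec: %s", ep.name)
            continue
        engine_specs.append(engine_spec)
    return engine_specs

eps = [
    FakeEntryPoint("good", lambda: "GoodSpec"),
    FakeEntryPoint("broken", lambda: _raise(ImportError("boom"))),
]
print(load_engine_specs(eps))  # the broken entry point is skipped, by name
```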