Posted to commits@superset.apache.org by vi...@apache.org on 2020/09/05 19:35:58 UTC
[incubator-superset] branch 0.37 updated (0dbc1db -> 4ce3bd1)
This is an automated email from the ASF dual-hosted git repository.
villebro pushed a change to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git.
from 0dbc1db add changelog entries
new ee70955 fix(log): don't log exceptions on test connection (#10522)
new f3feb9c fix(log): log endpoint authentication (#10435)
new 80e19fb fix: disable false positive error (#10576)
new 8619898 feat: make screenshot timeout configurable (#10517)
new 003161e fix: dataset delete and perm delete (#10578)
new 1c7022e fix: pie chart multiple groupbys (#10391)
new d86e4e6 fix: support non-string groupbys for pie chart (#10493)
new fdb9918 update code (#10430)
new 9a7c392 bugfix: table chart query mode initial value (#10544)
new fdc829d fix: handle query exceptions gracefully (#10548)
new 6744bab fix: embedded chart height (#10551)
new 2f9ff1e fix: show error if rolling window returns empty df (#10572)
new eebe3c4 fix: table viz query mode switch not working (#10552)
new b9f465c fix(dashboard): add animation state to fix tab switch re-renders (#10475)
new cbba961 fix: update time range select tooltip (#10458)
new 19cc65b fix: allow creating table option and remove schema requirement in dataset add modal (#10369)
new a037a47 fix: dedup groupby in viz.py while preserving order (#10633)
new fbe6b29 fix(jinja): extract form_data from json body (#10684)
new f04d067 fix(filter-box): don't add empty filter to filtersChoices (#10687)
new 7a67a28 feat(row-level-security): add hook for customizing form dropdowns (#10683)
new df1deb2 feat(viz-plugins): add date formatting to pivot-table (#10637)
new 634a90a refactor(database): use SupersetResultSet on SqlaTable.get_df() (#10707)
new ce00c3d fix(db-engine-spec): execute oracle DML statement bug in sqllab (#10706)
new 0663d49 fix: remove unnecessary exception when exploring non-legacy viz plugins (#10538)
new d20125e fix: pivot table timestamp grouping (#10774)
new 4ce3bd1 security: disallow uuid package on jinja2 (#10794)
The 26 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
Summary of changes:
UPDATING.md | 4 +
docs/sqllab.rst | 5 +-
setup.cfg | 2 +-
.../explore/visualizations/table.test.ts | 34 +++--
superset-frontend/package-lock.json | 18 +--
superset-frontend/package.json | 4 +-
.../components/gridComponents/Chart_spec.jsx | 10 +-
.../explore/components/FilterBox_spec.jsx | 61 ++++++++
superset-frontend/src/chart/Chart.jsx | 2 +
superset-frontend/src/chart/ChartRenderer.jsx | 7 +-
superset-frontend/src/components/TableSelector.jsx | 70 ++++++----
.../src/dashboard/actions/dashboardState.js | 8 ++
.../src/dashboard/components/DashboardBuilder.jsx | 7 +
.../dashboard/components/gridComponents/Chart.jsx | 27 +++-
.../dashboard/components/gridComponents/Tabs.jsx | 11 +-
.../src/dashboard/containers/DashboardBuilder.jsx | 2 +
.../dashboard/containers/DashboardComponent.jsx | 3 +-
.../src/dashboard/reducers/dashboardState.js | 10 ++
superset-frontend/src/explore/App.jsx | 4 +-
.../components/controls/DateFilterControl.jsx | 4 +-
superset-frontend/src/explore/controlUtils.js | 4 +-
.../src/views/datasetList/DatasetModal.tsx | 4 +-
.../src/visualizations/FilterBox/FilterBox.jsx | 2 +-
superset/common/query_context.py | 6 +-
superset/config.py | 16 +++
superset/connectors/sqla/models.py | 85 ++++++++---
superset/connectors/sqla/views.py | 3 +
superset/datasets/commands/delete.py | 23 ++-
superset/db_engine_specs/base.py | 6 +-
superset/db_engine_specs/bigquery.py | 4 +-
superset/db_engine_specs/exasol.py | 6 +-
superset/db_engine_specs/hive.py | 6 +-
superset/db_engine_specs/mssql.py | 4 +-
superset/db_engine_specs/oracle.py | 15 +-
superset/db_engine_specs/postgres.py | 4 +-
superset/extensions.py | 5 +-
superset/models/core.py | 14 +-
superset/models/slice.py | 13 +-
superset/tasks/slack_util.py | 5 +-
superset/translations/messages.pot | 4 +-
superset/typing.py | 4 +-
superset/utils/core.py | 10 ++
superset/utils/decorators.py | 2 +-
superset/utils/log.py | 2 +-
superset/utils/screenshots.py | 8 +-
superset/views/core.py | 28 ++--
superset/views/utils.py | 8 ++
superset/viz.py | 155 ++++++++++++++++-----
tests/charts/api_tests.py | 20 ++-
tests/datasets/api_tests.py | 7 +
tests/datasource_tests.py | 2 -
tests/sqla_models_tests.py | 25 ++++
tests/utils_tests.py | 7 +
tests/viz_tests.py | 115 ++++++++++++++-
54 files changed, 735 insertions(+), 180 deletions(-)
create mode 100644 superset-frontend/spec/javascripts/explore/components/FilterBox_spec.jsx
[incubator-superset] 06/26: fix: pie chart multiple groupbys (#10391)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit 1c7022e452efa2b8824b7c73882674715aaff382
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Thu Jul 23 09:22:48 2020 +0300
fix: pie chart multiple groupbys (#10391)
---
superset/viz.py | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/superset/viz.py b/superset/viz.py
index 8cb2aa6..3ce9434 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -1547,10 +1547,10 @@ class DistributionPieViz(NVD3Viz):
if df.empty:
return None
metric = self.metric_labels[0]
- df = df.pivot_table(index=self.groupby, values=[metric])
- df.sort_values(by=metric, ascending=False, inplace=True)
- df = df.reset_index()
- df.columns = ["x", "y"]
+ df = pd.DataFrame(
+ {"x": df[self.groupby].agg(func=", ".join, axis=1), "y": df[metric]}
+ )
+ df.sort_values(by="y", ascending=False, inplace=True)
return df.to_dict(orient="records")
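For reference, the replacement logic in the hunk above can be exercised outside Superset. The frame below is made up, but has the shape `DistributionPieViz.get_data()` receives (two groupby columns plus one metric label); it shows how the row-wise join produces combined "x" labels where the old `pivot_table` mishandled multiple groupbys:

```python
import pandas as pd

# Hypothetical query result: two groupby columns and one metric column.
df = pd.DataFrame(
    {"gender": ["boy", "girl"], "state": ["CA", "CA"], "sum__num": [10, 20]}
)
groupby = ["gender", "state"]
metric = "sum__num"

# Same construction as the fix: join all groupby values of each row into
# one label instead of pivoting on the groupby columns.
out = pd.DataFrame(
    {"x": df[groupby].agg(func=", ".join, axis=1), "y": df[metric]}
)
out.sort_values(by="y", ascending=False, inplace=True)
print(out.to_dict(orient="records"))
# [{'x': 'girl, CA', 'y': 20}, {'x': 'boy, CA', 'y': 10}]
```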
[incubator-superset] 08/26: update code (#10430)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit fdb99181ce8acc2a555f80d3991a6005867df09b
Author: Stuart Hu <sh...@improbable.io>
AuthorDate: Mon Jul 27 17:48:11 2020 +0800
update code (#10430)
Signed-off-by: Stuart Hu <sh...@improbable.io>
---
superset-frontend/package-lock.json | 12 ++++++------
superset-frontend/package.json | 2 +-
2 files changed, 7 insertions(+), 7 deletions(-)
diff --git a/superset-frontend/package-lock.json b/superset-frontend/package-lock.json
index 65e6821..54a5fc6 100644
--- a/superset-frontend/package-lock.json
+++ b/superset-frontend/package-lock.json
@@ -6130,9 +6130,9 @@
}
},
"@superset-ui/legacy-preset-chart-nvd3": {
- "version": "0.14.9",
- "resolved": "https://registry.npmjs.org/@superset-ui/legacy-preset-chart-nvd3/-/legacy-preset-chart-nvd3-0.14.9.tgz",
- "integrity": "sha512-VUSOxpRXoTU49OofNzPoXTNpr2MzRRMRfbaPlbQQOwtY5/+9dFa2SfbsD0fVDMtWiX2jPBQeIunAhlrtNUhrBg==",
+ "version": "0.14.17",
+ "resolved": "https://registry.npmjs.org/@superset-ui/legacy-preset-chart-nvd3/-/legacy-preset-chart-nvd3-0.14.17.tgz",
+ "integrity": "sha512-n5mkaO9bqNcN2uSpXASIq3z6WmgajVxE6NV6qtDGOCGDlTnSylwfPVNVURdBhg2mfJT+mFeuYfx54xyn7s20sg==",
"requires": {
"@data-ui/xy-chart": "^0.0.84",
"d3": "^3.5.17",
@@ -12815,9 +12815,9 @@
}
},
"dompurify": {
- "version": "2.0.7",
- "resolved": "https://registry.npmjs.org/dompurify/-/dompurify-2.0.7.tgz",
- "integrity": "sha512-S3O0lk6rFJtO01ZTzMollCOGg+WAtCwS3U5E2WSDY/x/sy7q70RjEC4Dmrih5/UqzLLB9XoKJ8KqwBxaNvBu4A=="
+ "version": "2.0.12",
+ "resolved": "https://registry.npmjs.org/dompurify/-/dompurify-2.0.12.tgz",
+ "integrity": "sha512-Fl8KseK1imyhErHypFPA8qpq9gPzlsJ/EukA6yk9o0gX23p1TzC+rh9LqNg1qvErRTc0UNMYlKxEGSfSh43NDg=="
},
"domutils": {
"version": "1.5.1",
diff --git a/superset-frontend/package.json b/superset-frontend/package.json
index 9f74206..51bb284 100644
--- a/superset-frontend/package.json
+++ b/superset-frontend/package.json
@@ -89,7 +89,7 @@
"@superset-ui/legacy-plugin-chart-world-map": "^0.14.16",
"@superset-ui/legacy-preset-chart-big-number": "^0.14.9",
"@superset-ui/legacy-preset-chart-deckgl": "^0.2.4",
- "@superset-ui/legacy-preset-chart-nvd3": "^0.14.9",
+ "@superset-ui/legacy-preset-chart-nvd3": "^0.14.17",
"@superset-ui/number-format": "^0.14.9",
"@superset-ui/plugin-chart-table": "^0.14.16",
"@superset-ui/plugin-chart-word-cloud": "^0.14.9",
[incubator-superset] 03/26: fix: disable false positive error (#10576)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit 80e19fbe184b2b6c9ecc20cb5afbda3c8b093f6b
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Tue Aug 11 18:20:57 2020 +0100
fix: disable false positive error (#10576)
---
superset/utils/decorators.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/superset/utils/decorators.py b/superset/utils/decorators.py
index bb0219c..694e07b 100644
--- a/superset/utils/decorators.py
+++ b/superset/utils/decorators.py
@@ -109,7 +109,7 @@ def etag_cache(max_age: int, check_perms: Callable[..., Any]) -> Callable[..., A
except Exception: # pylint: disable=broad-except
if app.debug:
raise
- logger.exception("Exception possibly due to cache backend.")
+ logger.exception("Exception possibly due to cache backend.")
return response.make_conditional(request)
[incubator-superset] 19/26: fix(filter-box): don't add empty filter to filtersChoices (#10687)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit f04d0678cdbca62ebb27194452526d657d3527a3
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Thu Aug 27 12:36:02 2020 +0300
fix(filter-box): don't add empty filter to filtersChoices (#10687)
* fix(filter-box): don't add empty filter to filtersChoices
* add test
---
.../explore/components/FilterBox_spec.jsx | 61 ++++++++++++++++++++++
.../src/visualizations/FilterBox/FilterBox.jsx | 2 +-
2 files changed, 62 insertions(+), 1 deletion(-)
diff --git a/superset-frontend/spec/javascripts/explore/components/FilterBox_spec.jsx b/superset-frontend/spec/javascripts/explore/components/FilterBox_spec.jsx
new file mode 100644
index 0000000..bc3362e
--- /dev/null
+++ b/superset-frontend/spec/javascripts/explore/components/FilterBox_spec.jsx
@@ -0,0 +1,61 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+import React from 'react';
+import { shallow } from 'enzyme';
+
+import FilterBox from 'src/visualizations/FilterBox/FilterBox';
+
+describe('FilterBox', () => {
+ it('should only add defined non-predefined options to filtersChoices', () => {
+ const wrapper = shallow(
+ <FilterBox
+ chartId={1001}
+ datasource={{ id: 1 }}
+ filtersChoices={{
+ name: [
+ { id: 'John', text: 'John', metric: 1234 },
+ { id: 'Jane', text: 'Jane', metric: 345678 },
+ ],
+ }}
+ filtersFields={[
+ {
+ asc: false,
+ clearable: true,
+ column: 'name',
+ key: 'name',
+ label: 'name',
+ metric: 'sum__COUNT',
+ multiple: true,
+ },
+ ]}
+ origSelectedValues={{}}
+ />,
+ );
+ const inst = wrapper.instance();
+ // choose a predefined value
+ inst.setState({ selectedValues: { name: ['John'] } });
+ expect(inst.props.filtersChoices.name.length).toEqual(2);
+ // reset selection
+ inst.setState({ selectedValues: { name: null } });
+ expect(inst.props.filtersChoices.name.length).toEqual(2);
+ // Add a new name
+ inst.setState({ selectedValues: { name: 'James' } });
+ expect(inst.props.filtersChoices.name.length).toEqual(3);
+ });
+});
diff --git a/superset-frontend/src/visualizations/FilterBox/FilterBox.jsx b/superset-frontend/src/visualizations/FilterBox/FilterBox.jsx
index 57e0a38..ef06c68 100644
--- a/superset-frontend/src/visualizations/FilterBox/FilterBox.jsx
+++ b/superset-frontend/src/visualizations/FilterBox/FilterBox.jsx
@@ -331,7 +331,7 @@ class FilterBox extends React.Component {
? selectedValues[key]
: [selectedValues[key]];
selectedValuesForKey
- .filter(value => !choiceIds.has(value))
+ .filter(value => value !== null && !choiceIds.has(value))
.forEach(value => {
choices.unshift({
filter: key,
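The one-line guard in FilterBox.jsx is easy to read in isolation. A minimal Python rendition (the real code is JavaScript; the choice ids and selections below are made up) of what the filter now admits:

```python
# Made-up known choices and user selections, mirroring the FilterBox guard:
# a null selection no longer gets prepended to filtersChoices as a new option.
choice_ids = {"John", "Jane"}
selected_values = ["John", None, "James"]

new_choices = [
    value for value in selected_values
    if value is not None and value not in choice_ids
]
print(new_choices)  # ['James']
```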
[incubator-superset] 05/26: fix: dataset delete and perm delete (#10578)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit 003161ebe57ad03aab454c8a115a514c7068d34b
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Thu Aug 13 10:18:13 2020 +0100
fix: dataset delete and perm delete (#10578)
---
superset/datasets/commands/delete.py | 23 ++++++++++++++++++++---
tests/datasets/api_tests.py | 7 +++++++
2 files changed, 27 insertions(+), 3 deletions(-)
diff --git a/superset/datasets/commands/delete.py b/superset/datasets/commands/delete.py
index 551d222..236fadd 100644
--- a/superset/datasets/commands/delete.py
+++ b/superset/datasets/commands/delete.py
@@ -46,10 +46,27 @@ class DeleteDatasetCommand(BaseCommand):
def run(self) -> Model:
self.validate()
try:
- dataset = DatasetDAO.delete(self._model, commit=False)
- security_manager.del_permission_view_menu(
- "datasource_access", dataset.get_perm()
+ view_menu = (
+ security_manager.find_view_menu(self._model.get_perm())
+ if self._model
+ else None
+ )
+ if not view_menu:
+ logger.error(
+ "Could not find the data access permission for the dataset"
+ )
+ raise DatasetDeleteFailedError()
+ permission_views = (
+ db.session.query(security_manager.permissionview_model)
+ .filter_by(view_menu=view_menu)
+ .all()
)
+ dataset = DatasetDAO.delete(self._model, commit=False)
+
+ for permission_view in permission_views:
+ db.session.delete(permission_view)
+ if view_menu:
+ db.session.delete(view_menu)
db.session.commit()
except (SQLAlchemyError, DAODeleteFailedError) as ex:
logger.exception(ex)
diff --git a/tests/datasets/api_tests.py b/tests/datasets/api_tests.py
index bd634ea..d230b2b 100644
--- a/tests/datasets/api_tests.py
+++ b/tests/datasets/api_tests.py
@@ -602,10 +602,17 @@ class TestDatasetApi(SupersetTestCase):
Dataset API: Test delete dataset item
"""
dataset = self.insert_default_dataset()
+ view_menu = security_manager.find_view_menu(dataset.get_perm())
+ self.assertIsNotNone(view_menu)
+ view_menu_id = view_menu.id
self.login(username="admin")
uri = f"api/v1/dataset/{dataset.id}"
rv = self.client.delete(uri)
self.assertEqual(rv.status_code, 200)
+ non_view_menu = db.session.query(security_manager.viewmenu_model).get(
+ view_menu_id
+ )
+ self.assertIsNone(non_view_menu)
def test_delete_item_dataset_not_owned(self):
"""
[incubator-superset] 01/26: fix(log): don't log exceptions on test connection (#10522)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit ee709556adbebc32d1e60501f0427ea1f8649d9f
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Thu Aug 6 07:58:22 2020 +0100
fix(log): don't log exceptions on test connection (#10522)
* fix(log): don't log exceptions on test connection
* fix lint
---
superset/views/core.py | 20 ++++++++++----------
1 file changed, 10 insertions(+), 10 deletions(-)
diff --git a/superset/views/core.py b/superset/views/core.py
index bce09963..2bfb4b2 100755
--- a/superset/views/core.py
+++ b/superset/views/core.py
@@ -1125,10 +1125,10 @@ class Superset(BaseSupersetView): # pylint: disable=too-many-public-methods
conn.scalar(select([1]))
return json_success('"OK"')
except CertificateException as ex:
- logger.info(ex.message)
+ logger.info("Certificate exception")
return json_error_response(ex.message)
- except (NoSuchModuleError, ModuleNotFoundError) as ex:
- logger.info("Invalid driver %s", ex)
+ except (NoSuchModuleError, ModuleNotFoundError):
+ logger.info("Invalid driver")
driver_name = make_url(uri).drivername
return json_error_response(
_(
@@ -1137,24 +1137,24 @@ class Superset(BaseSupersetView): # pylint: disable=too-many-public-methods
),
400,
)
- except ArgumentError as ex:
- logger.info("Invalid URI %s", ex)
+ except ArgumentError:
+ logger.info("Invalid URI")
return json_error_response(
_(
"Invalid connection string, a valid string usually follows:\n"
"'DRIVER://USER:PASSWORD@DB-HOST/DATABASE-NAME'"
)
)
- except OperationalError as ex:
- logger.warning("Connection failed %s", ex)
+ except OperationalError:
+ logger.warning("Connection failed")
return json_error_response(
- _("Connection failed, please check your connection settings."), 400
+ _("Connection failed, please check your connection settings"), 400
)
except DBSecurityException as ex:
- logger.warning("Stopped an unsafe database connection. %s", ex)
+ logger.warning("Stopped an unsafe database connection")
return json_error_response(_(str(ex)), 400)
except Exception as ex: # pylint: disable=broad-except
- logger.error("Unexpected error %s", ex)
+ logger.error("Unexpected error %s", type(ex).__name__)
return json_error_response(
_("Unexpected error occurred, please check your logs for details"), 400
)
[incubator-superset] 09/26: bugfix: table chart query mode initial value (#10544)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit 9a7c3929fa5afad97301a3dfbab8685423c0aede
Author: Jesse Yang <je...@airbnb.com>
AuthorDate: Fri Aug 7 02:25:48 2020 -0700
bugfix: table chart query mode initial value (#10544)
---
.../cypress/integration/explore/visualizations/table.test.ts | 3 ++-
superset-frontend/src/explore/controlUtils.js | 2 +-
2 files changed, 3 insertions(+), 2 deletions(-)
diff --git a/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts b/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
index f01e1c7..8fa3041 100644
--- a/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
+++ b/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
@@ -165,8 +165,9 @@ describe('Visualization > Table', () => {
metrics: [],
row_limit: 10,
};
-
cy.visitChartByParams(JSON.stringify(formData));
+
+ cy.get('div[data-test="query_mode"] .btn.active').contains('Raw Records');
cy.verifySliceSuccess({ waitAlias: '@getJson', chartSelector: 'table' });
});
diff --git a/superset-frontend/src/explore/controlUtils.js b/superset-frontend/src/explore/controlUtils.js
index ce3022e..122214e 100644
--- a/superset-frontend/src/explore/controlUtils.js
+++ b/superset-frontend/src/explore/controlUtils.js
@@ -111,7 +111,6 @@ function handleMissingChoice(control) {
export function applyMapStateToPropsToControl(controlState, controlPanelState) {
const { mapStateToProps } = controlState;
- let { value } = controlState;
let state = { ...controlState };
if (mapStateToProps && controlPanelState) {
state = {
@@ -127,6 +126,7 @@ export function applyMapStateToPropsToControl(controlState, controlPanelState) {
delete state.default;
}
}
+ let { value } = state;
// If no current value, set it as default
if (state.default && value === undefined) {
value = state.default;
[incubator-superset] 25/26: fix: pivot table timestamp grouping (#10774)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit d20125e57fbc7cbb4148d3bce4d763f08a6f2029
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Thu Sep 3 19:49:54 2020 +0300
fix: pivot table timestamp grouping (#10774)
* fix: pivot table timestamp grouping
* address comments
---
superset/viz.py | 46 +++++++++++++++++++++++++++++++++-------------
tests/viz_tests.py | 34 +++++++++++++++++++++++++++++++++-
2 files changed, 66 insertions(+), 14 deletions(-)
diff --git a/superset/viz.py b/superset/viz.py
index bdf4c53..df00d43 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -27,7 +27,7 @@ import logging
import math
import re
from collections import defaultdict, OrderedDict
-from datetime import datetime, timedelta
+from datetime import date, datetime, timedelta
from itertools import product
from typing import (
Any,
@@ -602,11 +602,11 @@ class TableViz(BaseViz):
def process_metrics(self) -> None:
"""Process form data and store parsed column configs.
- 1. Determine query mode based on form_data params.
- - Use `query_mode` if it has a valid value
- - Set as RAW mode if `all_columns` is set
- - Otherwise defaults to AGG mode
- 2. Determine output columns based on query mode.
+ 1. Determine query mode based on form_data params.
+ - Use `query_mode` if it has a valid value
+ - Set as RAW mode if `all_columns` is set
+ - Otherwise defaults to AGG mode
+ 2. Determine output columns based on query mode.
"""
# Verify form data first: if not specifying query mode, then cannot have both
# GROUP BY and RAW COLUMNS.
@@ -813,6 +813,31 @@ class PivotTableViz(BaseViz):
# only min and max work properly for non-numerics
return aggfunc if aggfunc in ("min", "max") else "max"
+ @staticmethod
+ def _format_datetime(value: Union[pd.Timestamp, datetime, date, str]) -> str:
+ """
+ Format a timestamp in such a way that the viz will be able to apply
+ the correct formatting in the frontend.
+
+ :param value: the value of a temporal column
+ :return: formatted timestamp if it is a valid timestamp, otherwise
+ the original value
+ """
+ tstamp: Optional[pd.Timestamp] = None
+ if isinstance(value, pd.Timestamp):
+ tstamp = value
+ if isinstance(value, datetime) or isinstance(value, date):
+ tstamp = pd.Timestamp(value)
+ if isinstance(value, str):
+ try:
+ tstamp = pd.Timestamp(value)
+ except ValueError:
+ pass
+ if tstamp:
+ return f"__timestamp:{datetime_to_epoch(tstamp)}"
+ # fallback in case something incompatible is returned
+ return cast(str, value)
+
def get_data(self, df: pd.DataFrame) -> VizData:
if df.empty:
return None
@@ -828,15 +853,10 @@ class PivotTableViz(BaseViz):
groupby = self.form_data.get("groupby") or []
columns = self.form_data.get("columns") or []
- def _format_datetime(value: Any) -> Optional[str]:
- if isinstance(value, str):
- return f"__timestamp:{datetime_to_epoch(pd.Timestamp(value))}"
- return None
-
for column_name in groupby + columns:
column = self.datasource.get_column(column_name)
- if column and column.type in ("DATE", "DATETIME", "TIMESTAMP"):
- ts = df[column_name].apply(_format_datetime)
+ if column and column.is_temporal:
+ ts = df[column_name].apply(self._format_datetime)
df[column_name] = ts
if self.form_data.get("transpose_pivot"):
diff --git a/tests/viz_tests.py b/tests/viz_tests.py
index b52ed7f..6490a4f 100644
--- a/tests/viz_tests.py
+++ b/tests/viz_tests.py
@@ -16,7 +16,7 @@
# under the License.
# isort:skip_file
import uuid
-from datetime import datetime
+from datetime import date, datetime, timezone
import logging
from math import nan
from unittest.mock import Mock, patch
@@ -1345,6 +1345,38 @@ class TestPivotTableViz(SupersetTestCase):
== "min"
)
+ def test_format_datetime_from_pd_timestamp(self):
+ tstamp = pd.Timestamp(datetime(2020, 9, 3, tzinfo=timezone.utc))
+ assert (
+ viz.PivotTableViz._format_datetime(tstamp) == "__timestamp:1599091200000.0"
+ )
+
+ def test_format_datetime_from_datetime(self):
+ tstamp = datetime(2020, 9, 3, tzinfo=timezone.utc)
+ assert (
+ viz.PivotTableViz._format_datetime(tstamp) == "__timestamp:1599091200000.0"
+ )
+
+ def test_format_datetime_from_date(self):
+ tstamp = date(2020, 9, 3)
+ assert (
+ viz.PivotTableViz._format_datetime(tstamp) == "__timestamp:1599091200000.0"
+ )
+
+ def test_format_datetime_from_string(self):
+ tstamp = "2020-09-03T00:00:00"
+ assert (
+ viz.PivotTableViz._format_datetime(tstamp) == "__timestamp:1599091200000.0"
+ )
+
+ def test_format_datetime_from_invalid_string(self):
+ tstamp = "abracadabra"
+ assert viz.PivotTableViz._format_datetime(tstamp) == tstamp
+
+ def test_format_datetime_from_int(self):
+ assert viz.PivotTableViz._format_datetime(123) == 123
+ assert viz.PivotTableViz._format_datetime(123.0) == 123.0
+
class TestDistributionPieViz(SupersetTestCase):
base_df = pd.DataFrame(
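The new `_format_datetime` is self-contained enough to run standalone. The rendition below inlines `datetime_to_epoch` as milliseconds since the epoch (pandas treats a tz-naive `Timestamp.timestamp()` as UTC, matching the tested values above); it is a sketch, not the exact Superset helper:

```python
from datetime import date, datetime, timezone

import pandas as pd

def format_datetime(value):
    """Coerce timestamp-like values to the frontend's __timestamp marker,
    passing anything else (ints, unparseable strings) through unchanged."""
    tstamp = None
    if isinstance(value, pd.Timestamp):
        tstamp = value
    if isinstance(value, (datetime, date)):
        tstamp = pd.Timestamp(value)
    if isinstance(value, str):
        try:
            tstamp = pd.Timestamp(value)
        except ValueError:
            pass
    if tstamp is not None:
        return f"__timestamp:{tstamp.timestamp() * 1000}"
    # fallback for anything that is not timestamp-like
    return value

print(format_datetime(date(2020, 9, 3)))  # __timestamp:1599091200000.0
print(format_datetime("abracadabra"))     # abracadabra
```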
[incubator-superset] 13/26: fix: table viz query mode switch not working (#10552)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit eebe3c461a7d16a910ca4b04bd3ba26cc5eee28e
Author: Jesse Yang <je...@airbnb.com>
AuthorDate: Fri Aug 7 14:15:03 2020 -0700
fix: table viz query mode switch not working (#10552)
---
.../cypress/integration/explore/visualizations/table.test.ts | 10 ++++++++++
superset-frontend/src/explore/controlUtils.js | 4 +++-
2 files changed, 13 insertions(+), 1 deletion(-)
diff --git a/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts b/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
index 8fa3041..c7015d9 100644
--- a/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
+++ b/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
@@ -167,8 +167,18 @@ describe('Visualization > Table', () => {
};
cy.visitChartByParams(JSON.stringify(formData));
+ // should display in raw records mode
cy.get('div[data-test="query_mode"] .btn.active').contains('Raw Records');
+ cy.get('div[data-test="all_columns"]').should('be.visible');
+ cy.get('div[data-test="groupby"]').should('not.be.visible');
+
cy.verifySliceSuccess({ waitAlias: '@getJson', chartSelector: 'table' });
+
+ // should allow switch to aggregate mode
+ cy.get('div[data-test="query_mode"] .btn').contains('Aggregate').click();
+ cy.get('div[data-test="query_mode"] .btn.active').contains('Aggregate');
+ cy.get('div[data-test="all_columns"]').should('not.be.visible');
+ cy.get('div[data-test="groupby"]').should('be.visible');
});
it('Test table with columns, ordering, and row limit', () => {
diff --git a/superset-frontend/src/explore/controlUtils.js b/superset-frontend/src/explore/controlUtils.js
index 122214e..b9ed101 100644
--- a/superset-frontend/src/explore/controlUtils.js
+++ b/superset-frontend/src/explore/controlUtils.js
@@ -112,11 +112,14 @@ function handleMissingChoice(control) {
export function applyMapStateToPropsToControl(controlState, controlPanelState) {
const { mapStateToProps } = controlState;
let state = { ...controlState };
+ let { value } = state; // value is current user-input value
if (mapStateToProps && controlPanelState) {
state = {
...controlState,
...mapStateToProps(controlPanelState, controlState),
};
+ // `mapStateToProps` may also provide a value
+ value = value || state.value;
}
// If default is a function, evaluate it
if (typeof state.default === 'function') {
@@ -126,7 +129,6 @@ export function applyMapStateToPropsToControl(controlState, controlPanelState) {
delete state.default;
}
}
- let { value } = state;
// If no current value, set it as default
if (state.default && value === undefined) {
value = state.default;
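The ordering bug reduces to a few lines. This Python sketch of the fixed merge order (the real code is JavaScript; names are kept from the diff, and `is not None` approximates JS's `value || state.value` falsiness check) shows why capturing the user's value before the merge, then falling back to the mapped value, makes the mode switch stick:

```python
def apply_map_state_to_props(control_state, map_state_to_props, panel_state):
    state = dict(control_state)
    value = state.get("value")  # capture the user's current input first
    if map_state_to_props and panel_state:
        state.update(map_state_to_props(panel_state, control_state))
        # mapStateToProps may also provide a value; the user's input wins
        value = value if value is not None else state.get("value")
    if value is None and "default" in state:
        value = state["default"]  # no current value: fall back to default
    state["value"] = value
    return state

state = apply_map_state_to_props(
    {"value": None, "default": "aggregate"},
    lambda panel, ctrl: {"value": panel["query_mode"]},
    {"query_mode": "raw"},
)
print(state["value"])  # raw
```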
[incubator-superset] 21/26: feat(viz-plugins): add date formatting to pivot-table (#10637)
Posted by vi...@apache.org.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit df1deb2149a9d6b50aea9cffaf80838149de1b22
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Wed Aug 19 23:55:59 2020 +0300
feat(viz-plugins): add date formatting to pivot-table (#10637)
* feat: make pivot table dates formattable
* Bump npm packages
---
superset-frontend/package-lock.json | 12 ++++++------
superset-frontend/package.json | 4 ++--
superset/viz.py | 17 +++++++++++++++--
3 files changed, 23 insertions(+), 10 deletions(-)
diff --git a/superset-frontend/package-lock.json b/superset-frontend/package-lock.json
index 54a5fc6..4125e20 100644
--- a/superset-frontend/package-lock.json
+++ b/superset-frontend/package-lock.json
@@ -6024,9 +6024,9 @@
}
},
"@superset-ui/legacy-plugin-chart-pivot-table": {
- "version": "0.14.14",
- "resolved": "https://registry.npmjs.org/@superset-ui/legacy-plugin-chart-pivot-table/-/legacy-plugin-chart-pivot-table-0.14.14.tgz",
- "integrity": "sha512-+Dkzwv9yfiix6/u090RN2U2VF0Axa7sB+dB57UU5IS3zW1/jxW7e5ewEV2LsmcXQ0d87bWXvhUn+IGagkwx7Wg==",
+ "version": "0.14.21",
+ "resolved": "https://registry.npmjs.org/@superset-ui/legacy-plugin-chart-pivot-table/-/legacy-plugin-chart-pivot-table-0.14.21.tgz",
+ "integrity": "sha512-gmj3iu+ibkXwshcSna1V9Tmbh+wBCHi3HKTuy6R9KrB+0585U0dsHro3xe0o14Uamhld6PIeWbZBSl3axXK+SQ==",
"requires": {
"d3": "^3.5.17",
"datatables.net-bs": "^1.10.15",
@@ -6130,9 +6130,9 @@
}
},
"@superset-ui/legacy-preset-chart-nvd3": {
- "version": "0.14.17",
- "resolved": "https://registry.npmjs.org/@superset-ui/legacy-preset-chart-nvd3/-/legacy-preset-chart-nvd3-0.14.17.tgz",
- "integrity": "sha512-n5mkaO9bqNcN2uSpXASIq3z6WmgajVxE6NV6qtDGOCGDlTnSylwfPVNVURdBhg2mfJT+mFeuYfx54xyn7s20sg==",
+ "version": "0.14.21",
+ "resolved": "https://registry.npmjs.org/@superset-ui/legacy-preset-chart-nvd3/-/legacy-preset-chart-nvd3-0.14.21.tgz",
+ "integrity": "sha512-BbsVZnkkAL2a44XFYQtc24VNINGM5JwXAA9HbygdspumYTUu6cpH2nFVPwc06NREUeeN+EV/zF/AVW2O1IJ1tg==",
"requires": {
"@data-ui/xy-chart": "^0.0.84",
"d3": "^3.5.17",
diff --git a/superset-frontend/package.json b/superset-frontend/package.json
index 51bb284..0ef4a90 100644
--- a/superset-frontend/package.json
+++ b/superset-frontend/package.json
@@ -80,7 +80,7 @@
"@superset-ui/legacy-plugin-chart-paired-t-test": "^0.14.9",
"@superset-ui/legacy-plugin-chart-parallel-coordinates": "^0.14.9",
"@superset-ui/legacy-plugin-chart-partition": "^0.14.9",
- "@superset-ui/legacy-plugin-chart-pivot-table": "^0.14.14",
+ "@superset-ui/legacy-plugin-chart-pivot-table": "^0.14.21",
"@superset-ui/legacy-plugin-chart-rose": "^0.14.14",
"@superset-ui/legacy-plugin-chart-sankey": "^0.14.9",
"@superset-ui/legacy-plugin-chart-sankey-loop": "^0.14.9",
@@ -89,7 +89,7 @@
"@superset-ui/legacy-plugin-chart-world-map": "^0.14.16",
"@superset-ui/legacy-preset-chart-big-number": "^0.14.9",
"@superset-ui/legacy-preset-chart-deckgl": "^0.2.4",
- "@superset-ui/legacy-preset-chart-nvd3": "^0.14.17",
+ "@superset-ui/legacy-preset-chart-nvd3": "^0.14.21",
"@superset-ui/number-format": "^0.14.9",
"@superset-ui/plugin-chart-table": "^0.14.16",
"@superset-ui/plugin-chart-word-cloud": "^0.14.9",
diff --git a/superset/viz.py b/superset/viz.py
index 508d8aa..df1111a 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -72,6 +72,7 @@ from superset.utils.core import (
QueryMode,
to_adhoc,
)
+from superset.utils.dates import datetime_to_epoch
from superset.utils.hashing import md5_sha_from_str
if TYPE_CHECKING:
@@ -825,8 +826,20 @@ class PivotTableViz(BaseViz):
for metric in metrics:
aggfuncs[metric] = self.get_aggfunc(metric, df, self.form_data)
- groupby = self.form_data.get("groupby")
- columns = self.form_data.get("columns")
+ groupby = self.form_data.get("groupby") or []
+ columns = self.form_data.get("columns") or []
+
+ def _format_datetime(value: Any) -> Optional[str]:
+ if isinstance(value, str):
+ return f"__timestamp:{datetime_to_epoch(pd.Timestamp(value))}"
+ return None
+
+ for column_name in groupby + columns:
+ column = self.datasource.get_column(column_name)
+ if column and column.type in ("DATE", "DATETIME", "TIMESTAMP"):
+ ts = df[column_name].apply(_format_datetime)
+ df[column_name] = ts
+
if self.form_data.get("transpose_pivot"):
groupby, columns = columns, groupby
[incubator-superset] 12/26: fix: show error if rolling window returns empty df (#10572)
Posted by vi...@apache.org.
This is an automated email from the ASF dual-hosted git repository.
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit 2f9ff1e72785cb1d4f4a9de3747fb0f5e8176bb1
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Thu Aug 13 20:51:03 2020 +0300
fix: show error if rolling window returns empty df (#10572)
* fix: show error if rolling window returns empty df
* add test
---
superset/viz.py | 8 ++++++++
tests/viz_tests.py | 23 ++++++++++++++++++++++-
2 files changed, 30 insertions(+), 1 deletion(-)
diff --git a/superset/viz.py b/superset/viz.py
index c67adf5..becdc4a 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -216,6 +216,14 @@ class BaseViz:
df = df.cumsum()
if min_periods:
df = df[min_periods:]
+ if df.empty:
+ raise QueryObjectValidationError(
+ _(
+ "Applied rolling window did not return any data. Please make sure "
+ "the source query satisfies the minimum periods defined in the "
+ "rolling window."
+ )
+ )
return df
def get_samples(self) -> List[Dict[str, Any]]:
diff --git a/tests/viz_tests.py b/tests/viz_tests.py
index b76c95c..b52ed7f 100644
--- a/tests/viz_tests.py
+++ b/tests/viz_tests.py
@@ -24,12 +24,13 @@ from typing import Any, Dict, List, Set
import numpy as np
import pandas as pd
+import pytest
import tests.test_app
import superset.viz as viz
from superset import app
from superset.constants import NULL_STRING
-from superset.exceptions import SpatialException
+from superset.exceptions import QueryObjectValidationError, SpatialException
from superset.utils.core import DTTM_ALIAS
from .base_tests import SupersetTestCase
@@ -1258,6 +1259,26 @@ class TestTimeSeriesViz(SupersetTestCase):
[1.0, 1.5, 2.0, 2.5],
)
+ def test_apply_rolling_without_data(self):
+ datasource = self.get_datasource_mock()
+ df = pd.DataFrame(
+ index=pd.to_datetime(
+ ["2019-01-01", "2019-01-02", "2019-01-05", "2019-01-07"]
+ ),
+ data={"y": [1.0, 2.0, 3.0, 4.0]},
+ )
+ test_viz = viz.BigNumberViz(
+ datasource,
+ {
+ "metrics": ["y"],
+ "rolling_type": "cumsum",
+ "rolling_periods": 4,
+ "min_periods": 4,
+ },
+ )
+ with pytest.raises(QueryObjectValidationError):
+ test_viz.apply_rolling(df)
+
class TestBigNumberViz(SupersetTestCase):
def test_get_data(self):
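The failure mode this commit guards against is easy to reproduce: with `min_periods` equal to the number of rows, the post-rolling slice in `BaseViz.apply_rolling` removes every row. A minimal sketch of that slicing (same data as the new test above):

```python
import pandas as pd

df = pd.DataFrame(
    index=pd.to_datetime(["2019-01-01", "2019-01-02", "2019-01-05", "2019-01-07"]),
    data={"y": [1.0, 2.0, 3.0, 4.0]},
)
min_periods = 4
# Same slicing BaseViz.apply_rolling performs after the cumsum: dropping the
# first `min_periods` rows of a 4-row frame leaves nothing to chart, which is
# why the patch now raises QueryObjectValidationError instead of returning
# an empty result silently.
rolled = df.cumsum()[min_periods:]
print(rolled.empty)  # True
```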
[incubator-superset] 24/26: fix: remove unnecessary exception when exploring non-legacy viz plugins (#10538)
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit 0663d490c9b5b8c8388956f9ad78b8b6180e44e9
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Fri Aug 7 08:41:39 2020 +0300
fix: remove unnecessary exception when exploring non-legacy viz plugins (#10538)
* fix: remove unnecessary exception when exploring non-legacy viz plugins
* lint
---
superset/models/slice.py | 13 ++++++++-----
superset/utils/core.py | 10 ++++++++++
superset/viz.py | 2 +-
tests/utils_tests.py | 7 +++++++
4 files changed, 26 insertions(+), 6 deletions(-)
diff --git a/superset/models/slice.py b/superset/models/slice.py
index b8f1d93..c7f198b 100644
--- a/superset/models/slice.py
+++ b/superset/models/slice.py
@@ -155,10 +155,12 @@ class Slice(
@property # type: ignore
@utils.memoized
- def viz(self) -> BaseViz:
+ def viz(self) -> Optional[BaseViz]:
form_data = json.loads(self.params)
- viz_class = viz_types[self.viz_type]
- return viz_class(datasource=self.datasource, form_data=form_data)
+ viz_class = viz_types.get(self.viz_type)
+ if viz_class:
+ return viz_class(datasource=self.datasource, form_data=form_data)
+ return None
@property
def description_markeddown(self) -> str:
@@ -170,8 +172,9 @@ class Slice(
data: Dict[str, Any] = {}
self.token = ""
try:
- data = self.viz.data
- self.token = data.get("token") # type: ignore
+ viz = self.viz
+ data = viz.data if viz else self.form_data
+ self.token = utils.get_form_data_token(data)
except Exception as ex: # pylint: disable=broad-except
logger.exception(ex)
data["error"] = str(ex)
diff --git a/superset/utils/core.py b/superset/utils/core.py
index c464d78..297127e 100644
--- a/superset/utils/core.py
+++ b/superset/utils/core.py
@@ -1360,6 +1360,16 @@ def get_iterable(x: Any) -> List[Any]:
return x if isinstance(x, list) else [x]
+def get_form_data_token(form_data: Dict[str, Any]) -> str:
+ """
+ Return the token contained within form data or generate a new one.
+
+ :param form_data: chart form data
+ :return: original token if predefined, otherwise new uuid4 based token
+ """
+ return form_data.get("token") or "token_" + uuid.uuid4().hex[:8]
+
+
class LenientEnum(Enum):
"""Enums that do not raise ValueError when value is invalid"""
diff --git a/superset/viz.py b/superset/viz.py
index 1db8b32..bdf4c53 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -121,7 +121,7 @@ class BaseViz:
self.form_data = form_data
self.query = ""
- self.token = self.form_data.get("token", "token_" + uuid.uuid4().hex[:8])
+ self.token = utils.get_form_data_token(form_data)
self.groupby: List[str] = self.form_data.get("groupby") or []
self.time_shift = timedelta()
diff --git a/tests/utils_tests.py b/tests/utils_tests.py
index 4c2092a..7cb32b7 100644
--- a/tests/utils_tests.py
+++ b/tests/utils_tests.py
@@ -22,6 +22,7 @@ from decimal import Decimal
import hashlib
import json
import os
+import re
from unittest.mock import Mock, patch
import numpy
@@ -40,6 +41,7 @@ from superset.utils.core import (
convert_legacy_filters_into_adhoc,
create_ssl_cert_file,
format_timedelta,
+ get_form_data_token,
get_iterable,
get_email_address_list,
get_or_create_db,
@@ -1365,3 +1367,8 @@ class TestUtils(SupersetTestCase):
self.assertEqual("BaZ", validator("BaZ"))
self.assertRaises(marshmallow.ValidationError, validator, "qwerty")
self.assertRaises(marshmallow.ValidationError, validator, 4)
+
+ def test_get_form_data_token(self):
+ assert get_form_data_token({"token": "token_abcdefg1"}) == "token_abcdefg1"
+ generated_token = get_form_data_token({})
+ assert re.match(r"^token_[a-z0-9]{8}$", generated_token) is not None
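The new helper and its test above are self-contained enough to run standalone; this sketch reproduces them outside Superset (note `uuid.uuid4().hex` is lowercase hex, so it always matches the test's `[a-z0-9]{8}` pattern):

```python
import re
import uuid
from typing import Any, Dict

def get_form_data_token(form_data: Dict[str, Any]) -> str:
    """Return the token contained within form data or generate a new one."""
    # `or` (rather than a dict.get default) also replaces an explicit
    # None/empty token, not just a missing key.
    return form_data.get("token") or "token_" + uuid.uuid4().hex[:8]

print(get_form_data_token({"token": "token_abcdefg1"}))  # token_abcdefg1
print(re.match(r"^token_[a-z0-9]{8}$", get_form_data_token({})) is not None)  # True
```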
[incubator-superset] 02/26: fix(log): log endpoint authentication (#10435)
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit f3feb9c1d6d0713a206d5198993cffc27552c2f9
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Wed Jul 29 09:32:10 2020 +0100
fix(log): log endpoint authentication (#10435)
* fix(log): log crashes if expired or not authenticated
* add auth to log endpoint
---
superset/utils/log.py | 2 +-
superset/views/core.py | 1 +
2 files changed, 2 insertions(+), 1 deletion(-)
diff --git a/superset/utils/log.py b/superset/utils/log.py
index 1b6e1b6..5b11d45 100644
--- a/superset/utils/log.py
+++ b/superset/utils/log.py
@@ -42,7 +42,7 @@ class AbstractEventLogger(ABC):
@functools.wraps(f)
def wrapper(*args: Any, **kwargs: Any) -> Any:
user_id = None
- if g.user:
+ if hasattr(g, "user") and g.user:
user_id = g.user.get_id()
payload = request.form.to_dict() or {}
diff --git a/superset/views/core.py b/superset/views/core.py
index 2bfb4b2..e7e23b7 100755
--- a/superset/views/core.py
+++ b/superset/views/core.py
@@ -1686,6 +1686,7 @@ class Superset(BaseSupersetView): # pylint: disable=too-many-public-methods
@api
@event_logger.log_this
+ @has_access
@expose("/log/", methods=["POST"])
def log(self) -> FlaskResponse: # pylint: disable=no-self-use
return Response(status=200)
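The one-character-looking change to `superset/utils/log.py` above is the whole bug fix: Flask's `g` proxy has no `user` attribute at all outside an authenticated request context, so `if g.user:` raised instead of evaluating falsy. A stdlib sketch of the guard pattern (using `SimpleNamespace` in place of Flask's `g`, which is an assumption for illustration only):

```python
from types import SimpleNamespace
from typing import Any, Optional

def get_user_id(g: Any) -> Optional[str]:
    # The fix: first check the attribute exists, then that it is truthy.
    # `if g.user:` alone raises AttributeError on an unauthenticated context.
    if hasattr(g, "user") and g.user:
        return g.user.get_id()
    return None

anonymous_g = SimpleNamespace()  # no `user` attribute at all
logged_in_g = SimpleNamespace(user=SimpleNamespace(get_id=lambda: "42"))

print(get_user_id(anonymous_g))  # None
print(get_user_id(logged_in_g))  # 42
```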
[incubator-superset] 16/26: fix: allow creating table option and remove schema requirement in dataset add modal (#10369)
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit 19cc65beb6009a65e7ab5e6a747acb09892702f1
Author: ʈᵃᵢ <td...@gmail.com>
AuthorDate: Fri Jul 24 13:17:44 2020 -0700
fix: allow creating table option and remove schema requirement in dataset add modal (#10369)
(cherry picked from commit 09dfbab7ed7cdb518109fa3fb093ce20d52fa8af)
---
superset-frontend/src/components/TableSelector.jsx | 70 ++++++++++++++--------
.../src/views/datasetList/DatasetModal.tsx | 4 +-
2 files changed, 46 insertions(+), 28 deletions(-)
diff --git a/superset-frontend/src/components/TableSelector.jsx b/superset-frontend/src/components/TableSelector.jsx
index a477c85..3055955 100644
--- a/superset-frontend/src/components/TableSelector.jsx
+++ b/superset-frontend/src/components/TableSelector.jsx
@@ -19,7 +19,7 @@
import React from 'react';
import styled from '@superset-ui/style';
import PropTypes from 'prop-types';
-import { Select, AsyncSelect } from 'src/components/Select';
+import { AsyncSelect, CreatableSelect, Select } from 'src/components/Select';
import { ControlLabel, Label } from 'react-bootstrap';
import { t } from '@superset-ui/translation';
import { SupersetClient } from '@superset-ui/connection';
@@ -342,31 +342,49 @@ export default class TableSelector extends React.PureComponent {
tableSelectDisabled = true;
}
const options = this.state.tableOptions;
- const select = this.props.schema ? (
- <Select
- name="select-table"
- isLoading={this.state.tableLoading}
- ignoreAccents={false}
- placeholder={t('Select table or type table name')}
- autosize={false}
- onChange={this.changeTable}
- options={options}
- value={this.state.tableName}
- optionRenderer={this.renderTableOption}
- />
- ) : (
- <AsyncSelect
- name="async-select-table"
- placeholder={tableSelectPlaceholder}
- disabled={tableSelectDisabled}
- autosize={false}
- onChange={this.changeTable}
- value={this.state.tableName}
- loadOptions={this.getTableNamesBySubStr}
- optionRenderer={this.renderTableOption}
- isDisabled={this.props.formMode}
- />
- );
+ let select = null;
+ if (this.props.schema && !this.props.formMode) {
+ select = (
+ <Select
+ name="select-table"
+ isLoading={this.state.tableLoading}
+ ignoreAccents={false}
+ placeholder={t('Select table or type table name')}
+ autosize={false}
+ onChange={this.changeTable}
+ options={options}
+ value={this.state.tableName}
+ optionRenderer={this.renderTableOption}
+ />
+ );
+ } else if (this.props.formMode) {
+ select = (
+ <CreatableSelect
+ name="select-table"
+ isLoading={this.state.tableLoading}
+ ignoreAccents={false}
+ placeholder={t('Select table or type table name')}
+ autosize={false}
+ onChange={this.changeTable}
+ options={options}
+ value={this.state.tableName}
+ optionRenderer={this.renderTableOption}
+ />
+ );
+ } else {
+ select = (
+ <AsyncSelect
+ name="async-select-table"
+ placeholder={tableSelectPlaceholder}
+ isDisabled={tableSelectDisabled}
+ autosize={false}
+ onChange={this.changeTable}
+ value={this.state.tableName}
+ loadOptions={this.getTableNamesBySubStr}
+ optionRenderer={this.renderTableOption}
+ />
+ );
+ }
const refresh = !this.props.formMode && (
<RefreshLabel
onClick={() => this.changeSchema({ value: this.props.schema }, true)}
diff --git a/superset-frontend/src/views/datasetList/DatasetModal.tsx b/superset-frontend/src/views/datasetList/DatasetModal.tsx
index 0e43d26..d6a0fb7 100644
--- a/superset-frontend/src/views/datasetList/DatasetModal.tsx
+++ b/superset-frontend/src/views/datasetList/DatasetModal.tsx
@@ -65,7 +65,7 @@ const DatasetModal: FunctionComponent<DatasetModalProps> = ({
tableName: string;
}) => {
setDatasourceId(dbId);
- setDisableSave(isNil(dbId) || isEmpty(schema) || isEmpty(tableName));
+ setDisableSave(isNil(dbId) || isEmpty(tableName));
setSchema(schema);
setTableName(tableName);
};
@@ -73,7 +73,7 @@ const DatasetModal: FunctionComponent<DatasetModalProps> = ({
const onSave = () => {
const data = {
database: datasourceId,
- schema: currentSchema,
+ ...(currentSchema ? { schema: currentSchema } : {}),
table_name: currentTableName,
};
SupersetClient.post({
[incubator-superset] 26/26: security: disallow uuid package on jinja2 (#10794)
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit 4ce3bd1af8599673833c1ae9a649894b67935567
Author: Daniel Vaz Gaspar <da...@gmail.com>
AuthorDate: Fri Sep 4 16:37:14 2020 +0100
security: disallow uuid package on jinja2 (#10794)
* fix: disallow uuid package on jinja2
* update UPDATING.md
* Update UPDATING.md
Co-authored-by: Ville Brofeldt <33...@users.noreply.github.com>
Co-authored-by: Ville Brofeldt <33...@users.noreply.github.com>
---
UPDATING.md | 4 ++++
docs/sqllab.rst | 5 ++++-
superset/extensions.py | 5 ++++-
3 files changed, 12 insertions(+), 2 deletions(-)
diff --git a/UPDATING.md b/UPDATING.md
index 3755694..2a6dcb8 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -21,6 +21,10 @@ under the License.
This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.
+## 0.37.1
+
+* [10794](https://github.com/apache/incubator-superset/pull/10794): Breaking change: `uuid` python package is not supported on Jinja2 anymore, only uuid functions are exposed eg: `uuid1`, `uuid3`, `uuid4`, `uuid5`.
+
## 0.37.0
* [9964](https://github.com/apache/incubator-superset/pull/9964): Breaking change on Flask-AppBuilder 3. If you're using OAuth, find out what needs to be changed [here](https://github.com/dpgaspar/Flask-AppBuilder/blob/master/README.rst#change-log).
diff --git a/docs/sqllab.rst b/docs/sqllab.rst
index b582c53..27711cb 100644
--- a/docs/sqllab.rst
+++ b/docs/sqllab.rst
@@ -73,7 +73,10 @@ Superset's Jinja context:
- ``time``: ``time``
- ``datetime``: ``datetime.datetime``
-- ``uuid``: ``uuid``
+- ``uuid1``: ``uuid1``
+- ``uuid3``: ``uuid3``
+- ``uuid4``: ``uuid4``
+- ``uuid5``: ``uuid5``
- ``random``: ``random``
- ``relativedelta``: ``dateutil.relativedelta.relativedelta``
diff --git a/superset/extensions.py b/superset/extensions.py
index a0dad81..2a35166 100644
--- a/superset/extensions.py
+++ b/superset/extensions.py
@@ -48,7 +48,10 @@ class JinjaContextManager:
"relativedelta": relativedelta,
"time": time,
"timedelta": timedelta,
- "uuid": uuid,
+ "uuid1": uuid.uuid1,
+ "uuid3": uuid.uuid3,
+ "uuid4": uuid.uuid4,
+ "uuid5": uuid.uuid5,
}
self._template_processors: Dict[str, Type["BaseTemplateProcessor"]] = {}
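The `extensions.py` hunk above replaces the whole `uuid` module in the Jinja context with just its four generator functions. The security rationale can be demonstrated standalone: exposing the module object hands templates every module attribute, whereas exposing only the callables limits templates to generating values.

```python
import uuid

# Safe subset mirroring the patched JinjaContextManager: only the value
# constructors, never the module object itself.
jinja_base_context = {
    "uuid1": uuid.uuid1,
    "uuid3": uuid.uuid3,
    "uuid4": uuid.uuid4,
    "uuid5": uuid.uuid5,
}

# The module object is no longer reachable from a template...
print("uuid" in jinja_base_context)  # False
# ...but templates can still generate identifiers, e.g. {{ uuid4() }}.
print(len(str(jinja_base_context["uuid4"]())))  # 36
```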
[incubator-superset] 14/26: fix(dashboard): add animation state to fix tab switch re-renders (#10475)
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit b9f465cc7261fd068cb56de68fabaaba2a5ec00c
Author: Jesse Yang <je...@airbnb.com>
AuthorDate: Tue Aug 11 00:57:50 2020 -0700
fix(dashboard): add animation state to fix tab switch re-renders (#10475)
---
.../components/gridComponents/Chart_spec.jsx | 10 +++++++-
superset-frontend/src/chart/Chart.jsx | 2 ++
superset-frontend/src/chart/ChartRenderer.jsx | 7 ++----
.../src/dashboard/actions/dashboardState.js | 8 +++++++
.../src/dashboard/components/DashboardBuilder.jsx | 7 ++++++
.../dashboard/components/gridComponents/Chart.jsx | 27 ++++++++++++++++++++--
.../dashboard/components/gridComponents/Tabs.jsx | 11 ++++++++-
.../src/dashboard/containers/DashboardBuilder.jsx | 2 ++
.../dashboard/containers/DashboardComponent.jsx | 3 ++-
.../src/dashboard/reducers/dashboardState.js | 10 ++++++++
10 files changed, 77 insertions(+), 10 deletions(-)
diff --git a/superset-frontend/spec/javascripts/dashboard/components/gridComponents/Chart_spec.jsx b/superset-frontend/spec/javascripts/dashboard/components/gridComponents/Chart_spec.jsx
index d75b5fa..e7f3e42 100644
--- a/superset-frontend/spec/javascripts/dashboard/components/gridComponents/Chart_spec.jsx
+++ b/superset-frontend/spec/javascripts/dashboard/components/gridComponents/Chart_spec.jsx
@@ -20,7 +20,7 @@ import React from 'react';
import { shallow } from 'enzyme';
import sinon from 'sinon';
-import Chart from 'src/dashboard/components/gridComponents/Chart';
+import { ChartUnconnected as Chart } from 'src/dashboard/components/gridComponents/Chart';
import SliceHeader from 'src/dashboard/components/SliceHeader';
import ChartContainer from 'src/chart/ChartContainer';
@@ -44,6 +44,7 @@ describe('Chart', () => {
slice: {
...sliceEntities.slices[queryId],
description_markeddown: 'markdown',
+ owners: [],
},
sliceName: sliceEntities.slices[queryId].slice_name,
timeout: 60,
@@ -52,6 +53,13 @@ describe('Chart', () => {
toggleExpandSlice() {},
addFilter() {},
logEvent() {},
+ handleToggleFullSize() {},
+ changeFilter() {},
+ setFocusedFilterField() {},
+ unsetFocusedFilterField() {},
+ addDangerToast() {},
+ componentId: 'test',
+ dashboardId: 111,
editMode: false,
isExpanded: false,
supersetCanExplore: false,
diff --git a/superset-frontend/src/chart/Chart.jsx b/superset-frontend/src/chart/Chart.jsx
index 9562304..5991a00 100644
--- a/superset-frontend/src/chart/Chart.jsx
+++ b/superset-frontend/src/chart/Chart.jsx
@@ -62,6 +62,8 @@ const propTypes = {
onQuery: PropTypes.func,
onFilterMenuOpen: PropTypes.func,
onFilterMenuClose: PropTypes.func,
+ // id of the last mounted parent tab
+ mountedParent: PropTypes.string,
};
const BLANK = {};
diff --git a/superset-frontend/src/chart/ChartRenderer.jsx b/superset-frontend/src/chart/ChartRenderer.jsx
index e45130a..984e1b3 100644
--- a/superset-frontend/src/chart/ChartRenderer.jsx
+++ b/superset-frontend/src/chart/ChartRenderer.jsx
@@ -87,8 +87,7 @@ class ChartRenderer extends React.Component {
if (resultsReady) {
this.hasQueryResponseChange =
nextProps.queryResponse !== this.props.queryResponse;
-
- if (
+ return (
this.hasQueryResponseChange ||
nextProps.annotationData !== this.props.annotationData ||
nextProps.height !== this.props.height ||
@@ -96,9 +95,7 @@ class ChartRenderer extends React.Component {
nextProps.triggerRender ||
nextProps.formData.color_scheme !== this.props.formData.color_scheme ||
nextProps.cacheBusterProp !== this.props.cacheBusterProp
- ) {
- return true;
- }
+ );
}
return false;
}
diff --git a/superset-frontend/src/dashboard/actions/dashboardState.js b/superset-frontend/src/dashboard/actions/dashboardState.js
index 33737f8..5ecc335 100644
--- a/superset-frontend/src/dashboard/actions/dashboardState.js
+++ b/superset-frontend/src/dashboard/actions/dashboardState.js
@@ -321,6 +321,14 @@ export function setDirectPathToChild(path) {
return { type: SET_DIRECT_PATH, path };
}
+export const SET_MOUNTED_TAB = 'SET_MOUNTED_TAB';
+/**
+ * Set if tab switch animation is in progress
+ */
+export function setMountedTab(mountedTab) {
+ return { type: SET_MOUNTED_TAB, mountedTab };
+}
+
export const SET_FOCUSED_FILTER_FIELD = 'SET_FOCUSED_FILTER_FIELD';
export function setFocusedFilterField(chartId, column) {
return { type: SET_FOCUSED_FILTER_FIELD, chartId, column };
diff --git a/superset-frontend/src/dashboard/components/DashboardBuilder.jsx b/superset-frontend/src/dashboard/components/DashboardBuilder.jsx
index de9a94e..da3217e 100644
--- a/superset-frontend/src/dashboard/components/DashboardBuilder.jsx
+++ b/superset-frontend/src/dashboard/components/DashboardBuilder.jsx
@@ -62,6 +62,7 @@ const propTypes = {
handleComponentDrop: PropTypes.func.isRequired,
directPathToChild: PropTypes.arrayOf(PropTypes.string),
setDirectPathToChild: PropTypes.func.isRequired,
+ setMountedTab: PropTypes.func.isRequired,
};
const defaultProps = {
@@ -250,6 +251,12 @@ class DashboardBuilder extends React.Component {
<TabPane
key={index === 0 ? DASHBOARD_GRID_ID : id}
eventKey={index}
+ mountOnEnter
+ unmountOnExit={false}
+ onEntering={() => {
+ // Entering current tab, DOM is visible and has dimension
+ this.props.setMountedTab(id);
+ }}
>
<DashboardGrid
gridComponent={dashboardLayout[id]}
diff --git a/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx b/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
index 4a443f1..de0e6e6 100644
--- a/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
+++ b/superset-frontend/src/dashboard/components/gridComponents/Chart.jsx
@@ -18,6 +18,7 @@
*/
import cx from 'classnames';
import React from 'react';
+import { connect } from 'react-redux';
import PropTypes from 'prop-types';
import { exploreChart, exportChart } from '../../../explore/exploreUtils';
import SliceHeader from '../SliceHeader';
@@ -41,6 +42,8 @@ const propTypes = {
height: PropTypes.number.isRequired,
updateSliceName: PropTypes.func.isRequired,
isComponentVisible: PropTypes.bool,
+ // last switched tab
+ mountedParent: PropTypes.string,
handleToggleFullSize: PropTypes.func.isRequired,
// from redux
@@ -70,6 +73,7 @@ const propTypes = {
const defaultProps = {
isCached: false,
isComponentVisible: true,
+ mountedParent: 'ROOT',
};
// we use state + shouldComponentUpdate() logic to prevent perf-wrecking
@@ -114,6 +118,9 @@ class Chart extends React.Component {
// allow chart update/re-render only if visible:
// under selected tab or no tab layout
if (nextProps.isComponentVisible) {
+ if (nextProps.mountedParent === null) {
+ return false;
+ }
if (nextProps.chart.triggerQuery) {
return true;
}
@@ -140,7 +147,7 @@ class Chart extends React.Component {
}
}
- // `cacheBusterProp` is nnjected by react-hot-loader
+ // `cacheBusterProp` is injected by react-hot-loader
return this.props.cacheBusterProp !== nextProps.cacheBusterProp;
}
@@ -346,4 +353,20 @@ class Chart extends React.Component {
Chart.propTypes = propTypes;
Chart.defaultProps = defaultProps;
-export default Chart;
+function mapStateToProps({ dashboardState }) {
+ return {
+ // needed to prevent chart from rendering while tab switch animation in progress
+ // when undefined, default to have mounted the root tab
+ mountedParent: dashboardState?.mountedTab,
+ };
+}
+
+/**
+ * The original Chart component not connected to state.
+ */
+export const ChartUnconnected = Chart;
+
+/**
+ * Redux connected Chart component.
+ */
+export default connect(mapStateToProps, null)(Chart);
diff --git a/superset-frontend/src/dashboard/components/gridComponents/Tabs.jsx b/superset-frontend/src/dashboard/components/gridComponents/Tabs.jsx
index 8c812c4..0ac44fb 100644
--- a/superset-frontend/src/dashboard/components/gridComponents/Tabs.jsx
+++ b/superset-frontend/src/dashboard/components/gridComponents/Tabs.jsx
@@ -47,9 +47,12 @@ const propTypes = {
renderTabContent: PropTypes.bool, // whether to render tabs + content or just tabs
editMode: PropTypes.bool.isRequired,
renderHoverMenu: PropTypes.bool,
- logEvent: PropTypes.func.isRequired,
directPathToChild: PropTypes.arrayOf(PropTypes.string),
+ // actions (from DashboardComponent.jsx)
+ logEvent: PropTypes.func.isRequired,
+ setMountedTab: PropTypes.func.isRequired,
+
// grid related
availableColumnCount: PropTypes.number,
columnWidth: PropTypes.number,
@@ -260,6 +263,12 @@ class Tabs extends React.PureComponent {
onDeleteTab={this.handleDeleteTab}
/>
}
+ onEntering={() => {
+ // Entering current tab, DOM is visible and has dimension
+ if (renderTabContent) {
+ this.props.setMountedTab(tabId);
+ }
+ }}
>
{renderTabContent && (
<DashboardComponent
diff --git a/superset-frontend/src/dashboard/containers/DashboardBuilder.jsx b/superset-frontend/src/dashboard/containers/DashboardBuilder.jsx
index 9887295..a508b5a 100644
--- a/superset-frontend/src/dashboard/containers/DashboardBuilder.jsx
+++ b/superset-frontend/src/dashboard/containers/DashboardBuilder.jsx
@@ -24,6 +24,7 @@ import {
setColorSchemeAndUnsavedChanges,
showBuilderPane,
setDirectPathToChild,
+ setMountedTab,
} from '../actions/dashboardState';
import {
deleteTopLevelTabs,
@@ -49,6 +50,7 @@ function mapDispatchToProps(dispatch) {
showBuilderPane,
setColorSchemeAndUnsavedChanges,
setDirectPathToChild,
+ setMountedTab,
},
dispatch,
);
diff --git a/superset-frontend/src/dashboard/containers/DashboardComponent.jsx b/superset-frontend/src/dashboard/containers/DashboardComponent.jsx
index e0eb38a..d4a403f 100644
--- a/superset-frontend/src/dashboard/containers/DashboardComponent.jsx
+++ b/superset-frontend/src/dashboard/containers/DashboardComponent.jsx
@@ -33,7 +33,7 @@ import {
updateComponents,
handleComponentDrop,
} from '../actions/dashboardLayout';
-import { setDirectPathToChild } from '../actions/dashboardState';
+import { setDirectPathToChild, setMountedTab } from '../actions/dashboardState';
import { logEvent } from '../../logger/actions';
import { addDangerToast } from '../../messageToasts/actions';
@@ -106,6 +106,7 @@ function mapDispatchToProps(dispatch) {
updateComponents,
handleComponentDrop,
setDirectPathToChild,
+ setMountedTab,
logEvent,
},
dispatch,
diff --git a/superset-frontend/src/dashboard/reducers/dashboardState.js b/superset-frontend/src/dashboard/reducers/dashboardState.js
index fc7e079..32ce82d 100644
--- a/superset-frontend/src/dashboard/reducers/dashboardState.js
+++ b/superset-frontend/src/dashboard/reducers/dashboardState.js
@@ -33,6 +33,7 @@ import {
UPDATE_CSS,
SET_REFRESH_FREQUENCY,
SET_DIRECT_PATH,
+ SET_MOUNTED_TAB,
SET_FOCUSED_FILTER_FIELD,
} from '../actions/dashboardState';
import { BUILDER_PANE_TYPE } from '../util/constants';
@@ -127,10 +128,19 @@ export default function dashboardStateReducer(state = {}, action) {
[SET_DIRECT_PATH]() {
return {
...state,
+ // change of direct path (tabs) will reset current mounted tab
+ mountedTab: null,
directPathToChild: action.path,
directPathLastUpdated: Date.now(),
};
},
+ [SET_MOUNTED_TAB]() {
+ // set current mounted tab after tab is really mounted to DOM
+ return {
+ ...state,
+ mountedTab: action.mountedTab,
+ };
+ },
[SET_FOCUSED_FILTER_FIELD]() {
const { focusedFilterField } = state;
if (action.chartId && action.column) {
[incubator-superset] 15/26: fix: update time range select tooltip (#10458)
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit cbba961eb8fae5a9cf7f55ddf2d13a0b40d18a09
Author: Moriah Kreeger <mo...@gmail.com>
AuthorDate: Tue Aug 4 16:34:20 2020 -0700
fix: update time range select tooltip (#10458)
---
.../src/explore/components/controls/DateFilterControl.jsx | 4 ++--
superset/translations/messages.pot | 4 ++--
2 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/superset-frontend/src/explore/components/controls/DateFilterControl.jsx b/superset-frontend/src/explore/components/controls/DateFilterControl.jsx
index e2d1519..ee9fcbd 100644
--- a/superset-frontend/src/explore/components/controls/DateFilterControl.jsx
+++ b/superset-frontend/src/explore/components/controls/DateFilterControl.jsx
@@ -83,8 +83,8 @@ const DEFAULT_SINCE = moment()
const DEFAULT_UNTIL = moment().utc().startOf('day').format(MOMENT_FORMAT);
const SEPARATOR = ' : ';
const FREEFORM_TOOLTIP = t(
- 'Superset supports smart date parsing. Strings like `last sunday` or ' +
- '`last october` can be used.',
+ 'Superset supports smart date parsing. Strings like `3 weeks ago`, `last sunday`, or ' +
+ '`2 weeks from now` can be used.',
);
const DATE_FILTER_POPOVER_STYLE = { width: '250px' };
diff --git a/superset/translations/messages.pot b/superset/translations/messages.pot
index 7c48a30..a6ec520 100644
--- a/superset/translations/messages.pot
+++ b/superset/translations/messages.pot
@@ -4329,8 +4329,8 @@ msgstr ""
#: superset-frontend/src/explore/components/controls/DateFilterControl.jsx:85
msgid ""
-"Superset supports smart date parsing. Strings like `last sunday` or `last"
-" october` can be used."
+"Superset supports smart date parsing. Strings like `3 weeks ago`, `last sunday`"
+" or `2 weeks from now` can be used."
msgstr ""
#: superset-frontend/src/explore/components/controls/FilterBoxItemControl.jsx:142
[incubator-superset] 23/26: fix(db-engine-spec): execute oracle DML statement bug in sqllab (#10706)
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit ce00c3db4f60a21df6dfd0f6079b146057a26bc1
Author: chuancy <31...@qq.com>
AuthorDate: Mon Aug 31 13:03:07 2020 +0800
fix(db-engine-spec): execute oracle DML statement bug in sqllab (#10706)
* fix execute oracle DML statement bug in sqllab
When executing Oracle DML statements such as UPDATE in SQL Lab, the query failed with an "oracle error: not a query" error. Per https://www.python.org/dev/peps/pep-0249/, previous Superset versions relied on
`cursor.description`, but that attribute is None for operations that do not return rows, or when the cursor has not yet had an operation invoked via the .execute*() method.
* Apply suggestions from code review
Co-authored-by: Ville Brofeldt <33...@users.noreply.github.com>
* Update oracle.py
* Update oracle.py
* Update oracle.py
* Apply suggestions from code review
Co-authored-by: Ville Brofeldt <33...@users.noreply.github.com>
* Update oracle.py
* Update superset/db_engine_specs/oracle.py
Co-authored-by: Ville Brofeldt <33...@users.noreply.github.com>
Co-authored-by: Ville Brofeldt <33...@users.noreply.github.com>
---
superset/db_engine_specs/oracle.py | 15 ++++++++++++++-
1 file changed, 14 insertions(+), 1 deletion(-)
diff --git a/superset/db_engine_specs/oracle.py b/superset/db_engine_specs/oracle.py
index 813b150..01c06f4 100644
--- a/superset/db_engine_specs/oracle.py
+++ b/superset/db_engine_specs/oracle.py
@@ -15,7 +15,7 @@
# specific language governing permissions and limitations
# under the License.
from datetime import datetime
-from typing import Optional
+from typing import Any, List, Optional, Tuple
from superset.db_engine_specs.base import BaseEngineSpec, LimitMethod
from superset.utils import core as utils
@@ -57,3 +57,16 @@ class OracleEngineSpec(BaseEngineSpec):
@classmethod
def epoch_ms_to_dttm(cls) -> str:
return "TO_DATE('1970-01-01','YYYY-MM-DD')+(1/24/60/60/1000)*{col}"
+
+ @classmethod
+ def fetch_data(
+ cls, cursor: Any, limit: Optional[int] = None
+ ) -> List[Tuple[Any, ...]]:
+ """
+ :param cursor: Cursor instance
+ :param limit: Maximum number of rows to be returned by the cursor
+ :return: Result of query
+ """
+ if not cursor.description:
+ return []
+ return super().fetch_data(cursor, limit)
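The PEP 249 behavior the patch relies on can be shown with a stub cursor. `FakeCursor` below is hypothetical and exists only to illustrate the guard; only the `if not cursor.description: return []` line mirrors the actual change to `OracleEngineSpec.fetch_data`.

```python
from typing import Any, List, Optional, Tuple

class FakeCursor:
    """Minimal PEP 249-style cursor (hypothetical, for illustration)."""

    def __init__(self, description=None, rows=()):
        # Per PEP 249, `description` is None for statements that return no
        # rows (UPDATE/INSERT/...) or before any .execute*() call.
        self.description = description
        self._rows = list(rows)

    def fetchall(self) -> List[Tuple[Any, ...]]:
        return self._rows

def fetch_data(cursor: Any, limit: Optional[int] = None) -> List[Tuple[Any, ...]]:
    # Mirrors the guard added to OracleEngineSpec.fetch_data: a DML
    # statement leaves description unset, so return no rows instead of
    # letting the driver raise "not a query".
    if not cursor.description:
        return []
    rows = cursor.fetchall()
    return rows[:limit] if limit is not None else rows

print(fetch_data(FakeCursor()))  # [] (e.g. after an UPDATE)
print(fetch_data(FakeCursor(description=[("ID",)], rows=[(1,), (2,)])))  # [(1,), (2,)]
```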
[incubator-superset] 07/26: fix: support non-string groupbys for pie chart (#10493)
villebro pushed a commit to branch 0.37
in repository https://gitbox.apache.org/repos/asf/incubator-superset.git
commit d86e4e6135b619d65091305c9cfd2819fee05ce8
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Fri Jul 31 11:19:21 2020 +0300
fix: support non-string groupbys for pie chart (#10493)
* chore: add unit tests to pie chart
* refine logic for floats and nans and add more tests
---
superset/viz.py | 34 +++++++++++++++++++++++++++++++-
tests/viz_tests.py | 58 ++++++++++++++++++++++++++++++++++++++++++++++++++++++
2 files changed, 91 insertions(+), 1 deletion(-)
diff --git a/superset/viz.py b/superset/viz.py
index 3ce9434..77b228b 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -1544,11 +1544,43 @@ class DistributionPieViz(NVD3Viz):
is_timeseries = False
def get_data(self, df: pd.DataFrame) -> VizData:
+ def _label_aggfunc(labels: pd.Series) -> str:
+ """
+ Convert a single or multi column label into a single label, replacing
+ null values with `NULL_STRING` and joining multiple columns together
+ with a comma. Examples:
+
+ >>> _label_aggfunc(pd.Series(["abc"]))
+ 'abc'
+ >>> _label_aggfunc(pd.Series([1]))
+ '1'
+ >>> _label_aggfunc(pd.Series(["abc", "def"]))
+ 'abc, def'
+ >>> # note: integer floats are stripped of decimal digits
+ >>> _label_aggfunc(pd.Series([0.1, 2.0, 0.3]))
+ '0.1, 2, 0.3'
+ >>> _label_aggfunc(pd.Series([1, None, "abc", 0.8], dtype="object"))
+ '1, <NULL>, abc, 0.8'
+ """
+ label_list: List[str] = []
+ for label in labels:
+ if isinstance(label, str):
+ label_recast = label
+ elif label is None or isinstance(label, float) and math.isnan(label):
+ label_recast = NULL_STRING
+ elif isinstance(label, float) and label.is_integer():
+ label_recast = str(int(label))
+ else:
+ label_recast = str(label)
+ label_list.append(label_recast)
+
+ return ", ".join(label_list)
+
if df.empty:
return None
metric = self.metric_labels[0]
df = pd.DataFrame(
- {"x": df[self.groupby].agg(func=", ".join, axis=1), "y": df[metric]}
+ {"x": df[self.groupby].agg(func=_label_aggfunc, axis=1), "y": df[metric]}
)
df.sort_values(by="y", ascending=False, inplace=True)
return df.to_dict(orient="records")
diff --git a/tests/viz_tests.py b/tests/viz_tests.py
index 17e43d8..b76c95c 100644
--- a/tests/viz_tests.py
+++ b/tests/viz_tests.py
@@ -20,6 +20,7 @@ from datetime import datetime
import logging
from math import nan
from unittest.mock import Mock, patch
+from typing import Any, Dict, List, Set
import numpy as np
import pandas as pd
@@ -1322,3 +1323,60 @@ class TestPivotTableViz(SupersetTestCase):
viz.PivotTableViz.get_aggfunc("strcol", self.df, {"pandas_aggfunc": "min"})
== "min"
)
+
+
+class TestDistributionPieViz(SupersetTestCase):
+ base_df = pd.DataFrame(
+ data={
+ "intcol": [1, 2, 3, 4, None],
+ "floatcol": [1.0, 0.2, 0.3, 0.4, None],
+ "strcol_a": ["a", "a", "a", "a", None],
+ "strcol": ["a", "b", "c", None, "d"],
+ }
+ )
+
+ @staticmethod
+ def get_cols(data: List[Dict[str, Any]]) -> Set[str]:
+ return set([row["x"] for row in data])
+
+ def test_bool_groupby(self):
+ datasource = self.get_datasource_mock()
+ df = pd.DataFrame(data={"intcol": [1, 2, None], "boolcol": [True, None, False]})
+
+ pie_viz = viz.DistributionPieViz(
+ datasource, {"metrics": ["intcol"], "groupby": ["boolcol"]},
+ )
+ data = pie_viz.get_data(df)
+ assert self.get_cols(data) == {"True", "False", "<NULL>"}
+
+ def test_string_groupby(self):
+ datasource = self.get_datasource_mock()
+ pie_viz = viz.DistributionPieViz(
+ datasource, {"metrics": ["floatcol"], "groupby": ["strcol"]},
+ )
+ data = pie_viz.get_data(self.base_df)
+ assert self.get_cols(data) == {"<NULL>", "a", "b", "c", "d"}
+
+ def test_int_groupby(self):
+ datasource = self.get_datasource_mock()
+ pie_viz = viz.DistributionPieViz(
+ datasource, {"metrics": ["floatcol"], "groupby": ["intcol"]},
+ )
+ data = pie_viz.get_data(self.base_df)
+ assert self.get_cols(data) == {"<NULL>", "1", "2", "3", "4"}
+
+ def test_float_groupby(self):
+ datasource = self.get_datasource_mock()
+ pie_viz = viz.DistributionPieViz(
+ datasource, {"metrics": ["intcol"], "groupby": ["floatcol"]},
+ )
+ data = pie_viz.get_data(self.base_df)
+ assert self.get_cols(data) == {"<NULL>", "1", "0.2", "0.3", "0.4"}
+
+ def test_multi_groupby(self):
+ datasource = self.get_datasource_mock()
+ pie_viz = viz.DistributionPieViz(
+ datasource, {"metrics": ["floatcol"], "groupby": ["intcol", "strcol"]},
+ )
+ data = pie_viz.get_data(self.base_df)
+ assert self.get_cols(data) == {"1, a", "2, b", "3, c", "4, <NULL>", "<NULL>, d"}
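The labeling rules in `_label_aggfunc` can be exercised outside Superset. The sketch below restates the helper (with `NULL_STRING` hard-coded as an assumption, matching `superset.constants`) and applies it row-wise the same way `get_data` does:

```python
import math

import pandas as pd

NULL_STRING = "<NULL>"  # assumed value of superset.constants.NULL_STRING


def label_aggfunc(labels: pd.Series) -> str:
    """Join a row of groupby values into a single pie-slice label."""
    parts = []
    for label in labels:
        if isinstance(label, str):
            parts.append(label)
        elif label is None or (isinstance(label, float) and math.isnan(label)):
            parts.append(NULL_STRING)
        elif isinstance(label, float) and label.is_integer():
            # integer-valued floats are shown without decimal digits
            parts.append(str(int(label)))
        else:
            parts.append(str(label))
    return ", ".join(parts)


df = pd.DataFrame({"intcol": [1.0, None], "strcol": ["x", "y"]})
labels = df[["intcol", "strcol"]].agg(func=label_aggfunc, axis=1)
```

With `axis=1`, `DataFrame.agg` hands each row to the function as a Series, which is why a single label string comes back per row.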
[incubator-superset] 17/26: fix: dedup groupby in viz.py while preserving order (#10633)
commit a037a4718306da3432386873f6b013c25b1667da
Author: Maxime Beauchemin <ma...@gmail.com>
AuthorDate: Thu Aug 20 22:02:02 2020 -0700
fix: dedup groupby in viz.py while preserving order (#10633)
---
setup.cfg | 2 +-
superset/tasks/slack_util.py | 5 ++---
superset/viz.py | 5 +++--
3 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/setup.cfg b/setup.cfg
index c126a4a..e8505c9 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -38,7 +38,7 @@ combine_as_imports = true
include_trailing_comma = true
line_length = 88
known_first_party = superset
-known_third_party =alembic,apispec,backoff,bleach,cachelib,celery,click,colorama,contextlib2,croniter,cryptography,dataclasses,dateutil,flask,flask_appbuilder,flask_babel,flask_caching,flask_compress,flask_login,flask_migrate,flask_sqlalchemy,flask_talisman,flask_testing,flask_wtf,geohash,geopy,humanize,isodate,jinja2,markdown,markupsafe,marshmallow,msgpack,numpy,pandas,parameterized,parsedatetime,pathlib2,polyline,prison,pyarrow,pyhive,pytest,pytz,retry,selenium,setuptools,simplejson,sl [...]
+known_third_party =alembic,apispec,backoff,bleach,cachelib,celery,click,colorama,contextlib2,croniter,cryptography,dateutil,flask,flask_appbuilder,flask_babel,flask_caching,flask_compress,flask_login,flask_migrate,flask_sqlalchemy,flask_talisman,flask_testing,flask_wtf,geohash,geopy,humanize,isodate,jinja2,markdown,markupsafe,marshmallow,msgpack,numpy,pandas,parameterized,parsedatetime,pathlib2,polyline,prison,pyarrow,pyhive,pytest,pytz,retry,selenium,setuptools,simplejson,slack,sphinx_r [...]
multi_line_output = 3
order_by_type = false
diff --git a/superset/tasks/slack_util.py b/superset/tasks/slack_util.py
index ef647eb..08dcb16 100644
--- a/superset/tasks/slack_util.py
+++ b/superset/tasks/slack_util.py
@@ -18,15 +18,13 @@ import logging
from io import IOBase
from typing import cast, Union
+from flask import current_app
from retry.api import retry
from slack import WebClient
from slack.errors import SlackApiError
from slack.web.slack_response import SlackResponse
-from superset import app
-
# Globals
-config = app.config # type: ignore
logger = logging.getLogger("tasks.slack_util")
@@ -34,6 +32,7 @@ logger = logging.getLogger("tasks.slack_util")
def deliver_slack_msg(
slack_channel: str, subject: str, body: str, file: Union[str, IOBase]
) -> None:
+ config = current_app.config
client = WebClient(token=config["SLACK_API_TOKEN"], proxy=config["SLACK_PROXY"])
# files_upload returns SlackResponse as we run it in sync mode.
response = cast(
diff --git a/superset/viz.py b/superset/viz.py
index becdc4a..508d8aa 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -21,6 +21,7 @@ These objects represent the backend of all the visualizations that
Superset can render.
"""
import copy
+import dataclasses
import inspect
import logging
import math
@@ -42,7 +43,6 @@ from typing import (
Union,
)
-import dataclasses
import geohash
import numpy as np
import pandas as pd
@@ -322,7 +322,8 @@ class BaseViz:
gb = self.groupby
metrics = self.all_metrics or []
columns = form_data.get("columns") or []
- groupby = list(set(gb + columns))
+ # merge list and dedup while preserving order
+ groupby = list(OrderedDict.fromkeys(gb + columns))
is_timeseries = self.is_timeseries
if DTTM_ALIAS in groupby:
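The one-line fix above swaps a `set` for `OrderedDict.fromkeys`, which also deduplicates but keeps first-seen order, so the merged groupby column order no longer varies with hash randomization. A standalone sketch with hypothetical column names:

```python
from collections import OrderedDict

gb = ["country", "gender"]    # hypothetical groupby columns
columns = ["gender", "year"]  # hypothetical extra columns

# previous behavior: dedup via set, losing the original order
unordered = list(set(gb + columns))

# fixed behavior: dedup while preserving first-seen order
groupby = list(OrderedDict.fromkeys(gb + columns))
```

On Python 3.7+ a plain `dict.fromkeys` gives the same ordering guarantee, but `OrderedDict` makes the intent explicit.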
[incubator-superset] 10/26: fix: handle query exceptions gracefully (#10548)
commit fdc829d288a586f4ede55bef807ccff5b70185ef
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Fri Aug 7 17:37:40 2020 +0300
fix: handle query exceptions gracefully (#10548)
* fix: handle query exceptions gracefully
* add more recasts
* add test
* disable test for presto
* switch to SQLA error
---
superset/common/query_context.py | 6 ++-
superset/connectors/sqla/models.py | 85 ++++++++++++++++++++++++++++----------
superset/views/core.py | 7 ++--
superset/viz.py | 50 +++++++++++++++-------
tests/sqla_models_tests.py | 25 +++++++++++
5 files changed, 132 insertions(+), 41 deletions(-)
diff --git a/superset/common/query_context.py b/superset/common/query_context.py
index e602fbf..0d33f9c 100644
--- a/superset/common/query_context.py
+++ b/superset/common/query_context.py
@@ -27,6 +27,7 @@ from superset import app, cache, db, security_manager
from superset.common.query_object import QueryObject
from superset.connectors.base.models import BaseDatasource
from superset.connectors.connector_registry import ConnectorRegistry
+from superset.exceptions import QueryObjectValidationError
from superset.stats_logger import BaseStatsLogger
from superset.utils import core as utils
from superset.utils.core import DTTM_ALIAS
@@ -244,10 +245,13 @@ class QueryContext:
if not self.force:
stats_logger.incr("loaded_from_source_without_force")
is_loaded = True
+ except QueryObjectValidationError as ex:
+ error_message = str(ex)
+ status = utils.QueryStatus.FAILED
except Exception as ex: # pylint: disable=broad-except
logger.exception(ex)
if not error_message:
- error_message = "{}".format(ex)
+ error_message = str(ex)
status = utils.QueryStatus.FAILED
stacktrace = utils.get_stacktrace()
diff --git a/superset/connectors/sqla/models.py b/superset/connectors/sqla/models.py
index f571cbc..36a4c57 100644
--- a/superset/connectors/sqla/models.py
+++ b/superset/connectors/sqla/models.py
@@ -25,6 +25,7 @@ import sqlparse
from flask import escape, Markup
from flask_appbuilder import Model
from flask_babel import lazy_gettext as _
+from jinja2.exceptions import TemplateError
from sqlalchemy import (
and_,
asc,
@@ -40,7 +41,7 @@ from sqlalchemy import (
Table,
Text,
)
-from sqlalchemy.exc import CompileError
+from sqlalchemy.exc import CompileError, SQLAlchemyError
from sqlalchemy.orm import backref, Query, relationship, RelationshipProperty, Session
from sqlalchemy.orm.exc import NoResultFound
from sqlalchemy.schema import UniqueConstraint
@@ -51,7 +52,7 @@ from superset import app, db, is_feature_enabled, security_manager
from superset.connectors.base.models import BaseColumn, BaseDatasource, BaseMetric
from superset.constants import NULL_STRING
from superset.db_engine_specs.base import TimestampExpression
-from superset.exceptions import DatabaseNotFound
+from superset.exceptions import DatabaseNotFound, QueryObjectValidationError
from superset.jinja_context import (
BaseTemplateProcessor,
ExtraCache,
@@ -630,7 +631,15 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
if self.fetch_values_predicate:
tp = self.get_template_processor()
- qry = qry.where(text(tp.process_template(self.fetch_values_predicate)))
+ try:
+ qry = qry.where(text(tp.process_template(self.fetch_values_predicate)))
+ except TemplateError as ex:
+ raise QueryObjectValidationError(
+ _(
+ "Error in jinja expression in fetch values predicate: %(msg)s",
+ msg=ex.message,
+ )
+ )
engine = self.database.get_sqla_engine()
sql = "{}".format(qry.compile(engine, compile_kwargs={"literal_binds": True}))
@@ -680,7 +689,16 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
if self.sql:
from_sql = self.sql
if template_processor:
- from_sql = template_processor.process_template(from_sql)
+ try:
+ from_sql = template_processor.process_template(from_sql)
+ except TemplateError as ex:
+ raise QueryObjectValidationError(
+ _(
+ "Error in jinja expression in FROM clause: %(msg)s",
+ msg=ex.message,
+ )
+ )
+
from_sql = sqlparse.format(from_sql, strip_comments=True)
return TextAsFrom(sa.text(from_sql), []).alias("expr_qry")
return self.get_sqla_table()
@@ -726,10 +744,15 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
:returns: A list of SQL clauses to be ANDed together.
:rtype: List[str]
"""
- return [
- text("({})".format(template_processor.process_template(f.clause)))
- for f in security_manager.get_rls_filters(self)
- ]
+ try:
+ return [
+ text("({})".format(template_processor.process_template(f.clause)))
+ for f in security_manager.get_rls_filters(self)
+ ]
+ except TemplateError as ex:
+ raise QueryObjectValidationError(
+ _("Error in jinja expression in RLS filters: %(msg)s", msg=ex.message,)
+ )
def get_sqla_query( # pylint: disable=too-many-arguments,too-many-locals,too-many-branches,too-many-statements
self,
@@ -787,7 +810,7 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
metrics_by_name: Dict[str, SqlMetric] = {m.metric_name: m for m in self.metrics}
if not granularity and is_timeseries:
- raise Exception(
+ raise QueryObjectValidationError(
_(
"Datetime column not provided as part table configuration "
"and is required by this type of chart"
@@ -798,7 +821,7 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
and not columns
and (is_sip_38 or (not is_sip_38 and not groupby))
):
- raise Exception(_("Empty query?"))
+ raise QueryObjectValidationError(_("Empty query?"))
metrics_exprs: List[ColumnElement] = []
for metric in metrics:
if utils.is_adhoc_metric(metric):
@@ -807,7 +830,9 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
elif isinstance(metric, str) and metric in metrics_by_name:
metrics_exprs.append(metrics_by_name[metric].get_sqla_col())
else:
- raise Exception(_("Metric '%(metric)s' does not exist", metric=metric))
+ raise QueryObjectValidationError(
+ _("Metric '%(metric)s' does not exist", metric=metric)
+ )
if metrics_exprs:
main_metric_expr = metrics_exprs[0]
else:
@@ -954,7 +979,7 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
!= None
)
else:
- raise Exception(
+ raise QueryObjectValidationError(
_("Invalid filter operation type: %(op)s", op=op)
)
if config["ENABLE_ROW_LEVEL_SECURITY"]:
@@ -962,11 +987,27 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
if extras:
where = extras.get("where")
if where:
- where = template_processor.process_template(where)
+ try:
+ where = template_processor.process_template(where)
+ except TemplateError as ex:
+ raise QueryObjectValidationError(
+ _(
+ "Error in jinja expression in WHERE clause: %(msg)s",
+ msg=ex.message,
+ )
+ )
where_clause_and += [sa.text("({})".format(where))]
having = extras.get("having")
if having:
- having = template_processor.process_template(having)
+ try:
+ having = template_processor.process_template(having)
+ except TemplateError as ex:
+ raise QueryObjectValidationError(
+ _(
+ "Error in jinja expression in HAVING clause: %(msg)s",
+ msg=ex.message,
+ )
+ )
having_clause_and += [sa.text("({})".format(having))]
if granularity:
qry = qry.where(and_(*(time_filters + where_clause_and)))
@@ -1113,7 +1154,7 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
):
ob = metrics_by_name[timeseries_limit_metric].get_sqla_col()
else:
- raise Exception(
+ raise QueryObjectValidationError(
_("Metric '%(metric)s' does not exist", metric=timeseries_limit_metric)
)
@@ -1155,7 +1196,7 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
labels_expected = query_str_ext.labels_expected
if df is not None and not df.empty:
if len(df.columns) != len(labels_expected):
- raise Exception(
+ raise QueryObjectValidationError(
f"For {sql}, df.columns: {df.columns}"
f" differs from {labels_expected}"
)
@@ -1189,13 +1230,13 @@ class SqlaTable( # pylint: disable=too-many-public-methods,too-many-instance-at
"""Fetches the metadata for the table and merges it in"""
try:
table_ = self.get_sqla_table_object()
- except Exception as ex:
- logger.exception(ex)
- raise Exception(
+ except SQLAlchemyError:
+ raise QueryObjectValidationError(
_(
- "Table [{}] doesn't seem to exist in the specified database, "
- "couldn't fetch column information"
- ).format(self.table_name)
+ "Table %(table)s doesn't seem to exist in the specified database, "
+ "couldn't fetch column information",
+ table=self.table_name,
+ )
)
metrics = []
diff --git a/superset/views/core.py b/superset/views/core.py
index e7e23b7..3c990b3 100755
--- a/superset/views/core.py
+++ b/superset/views/core.py
@@ -32,6 +32,7 @@ from flask_appbuilder.models.sqla.interface import SQLAInterface
from flask_appbuilder.security.decorators import has_access, has_access_api
from flask_appbuilder.security.sqla import models as ab_models
from flask_babel import gettext as __, lazy_gettext as _
+from jinja2.exceptions import TemplateError
from sqlalchemy import and_, or_, select
from sqlalchemy.engine.url import make_url
from sqlalchemy.exc import (
@@ -535,7 +536,7 @@ class Superset(BaseSupersetView): # pylint: disable=too-many-public-methods
return self.generate_json(viz_obj, response_type)
except SupersetException as ex:
- return json_error_response(utils.error_msg_from_exception(ex))
+ return json_error_response(utils.error_msg_from_exception(ex), 400)
@event_logger.log_this
@has_access
@@ -2288,10 +2289,10 @@ class Superset(BaseSupersetView): # pylint: disable=too-many-public-methods
rendered_query = template_processor.process_template(
query.sql, **template_params
)
- except Exception as ex: # pylint: disable=broad-except
+ except TemplateError as ex:
error_msg = utils.error_msg_from_exception(ex)
return json_error_response(
- f"Query {query_id}: Template rendering failed: {error_msg}"
+ f"Query {query_id}: Template syntax error: {error_msg}"
)
# Limit is not applied to the CTA queries if SQLLAB_CTAS_NO_LIMIT flag is set
diff --git a/superset/viz.py b/superset/viz.py
index 77b228b..c67adf5 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -329,13 +329,17 @@ class BaseViz:
# default order direction
order_desc = form_data.get("order_desc", True)
- since, until = utils.get_since_until(
- relative_start=relative_start,
- relative_end=relative_end,
- time_range=form_data.get("time_range"),
- since=form_data.get("since"),
- until=form_data.get("until"),
- )
+ try:
+ since, until = utils.get_since_until(
+ relative_start=relative_start,
+ relative_end=relative_end,
+ time_range=form_data.get("time_range"),
+ since=form_data.get("since"),
+ until=form_data.get("until"),
+ )
+ except ValueError as ex:
+ raise QueryObjectValidationError(str(ex))
+
time_shift = form_data.get("time_shift", "")
self.time_shift = utils.parse_past_timedelta(time_shift)
from_dttm = None if since is None else (since - self.time_shift)
@@ -475,6 +479,16 @@ class BaseViz:
if not self.force:
stats_logger.incr("loaded_from_source_without_force")
is_loaded = True
+ except QueryObjectValidationError as ex:
+ error = dataclasses.asdict(
+ SupersetError(
+ message=str(ex),
+ level=ErrorLevel.ERROR,
+ error_type=SupersetErrorType.VIZ_GET_DF_ERROR,
+ )
+ )
+ self.errors.append(error)
+ self.status = utils.QueryStatus.FAILED
except Exception as ex:
logger.exception(ex)
@@ -888,13 +902,16 @@ class CalHeatmapViz(BaseViz):
values[str(v / 10 ** 9)] = obj.get(metric)
data[metric] = values
- start, end = utils.get_since_until(
- relative_start=relative_start,
- relative_end=relative_end,
- time_range=form_data.get("time_range"),
- since=form_data.get("since"),
- until=form_data.get("until"),
- )
+ try:
+ start, end = utils.get_since_until(
+ relative_start=relative_start,
+ relative_end=relative_end,
+ time_range=form_data.get("time_range"),
+ since=form_data.get("since"),
+ until=form_data.get("until"),
+ )
+ except ValueError as ex:
+ raise QueryObjectValidationError(str(ex))
if not start or not end:
raise QueryObjectValidationError(
"Please provide both time bounds (Since and Until)"
@@ -1285,7 +1302,10 @@ class NVD3TimeSeriesViz(NVD3Viz):
for option in time_compare:
query_object = self.query_obj()
- delta = utils.parse_past_timedelta(option)
+ try:
+ delta = utils.parse_past_timedelta(option)
+ except ValueError as ex:
+ raise QueryObjectValidationError(str(ex))
query_object["inner_from_dttm"] = query_object["from_dttm"]
query_object["inner_to_dttm"] = query_object["to_dttm"]
diff --git a/tests/sqla_models_tests.py b/tests/sqla_models_tests.py
index 4222cdb..c5b634b 100644
--- a/tests/sqla_models_tests.py
+++ b/tests/sqla_models_tests.py
@@ -17,10 +17,12 @@
# isort:skip_file
from typing import Any, Dict, NamedTuple, List, Tuple, Union
from unittest.mock import patch
+import pytest
import tests.test_app
from superset.connectors.sqla.models import SqlaTable, TableColumn
from superset.db_engine_specs.druid import DruidEngineSpec
+from superset.exceptions import QueryObjectValidationError
from superset.models.core import Database
from superset.utils.core import DbColumnType, get_example_database, FilterOperator
@@ -162,3 +164,26 @@ class TestDatabaseModel(SupersetTestCase):
sqla_query = table.get_sqla_query(**query_obj)
sql = table.database.compile_sqla_query(sqla_query.sqla_query)
self.assertIn(filter_.expected, sql)
+
+ def test_incorrect_jinja_syntax_raises_correct_exception(self):
+ query_obj = {
+ "granularity": None,
+ "from_dttm": None,
+ "to_dttm": None,
+ "groupby": ["user"],
+ "metrics": [],
+ "is_timeseries": False,
+ "filter": [],
+ "extras": {},
+ }
+
+ # Table with Jinja callable.
+ table = SqlaTable(
+ table_name="test_table",
+ sql="SELECT '{{ abcd xyz + 1 ASDF }}' as user",
+ database=get_example_database(),
+ )
+ # TODO(villebro): make it work with presto
+ if get_example_database().backend != "presto":
+ with pytest.raises(QueryObjectValidationError):
+ table.get_sqla_query(**query_obj)
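The recurring pattern in this commit is catching Jinja's `TemplateError` at each templating call site and re-raising it as `QueryObjectValidationError`, so a bad Jinja expression in a WHERE/HAVING clause or virtual-table SQL surfaces as a readable 400 message rather than a logged stack trace. A minimal sketch of the pattern; the exception class here is a local stand-in, not the actual `superset.exceptions` import:

```python
from jinja2 import Template
from jinja2.exceptions import TemplateError


class QueryObjectValidationError(Exception):
    """Local stand-in for superset.exceptions.QueryObjectValidationError."""


def render_sql(sql: str, **context: object) -> str:
    # Compiling and rendering the template can each raise TemplateError,
    # so both happen inside the try block.
    try:
        return Template(sql).render(**context)
    except TemplateError as ex:
        raise QueryObjectValidationError(
            f"Error in jinja expression: {ex.message}"
        )
```

The narrow `except` is the point: genuine bugs still propagate as unexpected exceptions, while user-authored template mistakes become validation errors.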
[incubator-superset] 22/26: refactor(database): use SupersetResultSet on SqlaTable.get_df() (#10707)
commit 634a90a9268be51376771b4c62dd044fb2b88485
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Fri Aug 28 21:12:03 2020 +0300
refactor(database): use SupersetResultSet on SqlaTable.get_df() (#10707)
* refactor(database): use SupersetResultSet on SqlaTable.get_df()
* lint
* change cypress test
---
.../explore/visualizations/table.test.ts | 21 +++++++++++----------
superset/db_engine_specs/base.py | 6 ++++--
superset/db_engine_specs/bigquery.py | 4 +++-
superset/db_engine_specs/exasol.py | 6 ++++--
superset/db_engine_specs/hive.py | 6 ++++--
superset/db_engine_specs/mssql.py | 4 +++-
superset/db_engine_specs/postgres.py | 4 +++-
superset/models/core.py | 14 ++++++--------
superset/typing.py | 4 ++--
superset/viz.py | 1 -
10 files changed, 40 insertions(+), 30 deletions(-)
diff --git a/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts b/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
index c7015d9..77f9c6f 100644
--- a/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
+++ b/superset-frontend/cypress-base/cypress/integration/explore/visualizations/table.test.ts
@@ -29,6 +29,16 @@ import readResponseBlob from '../../../utils/readResponseBlob';
describe('Visualization > Table', () => {
const VIZ_DEFAULTS = { ...FORM_DATA_DEFAULTS, viz_type: 'table' };
+ const PERCENT_METRIC = {
+ expressionType: 'SQL',
+ sqlExpression: 'CAST(SUM(sum_girls)+AS+FLOAT)/SUM(num)',
+ column: null,
+ aggregate: null,
+ hasCustomLabel: true,
+ label: 'Girls',
+ optionName: 'metric_6qwzgc8bh2v_zox7hil1mzs',
+ };
+
beforeEach(() => {
cy.login();
cy.server();
@@ -119,7 +129,7 @@ describe('Visualization > Table', () => {
it('Test table with percent metrics and groupby', () => {
const formData = {
...VIZ_DEFAULTS,
- percent_metrics: NUM_METRIC,
+ percent_metrics: PERCENT_METRIC,
metrics: [],
groupby: ['name'],
};
@@ -214,15 +224,6 @@ describe('Visualization > Table', () => {
});
it('Tests table number formatting with % in metric name', () => {
- const PERCENT_METRIC = {
- expressionType: 'SQL',
- sqlExpression: 'CAST(SUM(sum_girls)+AS+FLOAT)/SUM(num)',
- column: null,
- aggregate: null,
- hasCustomLabel: true,
- label: 'Girls',
- optionName: 'metric_6qwzgc8bh2v_zox7hil1mzs',
- };
const formData = {
...VIZ_DEFAULTS,
percent_metrics: PERCENT_METRIC,
diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py
index fb8a4e0..45fba3d 100644
--- a/superset/db_engine_specs/base.py
+++ b/superset/db_engine_specs/base.py
@@ -296,7 +296,9 @@ class BaseEngineSpec: # pylint: disable=too-many-public-methods
return select_exprs
@classmethod
- def fetch_data(cls, cursor: Any, limit: int) -> List[Tuple[Any, ...]]:
+ def fetch_data(
+ cls, cursor: Any, limit: Optional[int] = None
+ ) -> List[Tuple[Any, ...]]:
"""
:param cursor: Cursor instance
@@ -305,7 +307,7 @@ class BaseEngineSpec: # pylint: disable=too-many-public-methods
"""
if cls.arraysize:
cursor.arraysize = cls.arraysize
- if cls.limit_method == LimitMethod.FETCH_MANY:
+ if cls.limit_method == LimitMethod.FETCH_MANY and limit:
return cursor.fetchmany(limit)
return cursor.fetchall()
diff --git a/superset/db_engine_specs/bigquery.py b/superset/db_engine_specs/bigquery.py
index f45145f..b3bc13a 100644
--- a/superset/db_engine_specs/bigquery.py
+++ b/superset/db_engine_specs/bigquery.py
@@ -84,7 +84,9 @@ class BigQueryEngineSpec(BaseEngineSpec):
return None
@classmethod
- def fetch_data(cls, cursor: Any, limit: int) -> List[Tuple[Any, ...]]:
+ def fetch_data(
+ cls, cursor: Any, limit: Optional[int] = None
+ ) -> List[Tuple[Any, ...]]:
data = super().fetch_data(cursor, limit)
# Support type BigQuery Row, introduced here PR #4071
# google.cloud.bigquery.table.Row
diff --git a/superset/db_engine_specs/exasol.py b/superset/db_engine_specs/exasol.py
index 23449f0..08c92d6 100644
--- a/superset/db_engine_specs/exasol.py
+++ b/superset/db_engine_specs/exasol.py
@@ -14,7 +14,7 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
-from typing import Any, List, Tuple
+from typing import Any, List, Optional, Tuple
from superset.db_engine_specs.base import BaseEngineSpec
@@ -39,7 +39,9 @@ class ExasolEngineSpec(BaseEngineSpec): # pylint: disable=abstract-method
}
@classmethod
- def fetch_data(cls, cursor: Any, limit: int) -> List[Tuple[Any, ...]]:
+ def fetch_data(
+ cls, cursor: Any, limit: Optional[int] = None
+ ) -> List[Tuple[Any, ...]]:
data = super().fetch_data(cursor, limit)
# Lists of `pyodbc.Row` need to be unpacked further
return cls.pyodbc_rows_to_tuples(data)
diff --git a/superset/db_engine_specs/hive.py b/superset/db_engine_specs/hive.py
index 4cafaaf..95c7e67 100644
--- a/superset/db_engine_specs/hive.py
+++ b/superset/db_engine_specs/hive.py
@@ -108,7 +108,9 @@ class HiveEngineSpec(PrestoEngineSpec):
return BaseEngineSpec.get_all_datasource_names(database, datasource_type)
@classmethod
- def fetch_data(cls, cursor: Any, limit: int) -> List[Tuple[Any, ...]]:
+ def fetch_data(
+ cls, cursor: Any, limit: Optional[int] = None
+ ) -> List[Tuple[Any, ...]]:
import pyhive
from TCLIService import ttypes
@@ -116,7 +118,7 @@ class HiveEngineSpec(PrestoEngineSpec):
if state.operationState == ttypes.TOperationState.ERROR_STATE:
raise Exception("Query error", state.errorMessage)
try:
- return super(HiveEngineSpec, cls).fetch_data(cursor, limit)
+ return super().fetch_data(cursor, limit)
except pyhive.exc.ProgrammingError:
return []
diff --git a/superset/db_engine_specs/mssql.py b/superset/db_engine_specs/mssql.py
index a8a6b9a..4ba1f5c 100644
--- a/superset/db_engine_specs/mssql.py
+++ b/superset/db_engine_specs/mssql.py
@@ -67,7 +67,9 @@ class MssqlEngineSpec(BaseEngineSpec):
return None
@classmethod
- def fetch_data(cls, cursor: Any, limit: int) -> List[Tuple[Any, ...]]:
+ def fetch_data(
+ cls, cursor: Any, limit: Optional[int] = None
+ ) -> List[Tuple[Any, ...]]:
data = super().fetch_data(cursor, limit)
# Lists of `pyodbc.Row` need to be unpacked further
return cls.pyodbc_rows_to_tuples(data)
diff --git a/superset/db_engine_specs/postgres.py b/superset/db_engine_specs/postgres.py
index dceac26..7d08dd5 100644
--- a/superset/db_engine_specs/postgres.py
+++ b/superset/db_engine_specs/postgres.py
@@ -52,7 +52,9 @@ class PostgresBaseEngineSpec(BaseEngineSpec):
}
@classmethod
- def fetch_data(cls, cursor: Any, limit: int) -> List[Tuple[Any, ...]]:
+ def fetch_data(
+ cls, cursor: Any, limit: Optional[int] = None
+ ) -> List[Tuple[Any, ...]]:
cursor.tzinfo_factory = FixedOffsetTimezone
if not cursor.description:
return []
diff --git a/superset/models/core.py b/superset/models/core.py
index 7660150..775a9f0 100755
--- a/superset/models/core.py
+++ b/superset/models/core.py
@@ -57,6 +57,7 @@ from superset.db_engine_specs.base import TimeGrain
from superset.models.dashboard import Dashboard
from superset.models.helpers import AuditMixinNullable, ImportMixin
from superset.models.tags import DashboardUpdater, FavStarUpdater
+from superset.result_set import SupersetResultSet
from superset.utils import cache as cache_util, core as utils
config = app.config
@@ -392,21 +393,18 @@ class Database(
_log_query(sqls[-1])
self.db_engine_spec.execute(cursor, sqls[-1])
- if cursor.description is not None:
- columns = [col_desc[0] for col_desc in cursor.description]
- else:
- columns = []
-
- df = pd.DataFrame.from_records(
- data=list(cursor.fetchall()), columns=columns, coerce_float=True
+ data = self.db_engine_spec.fetch_data(cursor)
+ result_set = SupersetResultSet(
+ data, cursor.description, self.db_engine_spec
)
-
+ df = result_set.to_pandas_df()
if mutator:
mutator(df)
for k, v in df.dtypes.items():
if v.type == numpy.object_ and needs_conversion(df[k]):
df[k] = df[k].apply(utils.json_dumps_w_dates)
+
return df
def compile_sqla_query(self, qry: Select, schema: Optional[str] = None) -> str:
diff --git a/superset/typing.py b/superset/typing.py
index e238000..6f1fa2e 100644
--- a/superset/typing.py
+++ b/superset/typing.py
@@ -14,7 +14,7 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
-from typing import Any, Callable, Dict, List, Optional, Tuple, Union
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
from flask import Flask
from flask_caching import Cache
@@ -25,7 +25,7 @@ DbapiDescriptionRow = Tuple[
str, str, Optional[str], Optional[str], Optional[int], Optional[int], bool
]
DbapiDescription = Union[List[DbapiDescriptionRow], Tuple[DbapiDescriptionRow, ...]]
-DbapiResult = List[Union[List[Any], Tuple[Any, ...]]]
+DbapiResult = Sequence[Union[List[Any], Tuple[Any, ...]]]
FilterValue = Union[float, int, str]
FilterValues = Union[FilterValue, List[FilterValue], Tuple[FilterValue]]
FormData = Dict[str, Any]
diff --git a/superset/viz.py b/superset/viz.py
index df1111a..1db8b32 100644
--- a/superset/viz.py
+++ b/superset/viz.py
@@ -26,7 +26,6 @@ import inspect
import logging
import math
import re
-import uuid
from collections import defaultdict, OrderedDict
from datetime import datetime, timedelta
from itertools import product
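Across the engine specs above, `fetch_data` gains an optional `limit`, which is what lets `Database.get_df` route through the engine spec (and therefore `SupersetResultSet`) without committing to a row cap. A sketch of the base-class logic after this change, with a hypothetical fake cursor standing in for a real DB-API one:

```python
from typing import Any, List, Optional, Tuple

FETCH_MANY = "fetch_many"  # mirrors LimitMethod.FETCH_MANY


def fetch_data(
    cursor: Any, limit: Optional[int] = None, limit_method: str = FETCH_MANY
) -> List[Tuple[Any, ...]]:
    # Use fetchmany() only when the limit method applies AND a limit was
    # given; with limit=None (the new default) all rows are fetched.
    if limit_method == FETCH_MANY and limit:
        return cursor.fetchmany(limit)
    return cursor.fetchall()


class FakeCursor:
    """Hypothetical DB-API cursor over an in-memory row list."""

    def __init__(self, rows: List[Tuple[Any, ...]]) -> None:
        self.rows = rows

    def fetchmany(self, size: int) -> List[Tuple[Any, ...]]:
        return self.rows[:size]

    def fetchall(self) -> List[Tuple[Any, ...]]:
        return self.rows
```

Engines that limit via the query text (e.g. a `LIMIT` clause) simply never hit the `fetchmany` branch.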
[incubator-superset] 11/26: fix: embedded chart height (#10551)
commit 6744bab42616d298e2577fb61fb108dbbef9bf84
Author: Erik Ritter <er...@airbnb.com>
AuthorDate: Fri Aug 7 13:53:18 2020 -0700
fix: embedded chart height (#10551)
---
superset-frontend/src/explore/App.jsx | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/superset-frontend/src/explore/App.jsx b/superset-frontend/src/explore/App.jsx
index 0d3d55a..90b3494 100644
--- a/superset-frontend/src/explore/App.jsx
+++ b/superset-frontend/src/explore/App.jsx
@@ -33,10 +33,10 @@ setupPlugins();
const App = ({ store }) => (
<Provider store={store}>
<ThemeProvider theme={supersetTheme}>
- <div>
+ <>
<ExploreViewContainer />
<ToastPresenter />
- </div>
+ </>
</ThemeProvider>
</Provider>
);
[incubator-superset] 20/26: feat(row-level-security): add hook for customizing form dropdowns (#10683)
commit 7a67a28d4332772eeb2e19a221f0ea970056bf84
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Wed Aug 26 11:55:35 2020 +0300
feat(row-level-security): add hook for customizing form dropdowns (#10683)
---
superset/config.py | 10 ++++++++++
superset/connectors/sqla/views.py | 3 +++
2 files changed, 13 insertions(+)
diff --git a/superset/config.py b/superset/config.py
index b8dc1d0..be7631a 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -846,6 +846,16 @@ TALISMAN_CONFIG = {
# a custom security config could potentially give access to setting filters on
# tables that users do not have access to.
ENABLE_ROW_LEVEL_SECURITY = False
+# It is possible to customize which tables and roles are featured in the RLS
+# dropdown. When set, this dict is assigned to `add_form_query_rel_fields` and
+# `edit_form_query_rel_fields` on `RowLevelSecurityFiltersModelView`. Example:
+#
+# from flask_appbuilder.models.sqla import filters
+# RLS_FORM_QUERY_REL_FIELDS = {
+#     "roles": [["name", filters.FilterStartsWith, "RlsRole"]],
+#     "tables": [["table_name", filters.FilterContains, "rls"]]
+# }
+RLS_FORM_QUERY_REL_FIELDS: Optional[Dict[str, List[List[Any]]]] = None
#
# Flask session cookie options
diff --git a/superset/connectors/sqla/views.py b/superset/connectors/sqla/views.py
index 5f3466f..8cca0d6 100644
--- a/superset/connectors/sqla/views.py
+++ b/superset/connectors/sqla/views.py
@@ -263,6 +263,9 @@ class RowLevelSecurityFiltersModelView( # pylint: disable=too-many-ancestors
"creator": _("Creator"),
"modified": _("Modified"),
}
+ if app.config["RLS_FORM_QUERY_REL_FIELDS"]:
+ add_form_query_rel_fields = app.config["RLS_FORM_QUERY_REL_FIELDS"]
+ edit_form_query_rel_fields = add_form_query_rel_fields
class TableModelView( # pylint: disable=too-many-ancestors
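The view change above applies the optional config only when it is truthy, reusing the same value for both the add and edit forms. A minimal stand-alone sketch of that override pattern, using plain dicts so it runs without flask_appbuilder (the function and variable names are illustrative, not Superset's):

```python
from typing import Any, Dict, List, Optional

def resolve_rel_fields(
    config_value: Optional[Dict[str, List[List[Any]]]],
    default: Dict[str, List[List[Any]]],
) -> Dict[str, List[List[Any]]]:
    # Mirrors the view logic: a truthy config entry replaces the default,
    # and the same mapping is used for both the add and edit forms.
    return config_value if config_value else default

default_fields: Dict[str, List[List[Any]]] = {}
custom = {"roles": [["name", "FilterStartsWith", "RlsRole"]]}
print(resolve_rel_fields(None, default_fields))    # falls back to default
print(resolve_rel_fields(custom, default_fields))  # config value wins
```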
[incubator-superset] 18/26: fix(jinja): extract form_data from json body (#10684)
commit fbe6b29d0f60b9392c8a6bc61657b44400b3fd7a
Author: Ville Brofeldt <33...@users.noreply.github.com>
AuthorDate: Fri Aug 28 21:26:07 2020 +0300
fix(jinja): extract form_data from json body (#10684)
* fix(jinja): extract form_data from json body
* add test
* disable test for presto
---
superset/views/utils.py | 8 ++++++++
tests/charts/api_tests.py | 20 +++++++++++++++++++-
tests/datasource_tests.py | 2 --
3 files changed, 27 insertions(+), 3 deletions(-)
diff --git a/superset/views/utils.py b/superset/views/utils.py
index 2a8b2cc..7e50800 100644
--- a/superset/views/utils.py
+++ b/superset/views/utils.py
@@ -116,8 +116,16 @@ def get_form_data(
slice_id: Optional[int] = None, use_slice_data: bool = False
) -> Tuple[Dict[str, Any], Optional[Slice]]:
form_data = {}
+ # chart data API requests are JSON
+ request_json_data = (
+ request.json["queries"][0]
+ if request.is_json and "queries" in request.json
+ else None
+ )
request_form_data = request.form.get("form_data")
request_args_data = request.args.get("form_data")
+ if request_json_data:
+ form_data.update(request_json_data)
if request_form_data:
form_data.update(json.loads(request_form_data))
# request params can overwrite the body
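The merge order introduced in `get_form_data` above can be sketched without Flask: the JSON body's first query is applied first, then the form field, then the URL args, so later sources overwrite earlier ones. The helper name `merge_form_data` is ours, and the payload values are made up for illustration.

```python
import json
from typing import Any, Dict, Optional

def merge_form_data(
    json_query: Optional[Dict[str, Any]],
    form_field: Optional[str],
    args_field: Optional[str],
) -> Dict[str, Any]:
    """Precedence, lowest to highest: JSON body < form field < URL args."""
    form_data: Dict[str, Any] = {}
    if json_query:
        form_data.update(json_query)
    if form_field:
        form_data.update(json.loads(form_field))
    if args_field:
        form_data.update(json.loads(args_field))
    return form_data

merged = merge_form_data(
    {"granularity": "ds", "limit": 10},  # from the JSON body's queries[0]
    '{"limit": 20}',                     # from request.form["form_data"]
    '{"limit": 30}',                     # from request.args["form_data"]
)
print(merged)  # {'granularity': 'ds', 'limit': 30}
```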
diff --git a/tests/charts/api_tests.py b/tests/charts/api_tests.py
index c115b1d..0ddc018 100644
--- a/tests/charts/api_tests.py
+++ b/tests/charts/api_tests.py
@@ -767,7 +767,7 @@ class TestChartApi(SupersetTestCase, ApiOwnersTestCaseMixin):
self.login(username="admin")
table = self.get_table_by_name("birth_names")
request_payload = get_query_context(table.name, table.id, table.type)
- request_payload["result_type"] = "query"
+ request_payload["result_type"] = utils.ChartDataResultType.QUERY
rv = self.post_assert_metric(CHART_DATA_URI, request_payload, "data")
self.assertEqual(rv.status_code, 200)
@@ -869,3 +869,21 @@ class TestChartApi(SupersetTestCase, ApiOwnersTestCaseMixin):
self.assertEqual(rv.status_code, 200)
data = json.loads(rv.data.decode("utf-8"))
self.assertEqual(data["count"], 6)
+
+ def test_chart_data_jinja_filter_request(self):
+ """
+ Chart data API: Ensure request referencing filters via jinja renders a correct query
+ """
+ self.login(username="admin")
+ table = self.get_table_by_name("birth_names")
+ request_payload = get_query_context(table.name, table.id, table.type)
+ request_payload["result_type"] = utils.ChartDataResultType.QUERY
+ request_payload["queries"][0]["filters"] = [
+ {"col": "gender", "op": "==", "val": "boy"}
+ ]
+ request_payload["queries"][0]["extras"][
+ "where"
+ ] = "('boy' = '{{ filter_values('gender', 'xyz' )[0] }}')"
+ rv = self.post_assert_metric(CHART_DATA_URI, request_payload, "data")
+ response_payload = json.loads(rv.data.decode("utf-8"))
+ result = response_payload["result"][0]["query"]
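The test change above swaps the literal `"query"` for `utils.ChartDataResultType.QUERY`. A small sketch of why that is payload-compatible, assuming (as is common in Superset's utils) that the enum mixes in `str`, so members compare and JSON-serialize as their plain-string values; this stand-in class is ours, not the real one:

```python
import json
from enum import Enum

# Illustrative stand-in for utils.ChartDataResultType.
class ChartDataResultType(str, Enum):
    FULL = "full"
    QUERY = "query"

# Using the enum member instead of the literal keeps the serialized
# payload identical while making typos an attribute error at test time.
payload = {"result_type": ChartDataResultType.QUERY}
print(json.dumps(payload))  # {"result_type": "query"}
```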
diff --git a/tests/datasource_tests.py b/tests/datasource_tests.py
index b0bae9d..5fd81c0 100644
--- a/tests/datasource_tests.py
+++ b/tests/datasource_tests.py
@@ -18,8 +18,6 @@
import json
from copy import deepcopy
-from superset.utils.core import get_or_create_db
-
from .base_tests import SupersetTestCase
from .fixtures.datasource import datasource_post
[incubator-superset] 04/26: feat: make screenshot timeout configurable (#10517)
commit 8619898554e4fc9300d54c0839035697922eadc1
Author: Jason Davis <32...@users.noreply.github.com>
AuthorDate: Tue Aug 4 17:16:31 2020 -0700
feat: make screenshot timeout configurable (#10517)
* made screenshot timeout configurable
* added default value to config and refactored use
* black
* updated config comment
* moves config variables to thumbnail section
Co-authored-by: Jason Davis <@dropbox.com>
---
superset/config.py | 6 ++++++
superset/utils/screenshots.py | 8 ++++----
2 files changed, 10 insertions(+), 4 deletions(-)
diff --git a/superset/config.py b/superset/config.py
index 74e7fec..b8dc1d0 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -335,6 +335,12 @@ GET_FEATURE_FLAGS_FUNC: Optional[Callable[[Dict[str, bool]], Dict[str, bool]]] =
THUMBNAIL_SELENIUM_USER = "Admin"
THUMBNAIL_CACHE_CONFIG: CacheConfig = {"CACHE_TYPE": "null"}
+# Used for thumbnails and other APIs: time in seconds before selenium
+# times out when trying to locate an element on the page, and when waiting
+# for that element to finish loading, for an alert screenshot.
+SCREENSHOT_LOCATE_WAIT = 10
+SCREENSHOT_LOAD_WAIT = 60
+
# ---------------------------------------------------
# Image and file configuration
# ---------------------------------------------------
diff --git a/superset/utils/screenshots.py b/superset/utils/screenshots.py
index 3c7519a..05c7518 100644
--- a/superset/utils/screenshots.py
+++ b/superset/utils/screenshots.py
@@ -159,11 +159,11 @@ class AuthWebDriverProxy:
time.sleep(SELENIUM_HEADSTART)
try:
logger.debug("Wait for the presence of %s", element_name)
- element = WebDriverWait(driver, 10).until(
- EC.presence_of_element_located((By.CLASS_NAME, element_name))
- )
+ element = WebDriverWait(
+ driver, current_app.config["SCREENSHOT_LOCATE_WAIT"]
+ ).until(EC.presence_of_element_located((By.CLASS_NAME, element_name)))
logger.debug("Wait for .loading to be done")
- WebDriverWait(driver, 60).until_not(
+ WebDriverWait(driver, current_app.config["SCREENSHOT_LOAD_WAIT"]).until_not(
EC.presence_of_all_elements_located((By.CLASS_NAME, "loading"))
)
logger.info("Taking a PNG screenshot")
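The diff above replaces the hard-coded `10` and `60` second waits with the new config keys. The semantics of `WebDriverWait` can be sketched in plain Python as a poll-until-timeout loop; the `wait_until` helper below is illustrative, not Superset or selenium code, and only the constant names mirror the new config keys.

```python
import time
from typing import Callable

# Defaults mirroring the new config keys added in superset/config.py.
SCREENSHOT_LOCATE_WAIT = 10
SCREENSHOT_LOAD_WAIT = 60

def wait_until(condition: Callable[[], bool], timeout: float, poll: float = 0.05) -> bool:
    """Return True as soon as condition() holds, False once timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poll)
    return False

# A condition that holds immediately succeeds well within the locate
# timeout; an impossible one gives up after its (short) timeout.
print(wait_until(lambda: True, SCREENSHOT_LOCATE_WAIT))   # True
print(wait_until(lambda: False, timeout=0.1))             # False
```

Making the two timeouts configurable matters because slow dashboards can legitimately exceed the old hard-coded 60-second load wait.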