Posted to commits@superset.apache.org by el...@apache.org on 2023/07/07 22:41:30 UTC
[superset] branch 2.1 updated (e0bc2391b7 -> ccd456679e)
This is an automated email from the ASF dual-hosted git repository.
elizabeth pushed a change to branch 2.1
in repository https://gitbox.apache.org/repos/asf/superset.git
omit e0bc2391b7 update changelog
omit 73d3b7ab9a fix: Native time range filter in legacy charts (#23865)
omit 8a0b66d5e5 pass force to reload data
omit c137b5428b update changelog
omit 3f20b9ebab update package version
omit 801389f2f1 update changelog
omit 7247b9bb07 merge in fix with migration (#24314)
omit 5f21e7385f fix: handle temporal columns in presto partitions (#24054)
omit cfb4d27d8c lint
omit 9e6c9d2aa3 fix: handle comments in `has_table_query` (#23882)
omit a25347c113 fix: enable strong session protection by default (#24256)
new 60a1652be2 fix: handle comments in `has_table_query` (#23882)
new 5df0b7ad57 lint
new 75be3dd7b4 fix: handle temporal columns in presto partitions (#24054)
new 9abe28bc09 merge in fix with migration (#24314)
new 685760f110 update changelog
new e892c0a8d2 update package version
new 542effab81 update changelog
new 3c44e6d639 pass force to reload data
new 7155dcd5df fix: Native time range filter in legacy charts (#23865)
new 128751b80f update changelog
new a34da923fe fix: Select all issue with "Dynamically search all filter values" in FilterBar (#23400)
new ccd456679e fix: Filter values are not updating when dependencies are set (#23566)
This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version. This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:
* -- * -- B -- O -- O -- O (e0bc2391b7)
           \
            N -- N -- N refs/heads/2.1 (ccd456679e)
You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.
Any revisions marked "omit" are not gone; other references still
refer to them. Any revisions marked "discard" are gone forever.
The 12 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
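The omit/new classification above is essentially a set difference between the histories of the old and new branch tips: anything reachable only from the old tip is "omit", anything reachable only from the new tip is "new", and the shared base B appears in neither list. A minimal, illustrative Python sketch of that bookkeeping (this is not the actual gitbox tooling; the revision names are placeholders):

```python
def classify_revisions(old_history, new_history):
    """Split revisions into 'omit' (dropped by the force push) and
    'new' (introduced by it), preserving listing order."""
    old_set, new_set = set(old_history), set(new_history)
    return {
        "omit": [rev for rev in old_history if rev not in new_set],
        "new": [rev for rev in new_history if rev not in old_set],
    }

# Shared history (the common base "B" and earlier) appears in neither list.
result = classify_revisions(
    ["base", "a25347c1", "e0bc2391"],
    ["base", "60a1652b", "ccd45667"],
)
```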
Summary of changes:
UPDATING.md | 1 -
docs/docs/security.mdx | 31 +-------
superset-frontend/src/components/Select/Select.tsx | 23 +++---
superset-frontend/src/components/Select/types.ts | 5 ++
.../FiltersConfigForm/FiltersConfigForm.tsx | 2 +-
.../components/Select/SelectFilterPlugin.tsx | 88 +++++++++++-----------
superset/config.py | 2 -
7 files changed, 65 insertions(+), 87 deletions(-)
[superset] 07/12: update changelog
commit 542effab81215d68156adec29e64c65503a1d949
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Thu Jun 8 16:36:58 2023 -0700
update changelog
---
CHANGELOG.md | 3 +--
1 file changed, 1 insertion(+), 2 deletions(-)
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 71955dbc79..2d970a8828 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -39,6 +39,7 @@ under the License.
**Fixes**
- [#23723](https://github.com/apache/superset/pull/23723) add enforce URI query params with a specific for MySQL (@dpgaspar)
+- [#23600](https://github.com/apache/superset/pull/23600) fix: load examples as anon user (@betodealmeida)
- [#24054](https://github.com/apache/superset/pull/24054) fix: handle temporal columns in presto partitions (@giftig)
- [#23882](https://github.com/apache/superset/pull/23882) fix: handle comments in `has_table_query` (@betodealmeida)
- [#24256](https://github.com/apache/superset/pull/24256) fix: enable strong session protection by default (@dpgaspar)
@@ -60,8 +61,6 @@ under the License.
- [#24294](https://github.com/apache/superset/pull/24294) chore: update UPDATING for 2.1.0 (@eschutho)
- [#24056](https://github.com/apache/superset/pull/24056) chore: Remove unnecessary information from response (@geido)
-
-
### 2.1.0 (Thu Mar 16 21:13:05 2023 -0700)
**Database Migrations**
- [#23139](https://github.com/apache/superset/pull/23139) fix: memoized decorator memory leak (@dpgaspar)
[superset] 03/12: fix: handle temporal columns in presto partitions (#24054)
commit 75be3dd7b45ed98ade643d56b05a1ab10d8874b4
Author: Rob Moore <gi...@users.noreply.github.com>
AuthorDate: Fri May 19 21:29:42 2023 +0100
fix: handle temporal columns in presto partitions (#24054)
---
superset/db_engine_specs/base.py | 2 +-
superset/db_engine_specs/hive.py | 2 +-
superset/db_engine_specs/presto.py | 18 ++++++-----
tests/unit_tests/db_engine_specs/test_presto.py | 43 ++++++++++++++++++++++++-
4 files changed, 54 insertions(+), 11 deletions(-)
diff --git a/superset/db_engine_specs/base.py b/superset/db_engine_specs/base.py
index 27dd34a802..b789bbe70c 100644
--- a/superset/db_engine_specs/base.py
+++ b/superset/db_engine_specs/base.py
@@ -1168,7 +1168,7 @@ class BaseEngineSpec: # pylint: disable=too-many-public-methods
schema: Optional[str],
database: Database,
query: Select,
- columns: Optional[List[Dict[str, str]]] = None,
+ columns: Optional[List[Dict[str, Any]]] = None,
) -> Optional[Select]:
"""
Add a where clause to a query to reference only the most recent partition
diff --git a/superset/db_engine_specs/hive.py b/superset/db_engine_specs/hive.py
index f07d53518c..44dc435c2c 100644
--- a/superset/db_engine_specs/hive.py
+++ b/superset/db_engine_specs/hive.py
@@ -404,7 +404,7 @@ class HiveEngineSpec(PrestoEngineSpec):
schema: Optional[str],
database: "Database",
query: Select,
- columns: Optional[List[Dict[str, str]]] = None,
+ columns: Optional[List[Dict[str, Any]]] = None,
) -> Optional[Select]:
try:
col_names, values = cls.latest_partition(
diff --git a/superset/db_engine_specs/presto.py b/superset/db_engine_specs/presto.py
index 6bd556b79e..87f362acc8 100644
--- a/superset/db_engine_specs/presto.py
+++ b/superset/db_engine_specs/presto.py
@@ -462,7 +462,7 @@ class PrestoBaseEngineSpec(BaseEngineSpec, metaclass=ABCMeta):
schema: Optional[str],
database: Database,
query: Select,
- columns: Optional[List[Dict[str, str]]] = None,
+ columns: Optional[List[Dict[str, Any]]] = None,
) -> Optional[Select]:
try:
col_names, values = cls.latest_partition(
@@ -480,13 +480,15 @@ class PrestoBaseEngineSpec(BaseEngineSpec, metaclass=ABCMeta):
}
for col_name, value in zip(col_names, values):
- if col_name in column_type_by_name:
- if column_type_by_name.get(col_name) == "TIMESTAMP":
- query = query.where(Column(col_name, TimeStamp()) == value)
- elif column_type_by_name.get(col_name) == "DATE":
- query = query.where(Column(col_name, Date()) == value)
- else:
- query = query.where(Column(col_name) == value)
+ col_type = column_type_by_name.get(col_name)
+
+ if isinstance(col_type, types.DATE):
+ col_type = Date()
+ elif isinstance(col_type, types.TIMESTAMP):
+ col_type = TimeStamp()
+
+ query = query.where(Column(col_name, col_type) == value)
+
return query
@classmethod
diff --git a/tests/unit_tests/db_engine_specs/test_presto.py b/tests/unit_tests/db_engine_specs/test_presto.py
index a30fab94c9..8f55b1c048 100644
--- a/tests/unit_tests/db_engine_specs/test_presto.py
+++ b/tests/unit_tests/db_engine_specs/test_presto.py
@@ -16,10 +16,13 @@
# under the License.
from datetime import datetime
from typing import Any, Dict, Optional, Type
+from unittest import mock
import pytest
import pytz
-from sqlalchemy import types
+from pyhive.sqlalchemy_presto import PrestoDialect
+from sqlalchemy import sql, text, types
+from sqlalchemy.engine.url import make_url
from superset.utils.core import GenericDataType
from tests.unit_tests.db_engine_specs.utils import (
@@ -82,3 +85,41 @@ def test_get_column_spec(
from superset.db_engine_specs.presto import PrestoEngineSpec as spec
assert_column_spec(spec, native_type, sqla_type, attrs, generic_type, is_dttm)
+
+
+@mock.patch("superset.db_engine_specs.presto.PrestoEngineSpec.latest_partition")
+@pytest.mark.parametrize(
+ ["column_type", "column_value", "expected_value"],
+ [
+ (types.DATE(), "2023-05-01", "DATE '2023-05-01'"),
+ (types.TIMESTAMP(), "2023-05-01", "TIMESTAMP '2023-05-01'"),
+ (types.VARCHAR(), "2023-05-01", "'2023-05-01'"),
+ (types.INT(), 1234, "1234"),
+ ],
+)
+def test_where_latest_partition(
+ mock_latest_partition: Any,
+ column_type: Any,
+ column_value: str,
+ expected_value: str,
+) -> None:
+ """
+ Test the ``where_latest_partition`` method
+ """
+ from superset.db_engine_specs.presto import PrestoEngineSpec as spec
+
+ mock_latest_partition.return_value = (["partition_key"], [column_value])
+
+ query = sql.select(text("* FROM table"))
+ columns = [{"name": "partition_key", "type": column_type}]
+
+ expected = f"""SELECT * FROM table \nWHERE "partition_key" = {expected_value}"""
+ result = spec.where_latest_partition(
+ "table", mock.MagicMock(), mock.MagicMock(), query, columns
+ )
+ assert result is not None
+ actual = result.compile(
+ dialect=PrestoDialect(), compile_kwargs={"literal_binds": True}
+ )
+
+ assert str(actual) == expected
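The behavioural change the test table encodes can be summarised without SQLAlchemy: giving the `Column` a temporal type makes the dialect render a typed literal (`DATE '…'`, `TIMESTAMP '…'`) instead of a bare quoted string. A rough, dependency-free sketch of that dispatch, matching the expected values in the parametrized test above (illustrative only, not the Superset code):

```python
def partition_literal(col_type: str, value):
    """Render a partition value the way a typed Column comparison
    compiles, per the expected values in the test table above."""
    if col_type == "DATE":
        return f"DATE '{value}'"
    if col_type == "TIMESTAMP":
        return f"TIMESTAMP '{value}'"
    if isinstance(value, str):
        return f"'{value}'"   # untyped strings stay plainly quoted
    return str(value)         # numeric values render bare
```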
[superset] 11/12: fix: Select all issue with "Dynamically search all filter values" in FilterBar (#23400)
commit a34da923fe52077a1560b07f19dc94ead29e9827
Author: Geido <60...@users.noreply.github.com>
AuthorDate: Fri Mar 17 13:04:07 2023 +0100
fix: Select all issue with "Dynamically search all filter values" in FilterBar (#23400)
(cherry picked from commit 2fe695d3cfa54f626c37944b01b64998936ad75e)
---
.../components/Select/SelectFilterPlugin.tsx | 92 ++++++++++++----------
1 file changed, 51 insertions(+), 41 deletions(-)
diff --git a/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx b/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx
index fb8e20093c..d3cb2b4dea 100644
--- a/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx
+++ b/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx
@@ -38,6 +38,7 @@ import { Select } from 'src/components';
import { SLOW_DEBOUNCE } from 'src/constants';
import { propertyComparator } from 'src/components/Select/utils';
import { FilterBarOrientation } from 'src/dashboard/types';
+import { uniqWith, isEqual } from 'lodash';
import { PluginFilterSelectProps, SelectValue } from './types';
import { FilterPluginStyle, StatusMessage, StyledFormItem } from '../common';
import { getDataRecordFormatter, getSelectExtraFormData } from '../../utils';
@@ -120,6 +121,7 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
}),
[],
);
+ const [initialData, setInitialData] = useState<typeof data>([]);
const updateDataMask = useCallback(
(values: SelectValue) => {
@@ -165,10 +167,6 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
],
);
- useEffect(() => {
- updateDataMask(filterState.value);
- }, [JSON.stringify(filterState.value)]);
-
const isDisabled =
appSection === AppSection.FILTER_CONFIG_MODAL && defaultToFirstItem;
@@ -224,6 +222,47 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
[updateDataMask],
);
+ const placeholderText =
+ data.length === 0
+ ? t('No data')
+ : tn('%s option', '%s options', data.length, data.length);
+
+ const formItemExtra = useMemo(() => {
+ if (filterState.validateMessage) {
+ return (
+ <StatusMessage status={filterState.validateStatus}>
+ {filterState.validateMessage}
+ </StatusMessage>
+ );
+ }
+ return undefined;
+ }, [filterState.validateMessage, filterState.validateStatus]);
+
+ const options = useMemo(() => {
+ const allOptions = [...data, ...initialData];
+ const uniqueOptions = uniqWith(allOptions, isEqual);
+ const selectOptions: { label: string; value: DataRecordValue }[] = [];
+ uniqueOptions.forEach(row => {
+ const [value] = groupby.map(col => row[col]);
+ selectOptions.push({
+ label: labelFormatter(value, datatype),
+ value,
+ });
+ });
+ return selectOptions;
+ }, [data, initialData, datatype, groupby, labelFormatter]);
+
+ const sortComparator = useCallback(
+ (a: AntdLabeledValue, b: AntdLabeledValue) => {
+ const labelComparator = propertyComparator('label');
+ if (formData.sortAscending) {
+ return labelComparator(a, b);
+ }
+ return labelComparator(b, a);
+ },
+ [formData.sortAscending],
+ );
+
useEffect(() => {
if (defaultToFirstItem && filterState.value === undefined) {
// initialize to first value if set to default to first item
@@ -254,48 +293,19 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
JSON.stringify(filterState),
]);
+ useEffect(() => {
+ updateDataMask(filterState.value);
+ }, [JSON.stringify(filterState.value)]);
+
useEffect(() => {
setDataMask(dataMask);
}, [JSON.stringify(dataMask)]);
- const placeholderText =
- data.length === 0
- ? t('No data')
- : tn('%s option', '%s options', data.length, data.length);
-
- const formItemExtra = useMemo(() => {
- if (filterState.validateMessage) {
- return (
- <StatusMessage status={filterState.validateStatus}>
- {filterState.validateMessage}
- </StatusMessage>
- );
+ useEffect(() => {
+ if (data.length && !initialData.length) {
+ setInitialData(data);
}
- return undefined;
- }, [filterState.validateMessage, filterState.validateStatus]);
-
- const options = useMemo(() => {
- const options: { label: string; value: DataRecordValue }[] = [];
- data.forEach(row => {
- const [value] = groupby.map(col => row[col]);
- options.push({
- label: labelFormatter(value, datatype),
- value,
- });
- });
- return options;
- }, [data, datatype, groupby, labelFormatter]);
-
- const sortComparator = useCallback(
- (a: AntdLabeledValue, b: AntdLabeledValue) => {
- const labelComparator = propertyComparator('label');
- if (formData.sortAscending) {
- return labelComparator(a, b);
- }
- return labelComparator(b, a);
- },
- [formData.sortAscending],
- );
+ }, [data, initialData.length]);
return (
<FilterPluginStyle height={height} width={width}>
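In Python terms, the reworked `options` memo above merges the live `data` with the cached `initialData` and drops deep-equal duplicates (the lodash `uniqWith(allOptions, isEqual)` call), so previously selected values survive a narrowed search. A hedged sketch of the same idea (dict equality in Python is a deep comparison, playing the role of `isEqual`):

```python
def merge_filter_options(data, initial_data):
    """First-seen-order dedupe over the concatenated rows, mirroring
    uniqWith([...data, ...initialData], isEqual) from the diff."""
    merged = []
    for row in data + initial_data:
        if row not in merged:   # dict == dict compares deeply
            merged.append(row)
    return merged

opts = merge_filter_options(
    [{"country": "FR"}, {"country": "DE"}],   # current (filtered) data
    [{"country": "FR"}, {"country": "US"}],   # cached initial data
)
```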
[superset] 08/12: pass force to reload data
commit 3c44e6d63991138d620909a1c53b1d4a3ca94a52
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Thu Jun 15 15:05:43 2023 -0700
pass force to reload data
---
RELEASING/from_tarball_entrypoint.sh | 6 +++---
docker/docker-init.sh | 2 +-
2 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/RELEASING/from_tarball_entrypoint.sh b/RELEASING/from_tarball_entrypoint.sh
index fb06a7f12c..d2457db7a4 100755
--- a/RELEASING/from_tarball_entrypoint.sh
+++ b/RELEASING/from_tarball_entrypoint.sh
@@ -35,11 +35,11 @@ superset fab create-admin \
# Initialize the database
superset db upgrade
-# Loading examples
-superset load_examples
-
# Create default roles and permissions
superset init
+# Loading examples
+superset load-examples --force
+
FLASK_ENV=development FLASK_APP="superset.app:create_app()" \
flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0
diff --git a/docker/docker-init.sh b/docker/docker-init.sh
index c98f49881a..b54f999cb2 100755
--- a/docker/docker-init.sh
+++ b/docker/docker-init.sh
@@ -72,7 +72,7 @@ if [ "$SUPERSET_LOAD_EXAMPLES" = "yes" ]; then
superset load_test_users
superset load_examples --load-test-data
else
- superset load_examples
+ superset load_examples --force
fi
echo_step "4" "Complete" "Loading examples"
fi
[superset] 06/12: update package version
commit e892c0a8d23e36226d09bd0a7dc23178b686f063
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Wed Jun 7 15:20:43 2023 -0700
update package version
---
superset-frontend/package-lock.json | 4 ++--
superset-frontend/package.json | 2 +-
2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/superset-frontend/package-lock.json b/superset-frontend/package-lock.json
index f0ffd39160..adb7f27ae6 100644
--- a/superset-frontend/package-lock.json
+++ b/superset-frontend/package-lock.json
@@ -1,12 +1,12 @@
{
"name": "superset",
- "version": "2.1.0",
+ "version": "2.1.1",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "superset",
- "version": "2.1.0",
+ "version": "2.1.1",
"license": "Apache-2.0",
"workspaces": [
"packages/*",
diff --git a/superset-frontend/package.json b/superset-frontend/package.json
index 774ea6106f..5cc84f7fe8 100644
--- a/superset-frontend/package.json
+++ b/superset-frontend/package.json
@@ -1,6 +1,6 @@
{
"name": "superset",
- "version": "2.1.0",
+ "version": "2.1.1",
"description": "Superset is a data exploration platform designed to be visual, intuitive, and interactive.",
"keywords": [
"big",
[superset] 05/12: update changelog
commit 685760f110a51877cd4e4a5cf91729c36e798e33
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Wed Jun 7 14:10:52 2023 -0700
update changelog
---
CHANGELOG.md | 5 +++++
UPDATING.md | 1 +
2 files changed, 6 insertions(+)
diff --git a/CHANGELOG.md b/CHANGELOG.md
index d25ea7c3c2..71955dbc79 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -33,6 +33,10 @@ under the License.
### 2.1.1 (Sun Apr 23 15:44:21 2023 +0100)
+**Database Migrations**
+- [##23980](https://github.com/apache/superset/pull/23980) fix(migration): handle permalink edge cases correctly (@villebro)
+- [23888](https://github.com/apache/superset/pull/23888) chore(key-value): use json serialization for main resources (@villebro)
+
**Fixes**
- [#23723](https://github.com/apache/superset/pull/23723) add enforce URI query params with a specific for MySQL (@dpgaspar)
- [#24054](https://github.com/apache/superset/pull/24054) fix: handle temporal columns in presto partitions (@giftig)
@@ -57,6 +61,7 @@ under the License.
- [#24056](https://github.com/apache/superset/pull/24056) chore: Remove unnecessary information from response (@geido)
+
### 2.1.0 (Thu Mar 16 21:13:05 2023 -0700)
**Database Migrations**
- [#23139](https://github.com/apache/superset/pull/23139) fix: memoized decorator memory leak (@dpgaspar)
diff --git a/UPDATING.md b/UPDATING.md
index f71d884091..980d0af0d3 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -26,6 +26,7 @@ assists people when migrating to a new version.
- [24185](https://github.com/apache/superset/pull/24185): `/api/v1/database/test_connection` and `api/v1/database/validate_parameters` permissions changed from `can_read` to `can_write`. Only Admin user's have access.
### Other
+- [23888](https://github.com/apache/superset/pull/23888): Database Migration for json serialization instead of pickle should upgrade/downgrade correctly when bumping to/from this patch version
## 2.1.0
[superset] 10/12: update changelog
commit 128751b80f0859cdace610ccc6ade39b70999913
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Thu Jun 15 16:21:59 2023 -0700
update changelog
---
CHANGELOG.md | 5 +++--
1 file changed, 3 insertions(+), 2 deletions(-)
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 2d970a8828..6f67b4d5cf 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -38,14 +38,14 @@ under the License.
- [23888](https://github.com/apache/superset/pull/23888) chore(key-value): use json serialization for main resources (@villebro)
**Fixes**
-- [#23723](https://github.com/apache/superset/pull/23723) add enforce URI query params with a specific for MySQL (@dpgaspar)
-- [#23600](https://github.com/apache/superset/pull/23600) fix: load examples as anon user (@betodealmeida)
+- [#23865](https://github.com/apache/superset/pull/23865) fix: Native time range filter in legacy charts (@kgabryje)
- [#24054](https://github.com/apache/superset/pull/24054) fix: handle temporal columns in presto partitions (@giftig)
- [#23882](https://github.com/apache/superset/pull/23882) fix: handle comments in `has_table_query` (@betodealmeida)
- [#24256](https://github.com/apache/superset/pull/24256) fix: enable strong session protection by default (@dpgaspar)
- [#24137](https://github.com/apache/superset/pull/24137) fix: disable SHOW_STACKTRACE by default (@dpgaspar)
- [#24185](https://github.com/apache/superset/pull/24185) fix: db validate parameters permission (@dpgaspar)
- [#23769](https://github.com/apache/superset/pull/23769) fix: allow db driver distinction on enforced URI params (@dpgaspar)
+- [#23600](https://github.com/apache/superset/pull/23600) fix: load examples as anon user (@betodealmeida)
- [#23200](https://github.com/apache/superset/pull/23200) fix: permission checks on import (@betodealmeida)
- [#23901](https://github.com/apache/superset/pull/23901) fix: check sqlalchemy_uri (@dpgaspar)
- [#23751](https://github.com/apache/superset/pull/23751) fix(mssql): apply top after distinct (@villebro)
@@ -58,6 +58,7 @@ under the License.
- [#22851](https://github.com/apache/superset/pull/22851) fix: Validate jinja rendered query (@geido)
**Others**
+- [#23723](https://github.com/apache/superset/pull/23723) chore: add enforce URI query params with a specific for MySQL (@dpgaspar)
- [#24294](https://github.com/apache/superset/pull/24294) chore: update UPDATING for 2.1.0 (@eschutho)
- [#24056](https://github.com/apache/superset/pull/24056) chore: Remove unnecessary information from response (@geido)
[superset] 04/12: merge in fix with migration (#24314)
commit 9abe28bc09f42ef434e9a1fb1e7ab62f5750076d
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Wed Jun 7 13:41:32 2023 -0700
merge in fix with migration (#24314)
Co-authored-by: Ville Brofeldt <33...@users.noreply.github.com>
Co-authored-by: Ville Brofeldt <vi...@apple.com>
---
superset/dashboards/permalink/commands/base.py | 3 +-
superset/dashboards/permalink/commands/create.py | 1 +
superset/dashboards/permalink/commands/get.py | 6 +-
superset/explore/permalink/commands/base.py | 3 +-
superset/explore/permalink/commands/create.py | 3 +-
superset/explore/permalink/commands/get.py | 1 +
superset/extensions/metastore_cache.py | 11 ++-
superset/key_value/commands/create.py | 23 ++++--
superset/key_value/commands/get.py | 15 +++-
superset/key_value/commands/update.py | 11 ++-
superset/key_value/commands/upsert.py | 13 +--
superset/key_value/shared_entries.py | 12 ++-
superset/key_value/types.py | 33 +++++++-
...2a5681ddfd_convert_key_value_entries_to_json.py | 96 ++++++++++++++++++++++
superset/temporary_cache/api.py | 13 ++-
superset/temporary_cache/commands/parameters.py | 3 +
.../explore/permalink/api_tests.py | 5 +-
.../key_value/commands/create_test.py | 55 +++++++++++--
.../key_value/commands/delete_test.py | 13 +--
.../key_value/commands/fixtures.py | 15 +++-
.../key_value/commands/get_test.py | 25 +++---
.../key_value/commands/update_test.py | 11 ++-
.../key_value/commands/upsert_test.py | 11 ++-
23 files changed, 311 insertions(+), 71 deletions(-)
diff --git a/superset/dashboards/permalink/commands/base.py b/superset/dashboards/permalink/commands/base.py
index f4dc4f0726..82e24264ca 100644
--- a/superset/dashboards/permalink/commands/base.py
+++ b/superset/dashboards/permalink/commands/base.py
@@ -18,11 +18,12 @@ from abc import ABC
from superset.commands.base import BaseCommand
from superset.key_value.shared_entries import get_permalink_salt
-from superset.key_value.types import KeyValueResource, SharedKey
+from superset.key_value.types import JsonKeyValueCodec, KeyValueResource, SharedKey
class BaseDashboardPermalinkCommand(BaseCommand, ABC):
resource = KeyValueResource.DASHBOARD_PERMALINK
+ codec = JsonKeyValueCodec()
@property
def salt(self) -> str:
diff --git a/superset/dashboards/permalink/commands/create.py b/superset/dashboards/permalink/commands/create.py
index 51dac2d5de..2b6151fbb2 100644
--- a/superset/dashboards/permalink/commands/create.py
+++ b/superset/dashboards/permalink/commands/create.py
@@ -58,6 +58,7 @@ class CreateDashboardPermalinkCommand(BaseDashboardPermalinkCommand):
resource=self.resource,
key=get_deterministic_uuid(self.salt, (user_id, value)),
value=value,
+ codec=self.codec,
).run()
assert key.id # for type checks
return encode_permalink_key(key=key.id, salt=self.salt)
diff --git a/superset/dashboards/permalink/commands/get.py b/superset/dashboards/permalink/commands/get.py
index f89f9444e7..4206263a37 100644
--- a/superset/dashboards/permalink/commands/get.py
+++ b/superset/dashboards/permalink/commands/get.py
@@ -39,7 +39,11 @@ class GetDashboardPermalinkCommand(BaseDashboardPermalinkCommand):
self.validate()
try:
key = decode_permalink_id(self.key, salt=self.salt)
- command = GetKeyValueCommand(resource=self.resource, key=key)
+ command = GetKeyValueCommand(
+ resource=self.resource,
+ key=key,
+ codec=self.codec,
+ )
value: Optional[DashboardPermalinkValue] = command.run()
if value:
DashboardDAO.get_by_id_or_slug(value["dashboardId"])
diff --git a/superset/explore/permalink/commands/base.py b/superset/explore/permalink/commands/base.py
index bef9546e21..a87183b7e9 100644
--- a/superset/explore/permalink/commands/base.py
+++ b/superset/explore/permalink/commands/base.py
@@ -18,11 +18,12 @@ from abc import ABC
from superset.commands.base import BaseCommand
from superset.key_value.shared_entries import get_permalink_salt
-from superset.key_value.types import KeyValueResource, SharedKey
+from superset.key_value.types import JsonKeyValueCodec, KeyValueResource, SharedKey
class BaseExplorePermalinkCommand(BaseCommand, ABC):
resource: KeyValueResource = KeyValueResource.EXPLORE_PERMALINK
+ codec = JsonKeyValueCodec()
@property
def salt(self) -> str:
diff --git a/superset/explore/permalink/commands/create.py b/superset/explore/permalink/commands/create.py
index 77ce04c4e4..21c0f4e42f 100644
--- a/superset/explore/permalink/commands/create.py
+++ b/superset/explore/permalink/commands/create.py
@@ -45,13 +45,14 @@ class CreateExplorePermalinkCommand(BaseExplorePermalinkCommand):
value = {
"chartId": self.chart_id,
"datasourceId": datasource_id,
- "datasourceType": datasource_type,
+ "datasourceType": datasource_type.value,
"datasource": self.datasource,
"state": self.state,
}
command = CreateKeyValueCommand(
resource=self.resource,
value=value,
+ codec=self.codec,
)
key = command.run()
if key.id is None:
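The `datasource_type.value` change matters because the permalink commands now persist through `JsonKeyValueCodec`: pickle can serialize an `Enum` member directly, but `json.dumps` raises `TypeError` on one, so the plain string value must be stored instead. A small demonstration (the `DatasourceType` enum here is a stand-in, not the real Superset definition):

```python
import json
from enum import Enum

class DatasourceType(Enum):  # stand-in for Superset's enum
    TABLE = "table"

def try_encode(value):
    """Return the JSON string, or None if the value is not serializable."""
    try:
        return json.dumps(value)
    except TypeError:
        return None

raw = try_encode({"datasourceType": DatasourceType.TABLE})          # enum member: fails
fixed = try_encode({"datasourceType": DatasourceType.TABLE.value})  # plain string: works
```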
diff --git a/superset/explore/permalink/commands/get.py b/superset/explore/permalink/commands/get.py
index 3376cab080..4823117ece 100644
--- a/superset/explore/permalink/commands/get.py
+++ b/superset/explore/permalink/commands/get.py
@@ -43,6 +43,7 @@ class GetExplorePermalinkCommand(BaseExplorePermalinkCommand):
value: Optional[ExplorePermalinkValue] = GetKeyValueCommand(
resource=self.resource,
key=key,
+ codec=self.codec,
).run()
if value:
chart_id: Optional[int] = value.get("chartId")
diff --git a/superset/extensions/metastore_cache.py b/superset/extensions/metastore_cache.py
index 1e5cff7ee3..f69276c908 100644
--- a/superset/extensions/metastore_cache.py
+++ b/superset/extensions/metastore_cache.py
@@ -23,10 +23,11 @@ from flask import Flask
from flask_caching import BaseCache
from superset.key_value.exceptions import KeyValueCreateFailedError
-from superset.key_value.types import KeyValueResource
+from superset.key_value.types import KeyValueResource, PickleKeyValueCodec
from superset.key_value.utils import get_uuid_namespace
RESOURCE = KeyValueResource.METASTORE_CACHE
+CODEC = PickleKeyValueCodec()
class SupersetMetastoreCache(BaseCache):
@@ -68,6 +69,7 @@ class SupersetMetastoreCache(BaseCache):
resource=RESOURCE,
key=self.get_key(key),
value=value,
+ codec=CODEC,
expires_on=self._get_expiry(timeout),
).run()
return True
@@ -80,6 +82,7 @@ class SupersetMetastoreCache(BaseCache):
CreateKeyValueCommand(
resource=RESOURCE,
value=value,
+ codec=CODEC,
key=self.get_key(key),
expires_on=self._get_expiry(timeout),
).run()
@@ -92,7 +95,11 @@ class SupersetMetastoreCache(BaseCache):
# pylint: disable=import-outside-toplevel
from superset.key_value.commands.get import GetKeyValueCommand
- return GetKeyValueCommand(resource=RESOURCE, key=self.get_key(key)).run()
+ return GetKeyValueCommand(
+ resource=RESOURCE,
+ key=self.get_key(key),
+ codec=CODEC,
+ ).run()
def has(self, key: str) -> bool:
entry = self.get(key)
diff --git a/superset/key_value/commands/create.py b/superset/key_value/commands/create.py
index 93e99c223b..d66d99d6e9 100644
--- a/superset/key_value/commands/create.py
+++ b/superset/key_value/commands/create.py
@@ -15,7 +15,6 @@
# specific language governing permissions and limitations
# under the License.
import logging
-import pickle
from datetime import datetime
from typing import Any, Optional, Union
from uuid import UUID
@@ -26,7 +25,7 @@ from superset import db
from superset.commands.base import BaseCommand
from superset.key_value.exceptions import KeyValueCreateFailedError
from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import Key, KeyValueResource
+from superset.key_value.types import Key, KeyValueCodec, KeyValueResource
from superset.utils.core import get_user_id
logger = logging.getLogger(__name__)
@@ -35,13 +34,15 @@ logger = logging.getLogger(__name__)
class CreateKeyValueCommand(BaseCommand):
resource: KeyValueResource
value: Any
+ codec: KeyValueCodec
key: Optional[Union[int, UUID]]
expires_on: Optional[datetime]
- def __init__(
+ def __init__( # pylint: disable=too-many-arguments
self,
resource: KeyValueResource,
value: Any,
+ codec: KeyValueCodec,
key: Optional[Union[int, UUID]] = None,
expires_on: Optional[datetime] = None,
):
@@ -50,16 +51,24 @@ class CreateKeyValueCommand(BaseCommand):
:param resource: the resource (dashboard, chart etc)
:param value: the value to persist in the key-value store
+ :param codec: codec used to encode the value
:param key: id of entry (autogenerated if undefined)
:param expires_on: entry expiration time
- :return: the key associated with the persisted value
+ :
"""
self.resource = resource
self.value = value
+ self.codec = codec
self.key = key
self.expires_on = expires_on
def run(self) -> Key:
+ """
+ Persist the value
+
+ :return: the key associated with the persisted value
+
+ """
try:
return self.create()
except SQLAlchemyError as ex:
@@ -70,9 +79,13 @@ class CreateKeyValueCommand(BaseCommand):
pass
def create(self) -> Key:
+ try:
+ value = self.codec.encode(self.value)
+ except Exception as ex: # pylint: disable=broad-except
+ raise KeyValueCreateFailedError("Unable to encode value") from ex
entry = KeyValueEntry(
resource=self.resource.value,
- value=pickle.dumps(self.value),
+ value=value,
created_on=datetime.now(),
created_by_fk=get_user_id(),
expires_on=self.expires_on,
diff --git a/superset/key_value/commands/get.py b/superset/key_value/commands/get.py
index 44c02331cc..9d659f3bc7 100644
--- a/superset/key_value/commands/get.py
+++ b/superset/key_value/commands/get.py
@@ -16,7 +16,6 @@
# under the License.
import logging
-import pickle
from datetime import datetime
from typing import Any, Optional, Union
from uuid import UUID
@@ -27,7 +26,7 @@ from superset import db
from superset.commands.base import BaseCommand
from superset.key_value.exceptions import KeyValueGetFailedError
from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import KeyValueResource
+from superset.key_value.types import KeyValueCodec, KeyValueResource
from superset.key_value.utils import get_filter
logger = logging.getLogger(__name__)
@@ -36,17 +35,25 @@ logger = logging.getLogger(__name__)
class GetKeyValueCommand(BaseCommand):
resource: KeyValueResource
key: Union[int, UUID]
+ codec: KeyValueCodec
- def __init__(self, resource: KeyValueResource, key: Union[int, UUID]):
+ def __init__(
+ self,
+ resource: KeyValueResource,
+ key: Union[int, UUID],
+ codec: KeyValueCodec,
+ ):
"""
Retrieve a key value entry
:param resource: the resource (dashboard, chart etc)
:param key: the key to retrieve
+ :param codec: codec used to decode the value
:return: the value associated with the key if present
"""
self.resource = resource
self.key = key
+ self.codec = codec
def run(self) -> Any:
try:
@@ -66,5 +73,5 @@ class GetKeyValueCommand(BaseCommand):
.first()
)
if entry and (entry.expires_on is None or entry.expires_on > datetime.now()):
- return pickle.loads(entry.value)
+ return self.codec.decode(entry.value)
return None
diff --git a/superset/key_value/commands/update.py b/superset/key_value/commands/update.py
index b69ca5e70d..becd6d9ca8 100644
--- a/superset/key_value/commands/update.py
+++ b/superset/key_value/commands/update.py
@@ -16,7 +16,6 @@
# under the License.
import logging
-import pickle
from datetime import datetime
from typing import Any, Optional, Union
from uuid import UUID
@@ -27,7 +26,7 @@ from superset import db
from superset.commands.base import BaseCommand
from superset.key_value.exceptions import KeyValueUpdateFailedError
from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import Key, KeyValueResource
+from superset.key_value.types import Key, KeyValueCodec, KeyValueResource
from superset.key_value.utils import get_filter
from superset.utils.core import get_user_id
@@ -37,14 +36,16 @@ logger = logging.getLogger(__name__)
class UpdateKeyValueCommand(BaseCommand):
resource: KeyValueResource
value: Any
+ codec: KeyValueCodec
key: Union[int, UUID]
expires_on: Optional[datetime]
- def __init__(
+ def __init__( # pylint: disable=too-many-arguments
self,
resource: KeyValueResource,
key: Union[int, UUID],
value: Any,
+ codec: KeyValueCodec,
expires_on: Optional[datetime] = None,
):
"""
@@ -53,12 +54,14 @@ class UpdateKeyValueCommand(BaseCommand):
:param resource: the resource (dashboard, chart etc)
:param key: the key to update
:param value: the value to persist in the key-value store
+ :param codec: codec used to encode the value
:param expires_on: entry expiration time
:return: the key associated with the updated value
"""
self.resource = resource
self.key = key
self.value = value
+ self.codec = codec
self.expires_on = expires_on
def run(self) -> Optional[Key]:
@@ -80,7 +83,7 @@ class UpdateKeyValueCommand(BaseCommand):
.first()
)
if entry:
- entry.value = pickle.dumps(self.value)
+ entry.value = self.codec.encode(self.value)
entry.expires_on = self.expires_on
entry.changed_on = datetime.now()
entry.changed_by_fk = get_user_id()
diff --git a/superset/key_value/commands/upsert.py b/superset/key_value/commands/upsert.py
index 06b33c90fc..c5668f1161 100644
--- a/superset/key_value/commands/upsert.py
+++ b/superset/key_value/commands/upsert.py
@@ -16,7 +16,6 @@
# under the License.
import logging
-import pickle
from datetime import datetime
from typing import Any, Optional, Union
from uuid import UUID
@@ -31,7 +30,7 @@ from superset.key_value.exceptions import (
KeyValueUpsertFailedError,
)
from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import Key, KeyValueResource
+from superset.key_value.types import Key, KeyValueCodec, KeyValueResource
from superset.key_value.utils import get_filter
from superset.utils.core import get_user_id
@@ -42,13 +41,15 @@ class UpsertKeyValueCommand(BaseCommand):
resource: KeyValueResource
value: Any
key: Union[int, UUID]
+ codec: KeyValueCodec
expires_on: Optional[datetime]
- def __init__(
+ def __init__( # pylint: disable=too-many-arguments
self,
resource: KeyValueResource,
key: Union[int, UUID],
value: Any,
+ codec: KeyValueCodec,
expires_on: Optional[datetime] = None,
):
"""
@@ -57,13 +58,14 @@ class UpsertKeyValueCommand(BaseCommand):
:param resource: the resource (dashboard, chart etc)
:param key: the key to update
:param value: the value to persist in the key-value store
- :param key_type: the type of the key to update
+ :param codec: codec used to encode the value
:param expires_on: entry expiration time
:return: the key associated with the updated value
"""
self.resource = resource
self.key = key
self.value = value
+ self.codec = codec
self.expires_on = expires_on
def run(self) -> Key:
@@ -85,7 +87,7 @@ class UpsertKeyValueCommand(BaseCommand):
.first()
)
if entry:
- entry.value = pickle.dumps(self.value)
+ entry.value = self.codec.encode(self.value)
entry.expires_on = self.expires_on
entry.changed_on = datetime.now()
entry.changed_by_fk = get_user_id()
@@ -96,6 +98,7 @@ class UpsertKeyValueCommand(BaseCommand):
return CreateKeyValueCommand(
resource=self.resource,
value=self.value,
+ codec=self.codec,
key=self.key,
expires_on=self.expires_on,
).run()
diff --git a/superset/key_value/shared_entries.py b/superset/key_value/shared_entries.py
index 5f4ded9498..7895b75907 100644
--- a/superset/key_value/shared_entries.py
+++ b/superset/key_value/shared_entries.py
@@ -18,11 +18,12 @@
from typing import Any, Optional
from uuid import uuid3
-from superset.key_value.types import KeyValueResource, SharedKey
+from superset.key_value.types import JsonKeyValueCodec, KeyValueResource, SharedKey
from superset.key_value.utils import get_uuid_namespace, random_key
RESOURCE = KeyValueResource.APP
NAMESPACE = get_uuid_namespace("")
+CODEC = JsonKeyValueCodec()
def get_shared_value(key: SharedKey) -> Optional[Any]:
@@ -30,7 +31,7 @@ def get_shared_value(key: SharedKey) -> Optional[Any]:
from superset.key_value.commands.get import GetKeyValueCommand
uuid_key = uuid3(NAMESPACE, key)
- return GetKeyValueCommand(RESOURCE, key=uuid_key).run()
+ return GetKeyValueCommand(RESOURCE, key=uuid_key, codec=CODEC).run()
def set_shared_value(key: SharedKey, value: Any) -> None:
@@ -38,7 +39,12 @@ def set_shared_value(key: SharedKey, value: Any) -> None:
from superset.key_value.commands.create import CreateKeyValueCommand
uuid_key = uuid3(NAMESPACE, key)
- CreateKeyValueCommand(resource=RESOURCE, value=value, key=uuid_key).run()
+ CreateKeyValueCommand(
+ resource=RESOURCE,
+ value=value,
+ key=uuid_key,
+ codec=CODEC,
+ ).run()
def get_permalink_salt(key: SharedKey) -> str:
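As context for the shared-entries change above: the helpers derive deterministic UUID keys with `uuid3`, so repeated lookups for the same `SharedKey` address the same row regardless of which codec is in play. A minimal standalone sketch — the `NAMESPACE` stand-in below is hypothetical (the real code uses `superset.key_value.utils.get_uuid_namespace("")`):

```python
from uuid import UUID, uuid3

# hypothetical stand-in for superset.key_value.utils.get_uuid_namespace("")
NAMESPACE = uuid3(UUID(int=0), "")

# uuid3 is deterministic: the same shared key always maps to the same
# UUID, so set_shared_value and get_shared_value address the same entry
key_a = uuid3(NAMESPACE, "dashboard_permalink_salt")
key_b = uuid3(NAMESPACE, "dashboard_permalink_salt")
assert key_a == key_b
```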
diff --git a/superset/key_value/types.py b/superset/key_value/types.py
index c3064fbef4..07d06414f6 100644
--- a/superset/key_value/types.py
+++ b/superset/key_value/types.py
@@ -14,9 +14,14 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
+from __future__ import annotations
+
+import json
+import pickle
+from abc import ABC, abstractmethod
from dataclasses import dataclass
from enum import Enum
-from typing import Optional, TypedDict
+from typing import Any, Optional, TypedDict
from uuid import UUID
@@ -42,3 +47,29 @@ class KeyValueResource(str, Enum):
class SharedKey(str, Enum):
DASHBOARD_PERMALINK_SALT = "dashboard_permalink_salt"
EXPLORE_PERMALINK_SALT = "explore_permalink_salt"
+
+
+class KeyValueCodec(ABC):
+ @abstractmethod
+ def encode(self, value: Any) -> bytes:
+ ...
+
+ @abstractmethod
+ def decode(self, value: bytes) -> Any:
+ ...
+
+
+class JsonKeyValueCodec(KeyValueCodec):
+ def encode(self, value: dict[Any, Any]) -> bytes:
+ return bytes(json.dumps(value), encoding="utf-8")
+
+ def decode(self, value: bytes) -> dict[Any, Any]:
+ return json.loads(value)
+
+
+class PickleKeyValueCodec(KeyValueCodec):
+ def encode(self, value: dict[Any, Any]) -> bytes:
+ return pickle.dumps(value)
+
+ def decode(self, value: bytes) -> dict[Any, Any]:
+ return pickle.loads(value)
diff --git a/superset/migrations/versions/2023-05-01_12-03_9c2a5681ddfd_convert_key_value_entries_to_json.py b/superset/migrations/versions/2023-05-01_12-03_9c2a5681ddfd_convert_key_value_entries_to_json.py
new file mode 100644
index 0000000000..6e55f3ddc9
--- /dev/null
+++ b/superset/migrations/versions/2023-05-01_12-03_9c2a5681ddfd_convert_key_value_entries_to_json.py
@@ -0,0 +1,96 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""convert key-value entries to json
+
+Revision ID: 9c2a5681ddfd
+Revises: f3c2d8ec8595
+Create Date: 2023-05-01 12:03:17.079862
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = "9c2a5681ddfd"
+down_revision = "f3c2d8ec8595"
+
+import io
+import json
+import pickle
+
+from alembic import op
+from sqlalchemy import Column, Integer, LargeBinary, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import Session
+
+from superset import db
+from superset.migrations.shared.utils import paginated_update
+
+Base = declarative_base()
+VALUE_MAX_SIZE = 2**24 - 1
+RESOURCES_TO_MIGRATE = ("app", "dashboard_permalink", "explore_permalink")
+
+
+class RestrictedUnpickler(pickle.Unpickler):
+ def find_class(self, module, name):
+ if not (module == "superset.utils.core" and name == "DatasourceType"):
+ raise pickle.UnpicklingError(f"Unpickling of {module}.{name} is forbidden")
+
+ return super().find_class(module, name)
+
+
+class KeyValueEntry(Base):
+ __tablename__ = "key_value"
+ id = Column(Integer, primary_key=True)
+ resource = Column(String(32), nullable=False)
+ value = Column(LargeBinary(length=VALUE_MAX_SIZE), nullable=False)
+
+
+def upgrade():
+ bind = op.get_bind()
+ session: Session = db.Session(bind=bind)
+ truncated_count = 0
+ for entry in paginated_update(
+ session.query(KeyValueEntry).filter(
+ KeyValueEntry.resource.in_(RESOURCES_TO_MIGRATE)
+ )
+ ):
+ try:
+ value = RestrictedUnpickler(io.BytesIO(entry.value)).load() or {}
+ except pickle.UnpicklingError as ex:
+ if str(ex) == "pickle data was truncated":
+ # make truncated values that were created prior to #20385 an empty
+ # dict so that downgrading will work properly.
+ truncated_count += 1
+ value = {}
+ else:
+ raise
+
+ entry.value = bytes(json.dumps(value), encoding="utf-8")
+
+ if truncated_count:
+ print(f"Replaced {truncated_count} corrupted values with an empty value")
+
+
+def downgrade():
+ bind = op.get_bind()
+ session: Session = db.Session(bind=bind)
+ for entry in paginated_update(
+ session.query(KeyValueEntry).filter(
+ KeyValueEntry.resource.in_(RESOURCES_TO_MIGRATE)
+ ),
+ ):
+ value = json.loads(entry.value) or {}
+ entry.value = pickle.dumps(value)
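The migration's `RestrictedUnpickler` only allows `superset.utils.core.DatasourceType` to be resolved during unpickling; any other global reference raises. A standalone sketch of that behavior, with the allow-list check copied from the migration above (plain containers never trigger `find_class`, so they load fine):

```python
import io
import pickle


class RestrictedUnpickler(pickle.Unpickler):
    # same allow-list as the migration: only DatasourceType may be resolved
    def find_class(self, module, name):
        if not (module == "superset.utils.core" and name == "DatasourceType"):
            raise pickle.UnpicklingError(f"Unpickling of {module}.{name} is forbidden")
        return super().find_class(module, name)


# plain dicts/strings involve no class lookup, so they unpickle normally
data = pickle.dumps({"foo": "bar"})
assert RestrictedUnpickler(io.BytesIO(data)).load() == {"foo": "bar"}

# a pickle referencing an arbitrary class is rejected
payload = pickle.dumps(io.BytesIO)
try:
    RestrictedUnpickler(io.BytesIO(payload)).load()
    raise AssertionError("expected UnpicklingError")
except pickle.UnpicklingError:
    pass
```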
diff --git a/superset/temporary_cache/api.py b/superset/temporary_cache/api.py
index b6376c63c3..85db65c62c 100644
--- a/superset/temporary_cache/api.py
+++ b/superset/temporary_cache/api.py
@@ -24,6 +24,7 @@ from flask import request, Response
from marshmallow import ValidationError
from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP, RouteMethod
+from superset.key_value.types import JsonKeyValueCodec
from superset.temporary_cache.commands.exceptions import (
TemporaryCacheAccessDeniedError,
TemporaryCacheResourceNotFoundError,
@@ -37,6 +38,8 @@ from superset.views.base_api import BaseSupersetApi, requires_json
logger = logging.getLogger(__name__)
+CODEC = JsonKeyValueCodec()
+
class TemporaryCacheRestApi(BaseSupersetApi, ABC):
add_model_schema = TemporaryCachePostSchema()
@@ -69,7 +72,12 @@ class TemporaryCacheRestApi(BaseSupersetApi, ABC):
try:
item = self.add_model_schema.load(request.json)
tab_id = request.args.get("tab_id")
- args = CommandParameters(resource_id=pk, value=item["value"], tab_id=tab_id)
+ args = CommandParameters(
+ resource_id=pk,
+ value=item["value"],
+ tab_id=tab_id,
+ codec=CODEC,
+ )
key = self.get_create_command()(args).run()
return self.response(201, key=key)
except ValidationError as ex:
@@ -89,6 +97,7 @@ class TemporaryCacheRestApi(BaseSupersetApi, ABC):
key=key,
value=item["value"],
tab_id=tab_id,
+ codec=CODEC,
)
key = self.get_update_command()(args).run()
return self.response(200, key=key)
@@ -101,7 +110,7 @@ class TemporaryCacheRestApi(BaseSupersetApi, ABC):
def get(self, pk: int, key: str) -> Response:
try:
- args = CommandParameters(resource_id=pk, key=key)
+ args = CommandParameters(resource_id=pk, key=key, codec=CODEC)
value = self.get_get_command()(args).run()
if not value:
return self.response_404()
diff --git a/superset/temporary_cache/commands/parameters.py b/superset/temporary_cache/commands/parameters.py
index 74b9c1c632..e4e5b9b06a 100644
--- a/superset/temporary_cache/commands/parameters.py
+++ b/superset/temporary_cache/commands/parameters.py
@@ -17,10 +17,13 @@
from dataclasses import dataclass
from typing import Optional
+from superset.key_value.types import KeyValueCodec
+
@dataclass
class CommandParameters:
resource_id: int
+ codec: Optional[KeyValueCodec] = None
tab_id: Optional[int] = None
key: Optional[str] = None
value: Optional[str] = None
diff --git a/tests/integration_tests/explore/permalink/api_tests.py b/tests/integration_tests/explore/permalink/api_tests.py
index 22a36f41e1..4c6a3c12dd 100644
--- a/tests/integration_tests/explore/permalink/api_tests.py
+++ b/tests/integration_tests/explore/permalink/api_tests.py
@@ -15,7 +15,6 @@
# specific language governing permissions and limitations
# under the License.
import json
-import pickle
from typing import Any, Dict, Iterator
from uuid import uuid3
@@ -24,7 +23,7 @@ from sqlalchemy.orm import Session
from superset import db
from superset.key_value.models import KeyValueEntry
-from superset.key_value.types import KeyValueResource
+from superset.key_value.types import JsonKeyValueCodec, KeyValueResource
from superset.key_value.utils import decode_permalink_id, encode_permalink_key
from superset.models.slice import Slice
from superset.utils.core import DatasourceType
@@ -95,7 +94,7 @@ def test_get_missing_chart(
chart_id = 1234
entry = KeyValueEntry(
resource=KeyValueResource.EXPLORE_PERMALINK,
- value=pickle.dumps(
+ value=JsonKeyValueCodec().encode(
{
"chartId": chart_id,
"datasourceId": chart.datasource.id,
diff --git a/tests/integration_tests/key_value/commands/create_test.py b/tests/integration_tests/key_value/commands/create_test.py
index 0e789026ba..a2ee3d13ae 100644
--- a/tests/integration_tests/key_value/commands/create_test.py
+++ b/tests/integration_tests/key_value/commands/create_test.py
@@ -16,20 +16,23 @@
# under the License.
from __future__ import annotations
+import json
import pickle
-from uuid import UUID
+import pytest
from flask.ctx import AppContext
from flask_appbuilder.security.sqla.models import User
from superset.extensions import db
+from superset.key_value.exceptions import KeyValueCreateFailedError
from superset.utils.core import override_user
from tests.integration_tests.key_value.commands.fixtures import (
admin,
- ID_KEY,
+ JSON_CODEC,
+ JSON_VALUE,
+ PICKLE_CODEC,
+ PICKLE_VALUE,
RESOURCE,
- UUID_KEY,
- VALUE,
)
@@ -38,11 +41,15 @@ def test_create_id_entry(app_context: AppContext, admin: User) -> None:
from superset.key_value.models import KeyValueEntry
with override_user(admin):
- key = CreateKeyValueCommand(resource=RESOURCE, value=VALUE).run()
+ key = CreateKeyValueCommand(
+ resource=RESOURCE,
+ value=JSON_VALUE,
+ codec=JSON_CODEC,
+ ).run()
entry = (
db.session.query(KeyValueEntry).filter_by(id=key.id).autoflush(False).one()
)
- assert pickle.loads(entry.value) == VALUE
+ assert json.loads(entry.value) == JSON_VALUE
assert entry.created_by_fk == admin.id
db.session.delete(entry)
db.session.commit()
@@ -53,11 +60,43 @@ def test_create_uuid_entry(app_context: AppContext, admin: User) -> None:
from superset.key_value.models import KeyValueEntry
with override_user(admin):
- key = CreateKeyValueCommand(resource=RESOURCE, value=VALUE).run()
+ key = CreateKeyValueCommand(
+ resource=RESOURCE, value=JSON_VALUE, codec=JSON_CODEC
+ ).run()
entry = (
db.session.query(KeyValueEntry).filter_by(uuid=key.uuid).autoflush(False).one()
)
- assert pickle.loads(entry.value) == VALUE
+ assert json.loads(entry.value) == JSON_VALUE
assert entry.created_by_fk == admin.id
db.session.delete(entry)
db.session.commit()
+
+
+def test_create_fail_json_entry(app_context: AppContext, admin: User) -> None:
+ from superset.key_value.commands.create import CreateKeyValueCommand
+
+ with pytest.raises(KeyValueCreateFailedError):
+ CreateKeyValueCommand(
+ resource=RESOURCE,
+ value=PICKLE_VALUE,
+ codec=JSON_CODEC,
+ ).run()
+
+
+def test_create_pickle_entry(app_context: AppContext, admin: User) -> None:
+ from superset.key_value.commands.create import CreateKeyValueCommand
+ from superset.key_value.models import KeyValueEntry
+
+ with override_user(admin):
+ key = CreateKeyValueCommand(
+ resource=RESOURCE,
+ value=PICKLE_VALUE,
+ codec=PICKLE_CODEC,
+ ).run()
+ entry = (
+ db.session.query(KeyValueEntry).filter_by(id=key.id).autoflush(False).one()
+ )
+ assert type(pickle.loads(entry.value)) == type(PICKLE_VALUE)
+ assert entry.created_by_fk == admin.id
+ db.session.delete(entry)
+ db.session.commit()
diff --git a/tests/integration_tests/key_value/commands/delete_test.py b/tests/integration_tests/key_value/commands/delete_test.py
index 62f9883370..3c4892faa6 100644
--- a/tests/integration_tests/key_value/commands/delete_test.py
+++ b/tests/integration_tests/key_value/commands/delete_test.py
@@ -16,7 +16,7 @@
# under the License.
from __future__ import annotations
-import pickle
+import json
from typing import TYPE_CHECKING
from uuid import UUID
@@ -25,7 +25,11 @@ from flask.ctx import AppContext
from flask_appbuilder.security.sqla.models import User
from superset.extensions import db
-from tests.integration_tests.key_value.commands.fixtures import admin, RESOURCE, VALUE
+from tests.integration_tests.key_value.commands.fixtures import (
+ admin,
+ JSON_VALUE,
+ RESOURCE,
+)
if TYPE_CHECKING:
from superset.key_value.models import KeyValueEntry
@@ -42,7 +46,7 @@ def key_value_entry() -> KeyValueEntry:
id=ID_KEY,
uuid=UUID_KEY,
resource=RESOURCE,
- value=pickle.dumps(VALUE),
+ value=bytes(json.dumps(JSON_VALUE), encoding="utf-8"),
)
db.session.add(entry)
db.session.commit()
@@ -55,7 +59,6 @@ def test_delete_id_entry(
key_value_entry: KeyValueEntry,
) -> None:
from superset.key_value.commands.delete import DeleteKeyValueCommand
- from superset.key_value.models import KeyValueEntry
assert DeleteKeyValueCommand(resource=RESOURCE, key=ID_KEY).run() is True
@@ -66,7 +69,6 @@ def test_delete_uuid_entry(
key_value_entry: KeyValueEntry,
) -> None:
from superset.key_value.commands.delete import DeleteKeyValueCommand
- from superset.key_value.models import KeyValueEntry
assert DeleteKeyValueCommand(resource=RESOURCE, key=UUID_KEY).run() is True
@@ -77,6 +79,5 @@ def test_delete_entry_missing(
key_value_entry: KeyValueEntry,
) -> None:
from superset.key_value.commands.delete import DeleteKeyValueCommand
- from superset.key_value.models import KeyValueEntry
assert DeleteKeyValueCommand(resource=RESOURCE, key=456).run() is False
diff --git a/tests/integration_tests/key_value/commands/fixtures.py b/tests/integration_tests/key_value/commands/fixtures.py
index 2fd4fde4e1..66aea8a4ed 100644
--- a/tests/integration_tests/key_value/commands/fixtures.py
+++ b/tests/integration_tests/key_value/commands/fixtures.py
@@ -17,7 +17,7 @@
from __future__ import annotations
-import pickle
+import json
from typing import Generator, TYPE_CHECKING
from uuid import UUID
@@ -26,7 +26,11 @@ from flask_appbuilder.security.sqla.models import User
from sqlalchemy.orm import Session
from superset.extensions import db
-from superset.key_value.types import KeyValueResource
+from superset.key_value.types import (
+ JsonKeyValueCodec,
+ KeyValueResource,
+ PickleKeyValueCodec,
+)
from tests.integration_tests.test_app import app
if TYPE_CHECKING:
@@ -35,7 +39,10 @@ if TYPE_CHECKING:
ID_KEY = 123
UUID_KEY = UUID("3e7a2ab8-bcaf-49b0-a5df-dfb432f291cc")
RESOURCE = KeyValueResource.APP
-VALUE = {"foo": "bar"}
+JSON_VALUE = {"foo": "bar"}
+PICKLE_VALUE = object()
+JSON_CODEC = JsonKeyValueCodec()
+PICKLE_CODEC = PickleKeyValueCodec()
@pytest.fixture
@@ -46,7 +53,7 @@ def key_value_entry() -> Generator[KeyValueEntry, None, None]:
id=ID_KEY,
uuid=UUID_KEY,
resource=RESOURCE,
- value=pickle.dumps(VALUE),
+ value=bytes(json.dumps(JSON_VALUE), encoding="utf-8"),
)
db.session.add(entry)
db.session.commit()
diff --git a/tests/integration_tests/key_value/commands/get_test.py b/tests/integration_tests/key_value/commands/get_test.py
index b1800a4c3b..28a6dd73d5 100644
--- a/tests/integration_tests/key_value/commands/get_test.py
+++ b/tests/integration_tests/key_value/commands/get_test.py
@@ -16,7 +16,7 @@
# under the License.
from __future__ import annotations
-import pickle
+import json
import uuid
from datetime import datetime, timedelta
from typing import TYPE_CHECKING
@@ -26,10 +26,11 @@ from flask.ctx import AppContext
from superset.extensions import db
from tests.integration_tests.key_value.commands.fixtures import (
ID_KEY,
+ JSON_CODEC,
+ JSON_VALUE,
key_value_entry,
RESOURCE,
UUID_KEY,
- VALUE,
)
if TYPE_CHECKING:
@@ -39,8 +40,8 @@ if TYPE_CHECKING:
def test_get_id_entry(app_context: AppContext, key_value_entry: KeyValueEntry) -> None:
from superset.key_value.commands.get import GetKeyValueCommand
- value = GetKeyValueCommand(resource=RESOURCE, key=ID_KEY).run()
- assert value == VALUE
+ value = GetKeyValueCommand(resource=RESOURCE, key=ID_KEY, codec=JSON_CODEC).run()
+ assert value == JSON_VALUE
def test_get_uuid_entry(
@@ -48,8 +49,8 @@ def test_get_uuid_entry(
) -> None:
from superset.key_value.commands.get import GetKeyValueCommand
- value = GetKeyValueCommand(resource=RESOURCE, key=UUID_KEY).run()
- assert value == VALUE
+ value = GetKeyValueCommand(resource=RESOURCE, key=UUID_KEY, codec=JSON_CODEC).run()
+ assert value == JSON_VALUE
def test_get_id_entry_missing(
@@ -58,7 +59,7 @@ def test_get_id_entry_missing(
) -> None:
from superset.key_value.commands.get import GetKeyValueCommand
- value = GetKeyValueCommand(resource=RESOURCE, key=456).run()
+ value = GetKeyValueCommand(resource=RESOURCE, key=456, codec=JSON_CODEC).run()
assert value is None
@@ -70,12 +71,12 @@ def test_get_expired_entry(app_context: AppContext) -> None:
id=678,
uuid=uuid.uuid4(),
resource=RESOURCE,
- value=pickle.dumps(VALUE),
+ value=bytes(json.dumps(JSON_VALUE), encoding="utf-8"),
expires_on=datetime.now() - timedelta(days=1),
)
db.session.add(entry)
db.session.commit()
- value = GetKeyValueCommand(resource=RESOURCE, key=ID_KEY).run()
+ value = GetKeyValueCommand(resource=RESOURCE, key=ID_KEY, codec=JSON_CODEC).run()
assert value is None
db.session.delete(entry)
db.session.commit()
@@ -90,12 +91,12 @@ def test_get_future_expiring_entry(app_context: AppContext) -> None:
id=id_,
uuid=uuid.uuid4(),
resource=RESOURCE,
- value=pickle.dumps(VALUE),
+ value=bytes(json.dumps(JSON_VALUE), encoding="utf-8"),
expires_on=datetime.now() + timedelta(days=1),
)
db.session.add(entry)
db.session.commit()
- value = GetKeyValueCommand(resource=RESOURCE, key=id_).run()
- assert value == VALUE
+ value = GetKeyValueCommand(resource=RESOURCE, key=id_, codec=JSON_CODEC).run()
+ assert value == JSON_VALUE
db.session.delete(entry)
db.session.commit()
diff --git a/tests/integration_tests/key_value/commands/update_test.py b/tests/integration_tests/key_value/commands/update_test.py
index 8eb03b4eda..2c0fc3e31d 100644
--- a/tests/integration_tests/key_value/commands/update_test.py
+++ b/tests/integration_tests/key_value/commands/update_test.py
@@ -16,9 +16,8 @@
# under the License.
from __future__ import annotations
-import pickle
+import json
from typing import TYPE_CHECKING
-from uuid import UUID
from flask.ctx import AppContext
from flask_appbuilder.security.sqla.models import User
@@ -28,6 +27,7 @@ from superset.utils.core import override_user
from tests.integration_tests.key_value.commands.fixtures import (
admin,
ID_KEY,
+ JSON_CODEC,
key_value_entry,
RESOURCE,
UUID_KEY,
@@ -53,11 +53,12 @@ def test_update_id_entry(
resource=RESOURCE,
key=ID_KEY,
value=NEW_VALUE,
+ codec=JSON_CODEC,
).run()
assert key is not None
assert key.id == ID_KEY
entry = db.session.query(KeyValueEntry).filter_by(id=ID_KEY).autoflush(False).one()
- assert pickle.loads(entry.value) == NEW_VALUE
+ assert json.loads(entry.value) == NEW_VALUE
assert entry.changed_by_fk == admin.id
@@ -74,13 +75,14 @@ def test_update_uuid_entry(
resource=RESOURCE,
key=UUID_KEY,
value=NEW_VALUE,
+ codec=JSON_CODEC,
).run()
assert key is not None
assert key.uuid == UUID_KEY
entry = (
db.session.query(KeyValueEntry).filter_by(uuid=UUID_KEY).autoflush(False).one()
)
- assert pickle.loads(entry.value) == NEW_VALUE
+ assert json.loads(entry.value) == NEW_VALUE
assert entry.changed_by_fk == admin.id
@@ -92,5 +94,6 @@ def test_update_missing_entry(app_context: AppContext, admin: User) -> None:
resource=RESOURCE,
key=456,
value=NEW_VALUE,
+ codec=JSON_CODEC,
).run()
assert key is None
diff --git a/tests/integration_tests/key_value/commands/upsert_test.py b/tests/integration_tests/key_value/commands/upsert_test.py
index e5cd27e3a6..c26b66d02e 100644
--- a/tests/integration_tests/key_value/commands/upsert_test.py
+++ b/tests/integration_tests/key_value/commands/upsert_test.py
@@ -16,9 +16,8 @@
# under the License.
from __future__ import annotations
-import pickle
+import json
from typing import TYPE_CHECKING
-from uuid import UUID
from flask.ctx import AppContext
from flask_appbuilder.security.sqla.models import User
@@ -28,6 +27,7 @@ from superset.utils.core import override_user
from tests.integration_tests.key_value.commands.fixtures import (
admin,
ID_KEY,
+ JSON_CODEC,
key_value_entry,
RESOURCE,
UUID_KEY,
@@ -53,13 +53,14 @@ def test_upsert_id_entry(
resource=RESOURCE,
key=ID_KEY,
value=NEW_VALUE,
+ codec=JSON_CODEC,
).run()
assert key is not None
assert key.id == ID_KEY
entry = (
db.session.query(KeyValueEntry).filter_by(id=int(ID_KEY)).autoflush(False).one()
)
- assert pickle.loads(entry.value) == NEW_VALUE
+ assert json.loads(entry.value) == NEW_VALUE
assert entry.changed_by_fk == admin.id
@@ -76,13 +77,14 @@ def test_upsert_uuid_entry(
resource=RESOURCE,
key=UUID_KEY,
value=NEW_VALUE,
+ codec=JSON_CODEC,
).run()
assert key is not None
assert key.uuid == UUID_KEY
entry = (
db.session.query(KeyValueEntry).filter_by(uuid=UUID_KEY).autoflush(False).one()
)
- assert pickle.loads(entry.value) == NEW_VALUE
+ assert json.loads(entry.value) == NEW_VALUE
assert entry.changed_by_fk == admin.id
@@ -95,6 +97,7 @@ def test_upsert_missing_entry(app_context: AppContext, admin: User) -> None:
resource=RESOURCE,
key=456,
value=NEW_VALUE,
+ codec=JSON_CODEC,
).run()
assert key is not None
assert key.id == 456
[superset] 09/12: fix: Native time range filter in legacy charts (#23865)
Posted by el...@apache.org.
This is an automated email from the ASF dual-hosted git repository.
elizabeth pushed a commit to branch 2.1
in repository https://gitbox.apache.org/repos/asf/superset.git
commit 7155dcd5df03031fcf45f4300f425f7b85751dea
Author: Kamil Gabryjelski <ka...@gmail.com>
AuthorDate: Mon May 1 18:57:20 2023 +0200
fix: Native time range filter in legacy charts (#23865)
---
.../plugins/legacy-plugin-chart-calendar/src/controlPanel.ts | 8 ++++++--
superset/utils/core.py | 8 ++++++++
2 files changed, 14 insertions(+), 2 deletions(-)
diff --git a/superset-frontend/plugins/legacy-plugin-chart-calendar/src/controlPanel.ts b/superset-frontend/plugins/legacy-plugin-chart-calendar/src/controlPanel.ts
index 9071a278ee..6cdd79162b 100644
--- a/superset-frontend/plugins/legacy-plugin-chart-calendar/src/controlPanel.ts
+++ b/superset-frontend/plugins/legacy-plugin-chart-calendar/src/controlPanel.ts
@@ -22,12 +22,16 @@ import {
D3_FORMAT_DOCS,
D3_TIME_FORMAT_OPTIONS,
getStandardizedControls,
- sections,
} from '@superset-ui/chart-controls';
const config: ControlPanelConfig = {
controlPanelSections: [
- sections.legacyRegularTime,
+ {
+ label: t('Time'),
+ expanded: true,
+ description: t('Time related form attributes'),
+ controlSetRows: [['granularity_sqla'], ['time_range']],
+ },
{
label: t('Query'),
expanded: true,
diff --git a/superset/utils/core.py b/superset/utils/core.py
index 22ff3f8be4..8cf73b84aa 100644
--- a/superset/utils/core.py
+++ b/superset/utils/core.py
@@ -1159,6 +1159,14 @@ def merge_extra_form_data(form_data: Dict[str, Any]) -> None:
for fltr in append_filters
if fltr
)
+ if (
+ form_data.get("time_range")
+ and not form_data.get("granularity")
+ and not form_data.get("granularity_sqla")
+ ):
+ for adhoc_filter in form_data.get("adhoc_filters", []):
+ if adhoc_filter.get("operator") == "TEMPORAL_RANGE":
+ adhoc_filter["comparator"] = form_data["time_range"]
def merge_extra_filters(form_data: Dict[str, Any]) -> None:
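In isolation, the new branch added to `merge_extra_form_data` behaves as follows. This is a simplified sketch (the wrapper function name is hypothetical; the conditional and loop are copied from the hunk above): when a chart has a `time_range` but no granularity set, the time range is pushed into any `TEMPORAL_RANGE` adhoc filter, which is how the native time range filter reaches legacy charts:

```python
from typing import Any, Dict


def apply_time_range_to_temporal_filters(form_data: Dict[str, Any]) -> None:
    # mirror of the check added in merge_extra_form_data: only fire when a
    # time_range exists and neither granularity field is populated
    if (
        form_data.get("time_range")
        and not form_data.get("granularity")
        and not form_data.get("granularity_sqla")
    ):
        for adhoc_filter in form_data.get("adhoc_filters", []):
            if adhoc_filter.get("operator") == "TEMPORAL_RANGE":
                adhoc_filter["comparator"] = form_data["time_range"]


form_data = {
    "time_range": "Last week",
    "adhoc_filters": [{"operator": "TEMPORAL_RANGE", "comparator": "No filter"}],
}
apply_time_range_to_temporal_filters(form_data)
assert form_data["adhoc_filters"][0]["comparator"] == "Last week"
```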
[superset] 12/12: fix: Filter values are not updating when dependencies are set (#23566)
Posted by el...@apache.org.
commit ccd456679e87a25087e9a32a6b3f880616fed421
Author: Michael S. Molina <70...@users.noreply.github.com>
AuthorDate: Mon Apr 3 17:20:00 2023 -0300
fix: Filter values are not updating when dependencies are set (#23566)
(cherry picked from commit 3bc496040d2834e2ed20086a8973c53d30419a89)
---
superset-frontend/src/components/Select/Select.tsx | 23 +++++++++++-----------
superset-frontend/src/components/Select/types.ts | 5 +++++
.../FiltersConfigForm/FiltersConfigForm.tsx | 2 +-
.../components/Select/SelectFilterPlugin.tsx | 12 +++--------
4 files changed, 21 insertions(+), 21 deletions(-)
diff --git a/superset-frontend/src/components/Select/Select.tsx b/superset-frontend/src/components/Select/Select.tsx
index 11f66d8dba..c2e47e3cc4 100644
--- a/superset-frontend/src/components/Select/Select.tsx
+++ b/superset-frontend/src/components/Select/Select.tsx
@@ -83,6 +83,7 @@ const Select = forwardRef(
{
allowClear,
allowNewOptions = false,
+ allowSelectAll = true,
ariaLabel,
filterOption = true,
header = null,
@@ -195,10 +196,17 @@ const Select = forwardRef(
const selectAllEnabled = useMemo(
() =>
!isSingleMode &&
+ allowSelectAll &&
selectOptions.length > 0 &&
enabledOptions.length > 1 &&
!inputValue,
- [isSingleMode, selectOptions.length, enabledOptions.length, inputValue],
+ [
+ isSingleMode,
+ allowSelectAll,
+ selectOptions.length,
+ enabledOptions.length,
+ inputValue,
+ ],
);
const selectAllMode = useMemo(
@@ -360,9 +368,8 @@ const Select = forwardRef(
useEffect(() => {
// if all values are selected, add select all to value
if (
- !isSingleMode &&
- ensureIsArray(value).length === selectAllEligible.length &&
- selectOptions.length > 0
+ selectAllEnabled &&
+ ensureIsArray(value).length === selectAllEligible.length
) {
setSelectValue(
labelInValue
@@ -373,13 +380,7 @@ const Select = forwardRef(
] as AntdLabeledValue[]),
);
}
- }, [
- value,
- isSingleMode,
- labelInValue,
- selectAllEligible.length,
- selectOptions.length,
- ]);
+ }, [labelInValue, selectAllEligible.length, selectAllEnabled, value]);
useEffect(() => {
const checkSelectAll = ensureIsArray(selectValue).some(
diff --git a/superset-frontend/src/components/Select/types.ts b/superset-frontend/src/components/Select/types.ts
index 6e4c7f072d..6ab9d7478c 100644
--- a/superset-frontend/src/components/Select/types.ts
+++ b/superset-frontend/src/components/Select/types.ts
@@ -155,6 +155,11 @@ export interface BaseSelectProps extends AntdExposedProps {
}
export interface SelectProps extends BaseSelectProps {
+ /**
+ * It enables the user to select all options.
+ * True by default.
+ * */
+ allowSelectAll?: boolean;
/**
* It defines the options of the Select.
* The options can be static, an array of options.
diff --git a/superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/FiltersConfigForm.tsx b/superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/FiltersConfigForm.tsx
index 0b6a33f069..3434107978 100644
--- a/superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/FiltersConfigForm.tsx
+++ b/superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigForm/FiltersConfigForm.tsx
@@ -368,7 +368,7 @@ const FiltersConfigForm = (
const formFilter = formValues || undoFormValues || defaultFormFilter;
const dependencies: string[] =
- formFilter?.dependencies || filterToEdit?.cascadeParentIds;
+ formFilter?.dependencies || filterToEdit?.cascadeParentIds || [];
const nativeFilterItems = getChartMetadataRegistry().items;
const nativeFilterVizTypes = Object.entries(nativeFilterItems)
diff --git a/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx b/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx
index d3cb2b4dea..7de9efcfaf 100644
--- a/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx
+++ b/superset-frontend/src/filters/components/Select/SelectFilterPlugin.tsx
@@ -121,7 +121,6 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
}),
[],
);
- const [initialData, setInitialData] = useState<typeof data>([]);
const updateDataMask = useCallback(
(values: SelectValue) => {
@@ -239,7 +238,7 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
}, [filterState.validateMessage, filterState.validateStatus]);
const options = useMemo(() => {
- const allOptions = [...data, ...initialData];
+ const allOptions = [...data];
const uniqueOptions = uniqWith(allOptions, isEqual);
const selectOptions: { label: string; value: DataRecordValue }[] = [];
uniqueOptions.forEach(row => {
@@ -250,7 +249,7 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
});
});
return selectOptions;
- }, [data, initialData, datatype, groupby, labelFormatter]);
+ }, [data, datatype, groupby, labelFormatter]);
const sortComparator = useCallback(
(a: AntdLabeledValue, b: AntdLabeledValue) => {
@@ -301,12 +300,6 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
setDataMask(dataMask);
}, [JSON.stringify(dataMask)]);
- useEffect(() => {
- if (data.length && !initialData.length) {
- setInitialData(data);
- }
- }, [data, initialData.length]);
-
return (
<FilterPluginStyle height={height} width={width}>
<StyledFormItem
@@ -316,6 +309,7 @@ export default function PluginFilterSelect(props: PluginFilterSelectProps) {
<Select
allowClear
allowNewOptions
+ allowSelectAll={!searchAllOptions}
// @ts-ignore
value={filterState.value || []}
disabled={isDisabled}
[superset] 01/12: fix: handle comments in `has_table_query` (#23882)
commit 60a1652be23f86d60f14b279a82d696c97fd6e5e
Author: Beto Dealmeida <ro...@dealmeida.net>
AuthorDate: Mon May 1 11:06:54 2023 -0700
fix: handle comments in `has_table_query` (#23882)
---
superset/sql_parse.py | 4 +++-
tests/unit_tests/sql_parse_tests.py | 8 ++++++++
2 files changed, 11 insertions(+), 1 deletion(-)
diff --git a/superset/sql_parse.py b/superset/sql_parse.py
index ab2f044172..a3c1af87b0 100644
--- a/superset/sql_parse.py
+++ b/superset/sql_parse.py
@@ -509,6 +509,9 @@ def has_table_query(token_list: TokenList) -> bool:
"""
state = InsertRLSState.SCANNING
for token in token_list.tokens:
+ # Ignore comments
+ if isinstance(token, sqlparse.sql.Comment):
+ continue
# Recurse into child token list
if isinstance(token, TokenList) and has_table_query(token):
@@ -607,7 +610,6 @@ def insert_rls(
rls: Optional[TokenList] = None
state = InsertRLSState.SCANNING
for token in token_list.tokens:
-
# Recurse into child token list
if isinstance(token, TokenList):
i = token_list.tokens.index(token)
diff --git a/tests/unit_tests/sql_parse_tests.py b/tests/unit_tests/sql_parse_tests.py
index ba3da69aae..d6939fa080 100644
--- a/tests/unit_tests/sql_parse_tests.py
+++ b/tests/unit_tests/sql_parse_tests.py
@@ -1195,6 +1195,14 @@ def test_sqlparse_issue_652():
("extract(HOUR from from_unixtime(hour_ts)", False),
("(SELECT * FROM table)", True),
("(SELECT COUNT(DISTINCT name) from birth_names)", True),
+ (
+ "(SELECT table_name FROM information_schema.tables WHERE table_name LIKE '%user%' LIMIT 1)",
+ True,
+ ),
+ (
+ "(SELECT table_name FROM /**/ information_schema.tables WHERE table_name LIKE '%user%' LIMIT 1)",
+ True,
+ ),
],
)
def test_has_table_query(sql: str, expected: bool) -> None:
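The idea behind this fix can be modeled outside sqlparse: `has_table_query` walks a token tree with a small state machine, and a comment token such as `/**/` sitting between `FROM` and the table name would otherwise derail the scan, letting a table-reading query slip past RLS checks. A simplified model (hypothetical TypeScript types; the real implementation is the Python code above, built on sqlparse's `TokenList` and `Comment` classes):

```typescript
// Simplified model of comment-skipping token scanning. The token shape and
// the single-flag state machine are assumptions for illustration only; the
// real code uses sqlparse's richer InsertRLSState machine.
type Token =
  | { kind: "word"; value: string }
  | { kind: "comment"; value: string }
  | { kind: "group"; tokens: Token[] };

function hasTableQuery(tokens: Token[]): boolean {
  let sawFrom = false;
  for (const token of tokens) {
    if (token.kind === "comment") continue; // the fix: ignore comments
    if (token.kind === "group") {
      // Recurse into child token lists, mirroring the Python code
      if (hasTableQuery(token.tokens)) return true;
      continue;
    }
    if (sawFrom) return true; // an identifier right after FROM reads a table
    if (token.value.toUpperCase() === "FROM") sawFrom = true;
  }
  return false;
}
```

With the `continue` removed, the comment token would be treated as the thing following `FROM`, which is exactly the mis-parse the new `/**/ information_schema.tables` test case guards against.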
[superset] 02/12: lint
commit 5df0b7ad57fc8e3dd8e8cdd764905f2b64015174
Author: Elizabeth Thompson <es...@gmail.com>
AuthorDate: Tue Jun 6 15:58:54 2023 -0700
lint
---
superset/config.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/superset/config.py b/superset/config.py
index f24b040e0f..af46d489c9 100644
--- a/superset/config.py
+++ b/superset/config.py
@@ -1205,6 +1205,7 @@ def SQL_QUERY_MUTATOR( # pylint: disable=invalid-name,unused-argument
# functionality for both the SQL_Lab and Charts.
MUTATE_AFTER_SPLIT = False
+
# This allows for a user to add header data to any outgoing emails. For example,
# if you need to include metadata in the header or you want to change the specifications
# of the email title, header, or sender.
@@ -1584,7 +1585,7 @@ elif importlib.util.find_spec("superset_config") and not is_test():
try:
# pylint: disable=import-error,wildcard-import,unused-wildcard-import
import superset_config
- from superset_config import * # type: ignore
+ from superset_config import * # noqa
print(f"Loaded your LOCAL configuration at [{superset_config.__file__}]")
except Exception: