Posted to commits@superset.apache.org by mi...@apache.org on 2022/04/26 16:17:37 UTC

[superset] branch master updated: chore(docs): Spelling (#19675)

This is an automated email from the ASF dual-hosted git repository.

michaelsmolina pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/superset.git


The following commit(s) were added to refs/heads/master by this push:
     new c32c505742 chore(docs): Spelling (#19675)
c32c505742 is described below

commit c32c505742582c42b8228f07ff948bd7e5ae2676
Author: Josh Soref <21...@users.noreply.github.com>
AuthorDate: Tue Apr 26 12:17:15 2022 -0400

    chore(docs): Spelling (#19675)
    
    * spelling: adding
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: aggregate
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: avoid
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: blacklist
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: cached
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: discontinue
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: exhaustive
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: from
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: github
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: hybrid
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: implicit
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: interim
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: introduced
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: javascript
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: logstash
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: metadata
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: password
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: recommended
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: redshift
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: refactored
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: referencing
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: sqlite
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: the
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: thumbnails
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: undoes
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    * spelling: very
    
    Signed-off-by: Josh Soref <js...@users.noreply.github.com>
    
    Co-authored-by: Josh Soref <js...@users.noreply.github.com>
---
 CODE_OF_CONDUCT.md                                      |   2 +-
 CONTRIBUTING.md                                         |   8 ++++----
 RELEASING/README.md                                     |   6 +++---
 RELEASING/changelog.py                                  |   2 +-
 RELEASING/release-notes-0-38/README.md                  |   2 +-
 RELEASING/release-notes-1-2/README.md                   |   2 +-
 RELEASING/release-notes-1-4/README.md                   |   2 +-
 UPDATING.md                                             |  16 ++++++++--------
 docker-compose.yml                                      |   2 +-
 docs/docs/contributing/contributing-page.mdx            |   4 ++--
 docs/docs/contributing/pull-request-guidelines.mdx      |   2 +-
 .../creating-your-first-dashboard.mdx                   |   4 ++--
 docs/docs/creating-charts-dashboards/exploring-data.mdx |   2 +-
 docs/docs/databases/drill.mdx                           |   4 ++--
 docs/docs/databases/elasticsearch.mdx                   |   2 +-
 docs/docs/installation/alerts-reports.mdx               |   2 +-
 docs/docs/installation/configuring-superset.mdx         |   4 ++--
 .../installing-superset-using-docker-compose.mdx        |   2 +-
 docs/docs/installation/sql-templating.mdx               |   6 +++---
 docs/docs/intro.mdx                                     |   2 +-
 docs/docs/miscellaneous/chart-params.mdx                |   2 +-
 docs/src/resources/data.js                              |   4 ++--
 docs/static/img/databases/{sqllite.jpg => sqlite.jpg}   | Bin
 docs/static/img/databases/{sqllite.png => sqlite.png}   | Bin
 .../superset-ui-core/src/number-format/README.md        |   2 +-
 .../src/components/FilterableTable/FilterableTable.tsx  |   2 +-
 superset/charts/post_processing.py                      |   2 +-
 .../96e99fb176a0_add_import_mixing_to_saved_query.py    |   2 +-
 .../b56500de1855_add_uuid_column_to_import_mixin.py     |   2 +-
 .../versions/c501b7c653a3_add_missing_uuid_column.py    |   2 +-
 superset/utils/pandas_postprocessing/pivot.py           |   2 +-
 31 files changed, 48 insertions(+), 48 deletions(-)

diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
index e49cf0a32c..bee3a24a1e 100644
--- a/CODE_OF_CONDUCT.md
+++ b/CODE_OF_CONDUCT.md
@@ -119,7 +119,7 @@ If you decide to join the [Community Slack](https://join.slack.com/t/apache-supe
 
 **3. Ask thoughtful questions.**
 
-- We’re all here to help each other out. The best way to get help is by investing effort into your questions. First check and see if your question is answered in [the Superset documentation](https://superset.apache.org/faq.html) or on [Stack Overflow](https://stackoverflow.com/search?q=apache+superset). You can also check [Github issues](https://github.com/apache/superset/issues) to see if your question or feature request has been submitted before. Then, use Slack search to see if your q [...]
+- We’re all here to help each other out. The best way to get help is by investing effort into your questions. First check and see if your question is answered in [the Superset documentation](https://superset.apache.org/faq.html) or on [Stack Overflow](https://stackoverflow.com/search?q=apache+superset). You can also check [GitHub issues](https://github.com/apache/superset/issues) to see if your question or feature request has been submitted before. Then, use Slack search to see if your q [...]
 
 - The steps you’ve already taken
 - Relevant details presented cleanly (text stacktraces, formatted markdown, or screenshots. Please don’t paste large blocks of code unformatted or post photos of your screen from your phone)
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 865d4b62d6..738f018873 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -116,7 +116,7 @@ Here's a list of repositories that contain Superset-related packages:
   the [superset-frontend](https://github.com/apache/superset/tree/master/superset-frontend)
   folder.
 - [github.com/apache-superset](https://github.com/apache-superset) is the
-  Github organization under which we manage Superset-related
+  GitHub organization under which we manage Superset-related
   small tools, forks and Superset-related experimental ideas.
 
 ## Types of Contributions
@@ -209,7 +209,7 @@ Finally, never submit a PR that will put master branch in broken state. If the P
   - `chore` (updating tasks etc; no application logic change)
   - `perf` (performance-related change)
   - `build` (build tooling, Docker configuration change)
-  - `ci` (test runner, Github Actions workflow changes)
+  - `ci` (test runner, GitHub Actions workflow changes)
   - `other` (changes that don't correspond to the above -- should be rare!)
   - Examples:
     - `feat: export charts as ZIP files`
@@ -488,7 +488,7 @@ To bring all dependencies up to date as per the restrictions defined in `setup.p
 $ pip-compile-multi
 ```
 
-This should be done periodically, but it is rcommended to do thorough manual testing of the application to ensure no breaking changes have been introduced that aren't caught by the unit and integration tests.
+This should be done periodically, but it is recommended to do thorough manual testing of the application to ensure no breaking changes have been introduced that aren't caught by the unit and integration tests.
 
 #### Logging to the browser console
 
@@ -661,7 +661,7 @@ We use [Pylint](https://pylint.org/) for linting which can be invoked via:
 tox -e pylint
 ```
 
-In terms of best practices please advoid blanket disablement of Pylint messages globally (via `.pylintrc`) or top-level within the file header, albeit there being a few exceptions. Disablement should occur inline as it prevents masking issues and provides context as to why said message is disabled.
+In terms of best practices please avoid blanket disablement of Pylint messages globally (via `.pylintrc`) or top-level within the file header, albeit there being a few exceptions. Disablement should occur inline as it prevents masking issues and provides context as to why said message is disabled.
 
 Additionally, the Python code is auto-formatted using [Black](https://github.com/python/black) which
 is configured as a pre-commit hook. There are also numerous [editor integrations](https://black.readthedocs.io/en/stable/integrations/editors.html)
diff --git a/RELEASING/README.md b/RELEASING/README.md
index 32fb1aef34..46913d55ec 100644
--- a/RELEASING/README.md
+++ b/RELEASING/README.md
@@ -300,7 +300,7 @@ with the changes on `CHANGELOG.md` and `UPDATING.md`.
 ### Publishing a Convenience Release to PyPI
 
 Using the final release tarball, unpack it and run `./pypi_push.sh`.
-This script will build the Javascript bundle and echo the twine command
+This script will build the JavaScript bundle and echo the twine command
 allowing you to publish to PyPI. You may need to ask a fellow committer to grant
 you access to it if you don't have access already. Make sure to create
 an account first if you don't have one, and reference your username
@@ -315,9 +315,9 @@ Once it's all done, an [ANNOUNCE] thread announcing the release to the dev@ mail
 python send_email.py announce
 ```
 
-### Github Release
+### GitHub Release
 
-Finally, so the Github UI reflects the latest release, you should create a release from the
+Finally, so the GitHub UI reflects the latest release, you should create a release from the
 tag corresponding with the new version. Go to https://github.com/apache/superset/tags,
 click the 3-dot icon and select `Create Release`, paste the content of the ANNOUNCE thread in the
 release notes, and publish the new release.
diff --git a/RELEASING/changelog.py b/RELEASING/changelog.py
index 8e329b5fe0..5d4f346c8e 100644
--- a/RELEASING/changelog.py
+++ b/RELEASING/changelog.py
@@ -26,7 +26,7 @@ from click.core import Context
 try:
     from github import BadCredentialsException, Github, PullRequest, Repository
 except ModuleNotFoundError:
-    print("PyGithub is a required package for this script")
+    print("PyGitHub is a required package for this script")
     exit(1)
 
 SUPERSET_REPO = "apache/superset"
diff --git a/RELEASING/release-notes-0-38/README.md b/RELEASING/release-notes-0-38/README.md
index 483271fa25..817f27d771 100644
--- a/RELEASING/release-notes-0-38/README.md
+++ b/RELEASING/release-notes-0-38/README.md
@@ -167,7 +167,7 @@ Other features
 
 Alerts (send notification when a condition is met) ([Roadmap](https://github.com/apache-superset/superset-roadmap/issues/54))
 - feat: add test email functionality to SQL-based email alerts  (#[10476](https://github.com/apache/superset/pull/10476))
-- feat: refractored SQL-based alerting framework  (#[10605](https://github.com/apache/superset/pull/10605))
+- feat: refactored SQL-based alerting framework  (#[10605](https://github.com/apache/superset/pull/10605))
 
 
 [SIP-34] Proposal to establish a new design direction, system, and process for Superset ([SIP](https://github.com/apache/superset/issues/8976))
diff --git a/RELEASING/release-notes-1-2/README.md b/RELEASING/release-notes-1-2/README.md
index 2ae0a728f3..4e8895cbfe 100644
--- a/RELEASING/release-notes-1-2/README.md
+++ b/RELEASING/release-notes-1-2/README.md
@@ -87,7 +87,7 @@ Expanding the API has been an ongoing effort, and 1.2 introduces several new API
 - [14461](https://github.com/apache/superset/pull/14461) feat(native-filters): Auto apply changes in FiltersConfigModal (#14461) (@simcha90)
 - [13507](https://github.com/apache/superset/pull/13507) feat(native-filters): Filter set tabs (#13507) (@simcha90)
 - [14313](https://github.com/apache/superset/pull/14313) feat(native-filters): Implement adhoc filters and time picker in Range and Select native filters (#14313) (@Kamil Gabryjelski)
-- [14261](https://github.com/apache/superset/pull/14261) feat(native-filters): Show/Hide filter bar by metdata ff (#14261) (@simcha90)
+- [14261](https://github.com/apache/superset/pull/14261) feat(native-filters): Show/Hide filter bar by metadata ff (#14261) (@simcha90)
 - [13506](https://github.com/apache/superset/pull/13506) feat(native-filters): Update filter bar buttons (#13506) (@simcha90)
 - [14374](https://github.com/apache/superset/pull/14374) feat(native-filters): Use datasets in dashboard as default options for native filters (#14374) (@Kamil Gabryjelski)
 - [14314](https://github.com/apache/superset/pull/14314) feat(native-filters): add option to create value in select filter (#14314) (@Ville Brofeldt)
diff --git a/RELEASING/release-notes-1-4/README.md b/RELEASING/release-notes-1-4/README.md
index 9d3a7e99d3..267b122aba 100644
--- a/RELEASING/release-notes-1-4/README.md
+++ b/RELEASING/release-notes-1-4/README.md
@@ -19,7 +19,7 @@ under the License.
 
 # Release Notes for Superset 1.4
 
-Superset 1.4 focuses heavily on continuing to polish the core Superset experience. This release has a very very long list of fixes from across the community.
+Superset 1.4 focuses heavily on continuing to polish the core Superset experience. This release has a very long list of fixes from across the community.
 
 - [**User Experience**](#user-facing-features)
 - [**Database Experience**](#database-experience)
diff --git a/UPDATING.md b/UPDATING.md
index fb6565848a..e6cce38867 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -25,7 +25,7 @@ assists people when migrating to a new version.
 ## Next
 
 - [19046](https://github.com/apache/superset/pull/19046): Enables the drag and drop interface in Explore control panel by default. Flips `ENABLE_EXPLORE_DRAG_AND_DROP` and `ENABLE_DND_WITH_CLICK_UX` feature flags to `True`.
-- [18936](https://github.com/apache/superset/pull/18936): Removes legacy SIP-15 interm logic/flags—specifically the `SIP_15_ENABLED`, `SIP_15_GRACE_PERIOD_END`, `SIP_15_DEFAULT_TIME_RANGE_ENDPOINTS`, and `SIP_15_TOAST_MESSAGE` flags. Time range endpoints are no longer configurable and strictly adhere to the `[start, end)` paradigm, i.e., inclusive of the start and exclusive of the end. Additionally this change removes the now obsolete `time_range_endpoints` from the form-data and resulti [...]
+- [18936](https://github.com/apache/superset/pull/18936): Removes legacy SIP-15 interim logic/flags—specifically the `SIP_15_ENABLED`, `SIP_15_GRACE_PERIOD_END`, `SIP_15_DEFAULT_TIME_RANGE_ENDPOINTS`, and `SIP_15_TOAST_MESSAGE` flags. Time range endpoints are no longer configurable and strictly adhere to the `[start, end)` paradigm, i.e., inclusive of the start and exclusive of the end. Additionally this change removes the now obsolete `time_range_endpoints` from the form-data and result [...]
 - [19570](https://github.com/apache/superset/pull/19570): makes [sqloxide](https://pypi.org/project/sqloxide/) optional so the SIP-68 migration can be run on aarch64. If the migration is taking too long installing sqloxide manually should improve the performance.
 
 ### Breaking Changes
@@ -66,8 +66,8 @@ assists people when migrating to a new version.
 ### Other
 
 - [17589](https://github.com/apache/superset/pull/17589): It is now possible to limit access to users' recent activity data by setting the `ENABLE_BROAD_ACTIVITY_ACCESS` config flag to false, or customizing the `raise_for_user_activity_access` method in the security manager.
-- [17536](https://github.com/apache/superset/pull/17536): introduced a key-value endpoint to store dashboard filter state. This endpoint is backed by Flask-Caching and the default configuration assumes that the values will be stored in the file system. If you are already using another cache backend like Redis or Memchached, you'll probably want to change this setting in `superset_config.py`. The key is `FILTER_STATE_CACHE_CONFIG` and the available settings can be found in Flask-Caching [ [...]
-- [17882](https://github.com/apache/superset/pull/17882): introduced a key-value endpoint to store Explore form data. This endpoint is backed by Flask-Caching and the default configuration assumes that the values will be stored in the file system. If you are already using another cache backend like Redis or Memchached, you'll probably want to change this setting in `superset_config.py`. The key is `EXPLORE_FORM_DATA_CACHE_CONFIG` and the available settings can be found in Flask-Caching [ [...]
+- [17536](https://github.com/apache/superset/pull/17536): introduced a key-value endpoint to store dashboard filter state. This endpoint is backed by Flask-Caching and the default configuration assumes that the values will be stored in the file system. If you are already using another cache backend like Redis or Memcached, you'll probably want to change this setting in `superset_config.py`. The key is `FILTER_STATE_CACHE_CONFIG` and the available settings can be found in Flask-Caching [d [...]
+- [17882](https://github.com/apache/superset/pull/17882): introduced a key-value endpoint to store Explore form data. This endpoint is backed by Flask-Caching and the default configuration assumes that the values will be stored in the file system. If you are already using another cache backend like Redis or Memcached, you'll probably want to change this setting in `superset_config.py`. The key is `EXPLORE_FORM_DATA_CACHE_CONFIG` and the available settings can be found in Flask-Caching [d [...]
 
 ## 1.4.1
 
@@ -177,7 +177,7 @@ assists people when migrating to a new version.
 
 - [11575](https://github.com/apache/superset/pull/11575) The Row Level Security (RLS) config flag has been moved to a feature flag. To migrate, add `ROW_LEVEL_SECURITY: True` to the `FEATURE_FLAGS` dict in `superset_config.py`.
 
-- [11259](https://github.com/apache/superset/pull/11259): config flag ENABLE_REACT_CRUD_VIEWS has been set to `True` by default, set to `False` if you prefer to the vintage look and feel. However, we may discontine support on the vintage list view in the future.
+- [11259](https://github.com/apache/superset/pull/11259): config flag ENABLE_REACT_CRUD_VIEWS has been set to `True` by default, set to `False` if you prefer to the vintage look and feel. However, we may discontinue support on the vintage list view in the future.
 
 - [11244](https://github.com/apache/superset/pull/11244): The `REDUCE_DASHBOARD_BOOTSTRAP_PAYLOAD` feature flag has been removed after being set to True for multiple months.
 
@@ -190,7 +190,7 @@ assists people when migrating to a new version.
 
 ### Potential Downtime
 
-- [11920](https://github.com/apache/superset/pull/11920): Undos the DB migration from [11714](https://github.com/apache/superset/pull/11714) to prevent adding new columns to the logs table. Deploying a sha between these two PRs may result in locking your DB.
+- [11920](https://github.com/apache/superset/pull/11920): Undoes the DB migration from [11714](https://github.com/apache/superset/pull/11714) to prevent adding new columns to the logs table. Deploying a sha between these two PRs may result in locking your DB.
 
 - [11714](https://github.com/apache/superset/pull/11714): Logs
   significantly more analytics events (roughly double?), and when
@@ -219,7 +219,7 @@ assists people when migrating to a new version.
 
 - [10324](https://github.com/apache/superset/pull/10324): Facebook Prophet has been introduced as an optional dependency to add support for timeseries forecasting in the chart data API. To enable this feature, install Superset with the optional dependency `prophet` or directly `pip install fbprophet`.
 
-- [10320](https://github.com/apache/superset/pull/10320): References to blacklst/whitelist language have been replaced with more appropriate alternatives. All configs refencing containing `WHITE`/`BLACK` have been replaced with `ALLOW`/`DENY`. Affected config variables that need to be updated: `TIME_GRAIN_BLACKLIST`, `VIZ_TYPE_BLACKLIST`, `DRUID_DATA_SOURCE_BLACKLIST`.
+- [10320](https://github.com/apache/superset/pull/10320): References to blacklist/whitelist language have been replaced with more appropriate alternatives. All configs referencing containing `WHITE`/`BLACK` have been replaced with `ALLOW`/`DENY`. Affected config variables that need to be updated: `TIME_GRAIN_BLACKLIST`, `VIZ_TYPE_BLACKLIST`, `DRUID_DATA_SOURCE_BLACKLIST`.
 
 ## 0.37.1
 
@@ -233,7 +233,7 @@ assists people when migrating to a new version.
 
 - [10222](https://github.com/apache/superset/pull/10222): a change which changes how payloads are cached. Previous cached objects cannot be decoded and thus will be reloaded from source.
 
-- [10130](https://github.com/apache/superset/pull/10130): a change which deprecates the `dbs.perm` column in favor of SQLAlchemy [hybird attributes](https://docs.sqlalchemy.org/en/13/orm/extensions/hybrid.html).
+- [10130](https://github.com/apache/superset/pull/10130): a change which deprecates the `dbs.perm` column in favor of SQLAlchemy [hybrid attributes](https://docs.sqlalchemy.org/en/13/orm/extensions/hybrid.html).
 
 - [10034](https://github.com/apache/superset/pull/10034): a change which deprecates the public security manager `assert_datasource_permission`, `assert_query_context_permission`, `assert_viz_permission`, and `rejected_tables` methods with the `raise_for_access` method which also handles assertion logic for SQL tables.
 
@@ -326,7 +326,7 @@ assists people when migrating to a new version.
 - We're deprecating the concept of "restricted metric", this feature
   was not fully working anyhow.
 - [8117](https://github.com/apache/superset/pull/8117): If you are
-  using `ENABLE_PROXY_FIX = True`, review the newly-introducted variable,
+  using `ENABLE_PROXY_FIX = True`, review the newly-introduced variable,
   `PROXY_FIX_CONFIG`, which changes the proxy behavior in accordance with
   [Werkzeug](https://werkzeug.palletsprojects.com/en/0.15.x/middleware/proxy_fix/)
 
diff --git a/docker-compose.yml b/docker-compose.yml
index 907ca51129..2c814363e7 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -118,7 +118,7 @@ services:
     depends_on: *superset-depends-on
     user: *superset-user
     volumes: *superset-volumes
-    # Bump memory limit if processing selenium / thumbails on superset-worker
+    # Bump memory limit if processing selenium / thumbnails on superset-worker
     # mem_limit: 2038m
     # mem_reservation: 128M
 
diff --git a/docs/docs/contributing/contributing-page.mdx b/docs/docs/contributing/contributing-page.mdx
index f4f3cd6400..6e205bf0bb 100644
--- a/docs/docs/contributing/contributing-page.mdx
+++ b/docs/docs/contributing/contributing-page.mdx
@@ -13,8 +13,8 @@ which can be joined by anyone):
 
 - [Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org)
 - [Apache Superset Slack community](https://join.slack.com/t/apache-superset/shared_invite/zt-16jvzmoi8-sI7jKWp~xc2zYRe~NqiY9Q)
-- [Github issues and PR's](https://github.com/apache/superset/issues)
+- [GitHub issues and PR's](https://github.com/apache/superset/issues)
 
 More references:
 - [Comprehensive Tutorial for Contributing Code to Apache Superset](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
-- [CONTRIBUTING Guide on Github](https://github.com/apache/superset/blob/master/CONTRIBUTING.md)
+- [CONTRIBUTING Guide on GitHub](https://github.com/apache/superset/blob/master/CONTRIBUTING.md)
diff --git a/docs/docs/contributing/pull-request-guidelines.mdx b/docs/docs/contributing/pull-request-guidelines.mdx
index f37efd785e..4e2f823a97 100644
--- a/docs/docs/contributing/pull-request-guidelines.mdx
+++ b/docs/docs/contributing/pull-request-guidelines.mdx
@@ -41,7 +41,7 @@ Finally, never submit a PR that will put master branch in broken state. If the P
   - `chore` (updating tasks etc; no application logic change)
   - `perf` (performance-related change)
   - `build` (build tooling, Docker configuration change)
-  - `ci` (test runner, Github Actions workflow changes)
+  - `ci` (test runner, GitHub Actions workflow changes)
   - `other` (changes that don't correspond to the above -- should be rare!)
   - Examples:
     - `feat: export charts as ZIP files`
diff --git a/docs/docs/creating-charts-dashboards/creating-your-first-dashboard.mdx b/docs/docs/creating-charts-dashboards/creating-your-first-dashboard.mdx
index 39400a1c29..ecabf896f8 100644
--- a/docs/docs/creating-charts-dashboards/creating-your-first-dashboard.mdx
+++ b/docs/docs/creating-charts-dashboards/creating-your-first-dashboard.mdx
@@ -94,7 +94,7 @@ The Superset semantic layer can store 2 types of computed data:
 1. Virtual metrics: you can write SQL queries that aggregate values
 from multiple column (e.g. `SUM(recovered) / SUM(confirmed)`) and make them
 available as columns for (e.g. `recovery_rate`) visualization in Explore.
-Agggregate functions are allowed and encouraged for metrics.
+Aggregate functions are allowed and encouraged for metrics.
 
 <img src={useBaseUrl("/img/tutorial/tutorial_sql_metric.png" )} />
 
@@ -182,7 +182,7 @@ Access to dashboards is managed via owners (users that have edit permissions to
 
 Non-owner users access can be managed two different ways:
 
-1. Dataset permissions - if you add to the relevant role permissions to datasets it automatically grants implict access to all dashboards that uses those permitted datasets
+1. Dataset permissions - if you add to the relevant role permissions to datasets it automatically grants implicit access to all dashboards that uses those permitted datasets
 2. Dashboard roles - if you enable **DASHBOARD_RBAC** feature flag  then you be able to manage which roles can access the dashboard
 - Having dashboard access implicitly grants read access to the associated datasets, therefore
 all charts will load their data even if feature flag is turned on and no roles assigned
diff --git a/docs/docs/creating-charts-dashboards/exploring-data.mdx b/docs/docs/creating-charts-dashboards/exploring-data.mdx
index 65f7cae737..0386b23842 100644
--- a/docs/docs/creating-charts-dashboards/exploring-data.mdx
+++ b/docs/docs/creating-charts-dashboards/exploring-data.mdx
@@ -40,7 +40,7 @@ tick the checkbox for **Allow Data Upload**. End by clicking the **Save** button
 ### Loading CSV Data
 
 Download the CSV dataset to your computer from
-[Github](https://raw.githubusercontent.com/apache-superset/examples-data/master/tutorial_flights.csv).
+[GitHub](https://raw.githubusercontent.com/apache-superset/examples-data/master/tutorial_flights.csv).
 In the Superset menu, select **Data ‣ Upload a CSV**.
 
 <img src={useBaseUrl("/img/tutorial/upload_a_csv.png" )} />
diff --git a/docs/docs/databases/drill.mdx b/docs/docs/databases/drill.mdx
index 303eb55cbf..9006c8f98c 100644
--- a/docs/docs/databases/drill.mdx
+++ b/docs/docs/databases/drill.mdx
@@ -36,12 +36,12 @@ Connecting to Drill through JDBC is more complicated and we recommend following
 The connection string looks like:
 
 ```
-drill+jdbc://<username>:<passsword>@<host>:<port>
+drill+jdbc://<username>:<password>@<host>:<port>
 ```
 
 ### ODBC
 
 We recommend reading the
 [Apache Drill documentation](https://drill.apache.org/docs/installing-the-driver-on-linux/) and read
-the [Github README](https://github.com/JohnOmernik/sqlalchemy-drill#usage-with-odbc) to learn how to
+the [GitHub README](https://github.com/JohnOmernik/sqlalchemy-drill#usage-with-odbc) to learn how to
 work with Drill through ODBC.
diff --git a/docs/docs/databases/elasticsearch.mdx b/docs/docs/databases/elasticsearch.mdx
index 519bc370ed..70b7f8f685 100644
--- a/docs/docs/databases/elasticsearch.mdx
+++ b/docs/docs/databases/elasticsearch.mdx
@@ -46,7 +46,7 @@ POST /_aliases
 }
 ```
 
-Then register your table with the alias name logstasg_all
+Then register your table with the alias name logstash_all
 
 **Time zone**
 
diff --git a/docs/docs/installation/alerts-reports.mdx b/docs/docs/installation/alerts-reports.mdx
index 8ab37cc905..a7491ad03e 100644
--- a/docs/docs/installation/alerts-reports.mdx
+++ b/docs/docs/installation/alerts-reports.mdx
@@ -387,7 +387,7 @@ THUMBNAIL_SELENIUM_USER = 'username_with_permission_to_access_dashboards'
 
 ### Schedule Reports
 
-You can optionally allow your users to schedule queries directly in SQL Lab. This is done by addding
+You can optionally allow your users to schedule queries directly in SQL Lab. This is done by adding
 extra metadata to saved queries, which are then picked up by an external scheduled (like
 [Apache Airflow](https://airflow.apache.org/)).
 
diff --git a/docs/docs/installation/configuring-superset.mdx b/docs/docs/installation/configuring-superset.mdx
index 1384b62741..66c89c5806 100644
--- a/docs/docs/installation/configuring-superset.mdx
+++ b/docs/docs/installation/configuring-superset.mdx
@@ -125,7 +125,7 @@ If you're not using Gunicorn, you may want to disable the use of `flask-compress
 If you are running superset behind a load balancer or reverse proxy (e.g. NGINX or ELB on AWS), you
 may need to utilize a healthcheck endpoint so that your load balancer knows if your superset
 instance is running. This is provided at `/health` which will return a 200 response containing “OK”
-if the the webserver is running.
+if the webserver is running.
 
 If the load balancer is inserting `X-Forwarded-For/X-Forwarded-Proto` headers, you should set
 `ENABLE_PROXY_FIX = True` in the superset config file (`superset_config.py`) to extract and use the
@@ -140,7 +140,7 @@ RequestHeader set X-Forwarded-Proto "https"
 
 ### Custom OAuth2 Configuration
 
-Beyond FAB supported providers (Github, Twitter, LinkedIn, Google, Azure, etc), its easy to connect
+Beyond FAB supported providers (GitHub, Twitter, LinkedIn, Google, Azure, etc), its easy to connect
 Superset with other OAuth2 Authorization Server implementations that support “code” authorization.
 
 Make sure the pip package [`Authlib`](https://authlib.org/) is installed on the webserver.
diff --git a/docs/docs/installation/installing-superset-using-docker-compose.mdx b/docs/docs/installation/installing-superset-using-docker-compose.mdx
index ced6ba5660..8daaa2e630 100644
--- a/docs/docs/installation/installing-superset-using-docker-compose.mdx
+++ b/docs/docs/installation/installing-superset-using-docker-compose.mdx
@@ -38,7 +38,7 @@ of that VM. We recommend assigning at least 8GB of RAM to the virtual machine as
 provisioning a hard drive of at least 40GB, so that there will be enough space for both the OS and
 all of the required dependencies. Docker Desktop [recently added support for Windows Subsystem for Linux (WSL) 2](https://docs.docker.com/docker-for-windows/wsl/), which may be another option.
 
-### 2. Clone Superset's Github repository
+### 2. Clone Superset's GitHub repository
 
 [Clone Superset's repo](https://github.com/apache/superset) in your terminal with the
 following command:
diff --git a/docs/docs/installation/sql-templating.mdx b/docs/docs/installation/sql-templating.mdx
index 2a80f0fbf6..8908d39f02 100644
--- a/docs/docs/installation/sql-templating.mdx
+++ b/docs/docs/installation/sql-templating.mdx
@@ -119,7 +119,7 @@ In this section, we'll walk through the pre-defined Jinja macros in Superset.
 
 The `{{ current_username() }}` macro returns the username of the currently logged in user.
 
-If you have caching enabled in your Superset configuration, then by default the the `username` value will be used
+If you have caching enabled in your Superset configuration, then by default the `username` value will be used
 by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
 cache hit in the future and Superset can retrieve cached data.
 
@@ -134,7 +134,7 @@ cache key by adding the following parameter to your Jinja code:
 
 The `{{ current_user_id() }}` macro returns the user_id of the currently logged in user.
 
-If you have caching enabled in your Superset configuration, then by default the the `user_id` value will be used
+If you have caching enabled in your Superset configuration, then by default the `user_id` value will be used
 by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
 cache hit in the future and Superset can retrieve cached data.
 
@@ -182,7 +182,7 @@ Here's a concrete example:
 **Explicitly Including Values in Cache Key**
 
 The `{{ cache_key_wrapper() }}` function explicitly instructs Superset to add a value to the
-accumulated list of values used in the the calculation of the cache key.
+accumulated list of values used in the calculation of the cache key.
 
 This function is only needed when you want to wrap your own custom function return values
 in the cache key. You can gain more context
diff --git a/docs/docs/intro.mdx b/docs/docs/intro.mdx
index c4d0a13844..eb8a2f0a61 100644
--- a/docs/docs/intro.mdx
+++ b/docs/docs/intro.mdx
@@ -19,7 +19,7 @@ Here are a **few different ways you can get started with Superset**:
   using [Docker Compose](installation/installing-superset-using-docker-compose)
 - Download the [Docker image](https://hub.docker.com/r/apache/superset) from Dockerhub
 - Install the latest version of Superset
-  [from Github](https://github.com/apache/superset/tree/latest)
+  [from GitHub](https://github.com/apache/superset/tree/latest)
 
 Superset provides:
 
diff --git a/docs/docs/miscellaneous/chart-params.mdx b/docs/docs/miscellaneous/chart-params.mdx
index 0bd94db226..e7bef0e4a4 100644
--- a/docs/docs/miscellaneous/chart-params.mdx
+++ b/docs/docs/miscellaneous/chart-params.mdx
@@ -9,7 +9,7 @@ version: 1
 
 Chart parameters are stored as a JSON encoded string the `slices.params` column and are often referenced throughout the code as form-data. Currently the form-data is neither versioned nor typed as thus is somewhat free-formed. Note in the future there may be merit in using something like [JSON Schema](https://json-schema.org/) to both annotate and validate the JSON object in addition to using a Mypy `TypedDict` (introduced in Python 3.8) for typing the form-data in the backend. This sect [...]
 
-The following tables provide a non-exhausive list of the various fields which can be present in the JSON object grouped by the Explorer pane sections. These values were obtained by extracting the distinct fields from a legacy deployment consisting of tens of thousands of charts and thus some fields may be missing whilst others may be deprecated.
+The following tables provide a non-exhaustive list of the various fields which can be present in the JSON object grouped by the Explorer pane sections. These values were obtained by extracting the distinct fields from a legacy deployment consisting of tens of thousands of charts and thus some fields may be missing whilst others may be deprecated.
 
 Note not all fields are correctly categorized. The fields vary based on visualization type and may appear in different sections depending on the type. Verified deprecated columns may indicate a missing migration and/or prior migrations which were unsuccessful and thus future work may be required to clean up the form-data.
 
diff --git a/docs/src/resources/data.js b/docs/src/resources/data.js
index 3c0cf718a4..08d0781f86 100644
--- a/docs/src/resources/data.js
+++ b/docs/src/resources/data.js
@@ -19,7 +19,7 @@
 
 export const Databases = [
   {
-    title: 'Amazon Redshfit',
+    title: 'Amazon Redshift',
     href: 'https://aws.amazon.com/redshift/',
     imgName: 'aws-redshift.png',
   },
@@ -106,7 +106,7 @@ export const Databases = [
   {
     title: 'SQLite',
     href: 'https://www.sqlite.org/index.html',
-    imgName: 'sqllite.png',
+    imgName: 'sqlite.png',
   },
   {
     title: 'Trino',
diff --git a/docs/static/img/databases/sqllite.jpg b/docs/static/img/databases/sqlite.jpg
similarity index 100%
rename from docs/static/img/databases/sqllite.jpg
rename to docs/static/img/databases/sqlite.jpg
diff --git a/docs/static/img/databases/sqllite.png b/docs/static/img/databases/sqlite.png
similarity index 100%
rename from docs/static/img/databases/sqllite.png
rename to docs/static/img/databases/sqlite.png
diff --git a/superset-frontend/packages/superset-ui-core/src/number-format/README.md b/superset-frontend/packages/superset-ui-core/src/number-format/README.md
index c4663c0149..e3e5099243 100644
--- a/superset-frontend/packages/superset-ui-core/src/number-format/README.md
+++ b/superset-frontend/packages/superset-ui-core/src/number-format/README.md
@@ -68,7 +68,7 @@ There is also a formatter based on [pretty-ms](https://www.npmjs.com/package/pre
 used to format time durations:
 
 ```js
-import { createDurationFormatter, formatNumber, getNumberFormatterRegistry } from from '@superset-ui-number-format';
+import { createDurationFormatter, formatNumber, getNumberFormatterRegistry } from '@superset-ui-number-format';
 
 getNumberFormatterRegistry().registerValue('my_duration_format', createDurationFormatter({ colonNotation: true }));
 console.log(formatNumber('my_duration_format', 95500))
diff --git a/superset-frontend/src/components/FilterableTable/FilterableTable.tsx b/superset-frontend/src/components/FilterableTable/FilterableTable.tsx
index c0b49f8619..0616c925fd 100644
--- a/superset-frontend/src/components/FilterableTable/FilterableTable.tsx
+++ b/superset-frontend/src/components/FilterableTable/FilterableTable.tsx
@@ -312,7 +312,7 @@ export default class FilterableTable extends PureComponent<
     this.props.orderedColumnKeys.forEach((key, index) => {
       // we can't use Math.max(...colWidths.slice(...)) here since the number
       // of elements might be bigger than the number of allowed arguments in a
-      // Javascript function
+      // JavaScript function
       widthsByColumnKey[key] =
         colWidths
           .slice(
diff --git a/superset/charts/post_processing.py b/superset/charts/post_processing.py
index 7b21290396..715f465574 100644
--- a/superset/charts/post_processing.py
+++ b/superset/charts/post_processing.py
@@ -18,7 +18,7 @@
 Functions to reproduce the post-processing of data on text charts.
 
 Some text-based charts (pivot tables and t-test table) perform
-post-processing of the data in Javascript. When sending the data
+post-processing of the data in JavaScript. When sending the data
 to users in reports we want to show the same data they would see
 on Explore.
 
diff --git a/superset/migrations/versions/96e99fb176a0_add_import_mixing_to_saved_query.py b/superset/migrations/versions/96e99fb176a0_add_import_mixing_to_saved_query.py
index f93deb1d0c..ffe0caf64c 100644
--- a/superset/migrations/versions/96e99fb176a0_add_import_mixing_to_saved_query.py
+++ b/superset/migrations/versions/96e99fb176a0_add_import_mixing_to_saved_query.py
@@ -78,7 +78,7 @@ def upgrade():
     try:
         # Add uniqueness constraint
         with op.batch_alter_table("saved_query") as batch_op:
-            # Batch mode is required for sqllite
+            # Batch mode is required for sqlite
             batch_op.create_unique_constraint("uq_saved_query_uuid", ["uuid"])
     except OperationalError:
         pass
diff --git a/superset/migrations/versions/b56500de1855_add_uuid_column_to_import_mixin.py b/superset/migrations/versions/b56500de1855_add_uuid_column_to_import_mixin.py
index 0872cf5b3b..c392c3e78c 100644
--- a/superset/migrations/versions/b56500de1855_add_uuid_column_to_import_mixin.py
+++ b/superset/migrations/versions/b56500de1855_add_uuid_column_to_import_mixin.py
@@ -139,7 +139,7 @@ def upgrade():
 
         # add uniqueness constraint
         with op.batch_alter_table(table_name) as batch_op:
-            # batch mode is required for sqllite
+            # batch mode is required for sqlite
             batch_op.create_unique_constraint(f"uq_{table_name}_uuid", ["uuid"])
 
     # add UUID to Dashboard.position_json
diff --git a/superset/migrations/versions/c501b7c653a3_add_missing_uuid_column.py b/superset/migrations/versions/c501b7c653a3_add_missing_uuid_column.py
index 786b41a1c7..6cdf7f5616 100644
--- a/superset/migrations/versions/c501b7c653a3_add_missing_uuid_column.py
+++ b/superset/migrations/versions/c501b7c653a3_add_missing_uuid_column.py
@@ -77,7 +77,7 @@ def upgrade():
 
         # add uniqueness constraint
         with op.batch_alter_table(table_name) as batch_op:
-            # batch mode is required for sqllite
+            # batch mode is required for sqlite
             batch_op.create_unique_constraint(f"uq_{table_name}_uuid", ["uuid"])
 
     # add UUID to Dashboard.position_json; this function is idempotent
diff --git a/superset/utils/pandas_postprocessing/pivot.py b/superset/utils/pandas_postprocessing/pivot.py
index 829329e71f..89a187ce89 100644
--- a/superset/utils/pandas_postprocessing/pivot.py
+++ b/superset/utils/pandas_postprocessing/pivot.py
@@ -56,7 +56,7 @@ def pivot(  # pylint: disable=too-many-arguments,too-many-locals
     :param drop_missing_columns: Do not include columns whose entries are all missing
     :param combine_value_with_metric: Display metrics side by side within each column,
            as opposed to each column being displayed side by side for each metric.
-    :param aggregates: A mapping from aggregate column name to the the aggregate
+    :param aggregates: A mapping from aggregate column name to the aggregate
            config.
     :param marginal_distributions: Add totals for row/column. Default to False
     :param marginal_distribution_name: Name of row/column with marginal distribution.