Posted to commits@streampipes.apache.org by ri...@apache.org on 2021/05/02 20:12:25 UTC

[incubator-streampipes-website] branch dev updated (d299cb2 -> eb95a4b)

This is an automated email from the ASF dual-hosted git repository.

riemer pushed a change to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-streampipes-website.git.


    from d299cb2  [STREAMPIPES-197] Add preliminary documentation for data explorer
     new caa2ced  [STREAMPIPES-197] Add documentation for notification view, add more detailed documentation to dashboard and data explorer view
     new 60489ad  [STREAMPIPES-197] Copy k8s documentation from Github readme
     new 444e355  [STREAMPIPES-197] Copy docker-compose documentation from Github readme
     new eb95a4b  [STREAMPIPES-197] Copy CLI docs from Github

The 4 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 documentation/docs/03_use-dashboard.md             |   5 +
 documentation/docs/03_use-data-explorer.md         |   4 +
 documentation/docs/03_use-notifications.md         |  20 ++-
 documentation/docs/05_deploy-docker.md             |  68 +++++++-
 documentation/docs/05_deploy-kubernetes.md         |  55 +++++-
 documentation/docs/06_extend-archetypes.md         |   2 +-
 documentation/docs/06_extend-cli.md                | 186 +++++++++++++++++++++
 documentation/docs/06_extend-setup.md              |  97 +----------
 documentation/website/i18n/en.json                 |   4 +
 documentation/website/sidebars.json                |   1 +
 .../01_notifications-overview.png                  | Bin 0 -> 111204 bytes
 11 files changed, 349 insertions(+), 93 deletions(-)
 create mode 100644 documentation/docs/06_extend-cli.md
 create mode 100644 documentation/website/static/img/03_use-notifications/01_notifications-overview.png

[incubator-streampipes-website] 02/04: [STREAMPIPES-197] Copy k8s documentation from Github readme


commit 60489ad051d9ba9b3a70d6c49e3b996b39ec24f1
Author: Dominik Riemer <ri...@fzi.de>
AuthorDate: Sun May 2 21:56:02 2021 +0200

    [STREAMPIPES-197] Copy k8s documentation from Github readme
---
 documentation/docs/05_deploy-docker.md     |  2 +-
 documentation/docs/05_deploy-kubernetes.md | 55 +++++++++++++++++++++++++++++-
 2 files changed, 55 insertions(+), 2 deletions(-)

diff --git a/documentation/docs/05_deploy-docker.md b/documentation/docs/05_deploy-docker.md
index c06dfdb..4ba8127 100644
--- a/documentation/docs/05_deploy-docker.md
+++ b/documentation/docs/05_deploy-docker.md
@@ -4,4 +4,4 @@ title: Docker Deployment
 sidebar_label: Docker Deployment
 ---
 
-tbd
\ No newline at end of file
+tbd
diff --git a/documentation/docs/05_deploy-kubernetes.md b/documentation/docs/05_deploy-kubernetes.md
index e1ea03e..fe84308 100644
--- a/documentation/docs/05_deploy-kubernetes.md
+++ b/documentation/docs/05_deploy-kubernetes.md
@@ -4,4 +4,57 @@ title: Kubernetes Deployment
 sidebar_label: Kubernetes Deployment
 ---
 
-tbd
\ No newline at end of file
+## Prerequisites
+Requires Helm (https://helm.sh/) and an active connection to a Kubernetes cluster. When using Helm v2, a running Tiller server is also required.
+
+Tested with:
+* K3s v1.18.8+k3s1 (6b595318) with K8s v1.18.8
+* Helm v3.1.2
+
+## Usage
+We provide two helm chart options to get you going:
+
+- **default**: a light-weight option with few pipeline elements, needs less memory
+- **full**:  contains more pipeline elements, requires **>16 GB RAM** (recommended)
+
+**Starting** the **default** helm chart option is as easy as running the following command from the root of this folder:
+> **NOTE**: Starting might take a while since we also initially pull all Docker images from Dockerhub.
+
+```bash
+helm install streampipes ./
+```
+After a while, all containers should have started successfully, indicated by the `Running` status.
+```bash
+kubectl get pods
+NAME                                           READY   STATUS    RESTARTS   AGE
+activemq-66d58f47cf-2r2nb                      1/1     Running   0          3m27s
+backend-76ddc486c8-nswpc                       1/1     Running   0          3m27s
+connect-master-7b477f9b79-8dfvr                1/1     Running   0          3m26s
+connect-worker-78d89c989c-9v8zs                1/1     Running   0          3m27s
+consul-55965f966b-gwb7l                        1/1     Running   0          3m27s
+couchdb-77db98cf7b-xnnvb                       1/1     Running   0          3m27s
+influxdb-b95b6479-r8wh8                        1/1     Running   0          3m27s
+kafka-657b5fb77-dp2d6                          1/1     Running   0          3m27s
+pipeline-elements-all-jvm-79c445dbd9-m8xcs     1/1     Running   0          3m27s
+sources-watertank-simulator-6c6b8844f6-6b4d7   1/1     Running   0          3m27s
+ui-b94bd9766-rm6zb                             2/2     Running   0          3m27s
+zookeeper-5d9947686f-6nzgs                     1/1     Running   0          3m26s
+```
+
+After all containers have started successfully, open your browser and visit any of the k8s cluster nodes at
+`http://<NODE_IP>` to finish the installation.
+
+> **NOTE**: If you're running Docker for Mac or Docker for Windows with a local k8s cluster, the above step of using your host IP might not work. Luckily, you can port-forward a service port to your localhost using the following command, which makes the UI accessible via `http://localhost` or `http://<HOST_IP>` (sudo is required to bind to a privileged port).
+```bash
+kubectl port-forward svc/ui --address=0.0.0.0 80:80
+```
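+
+If you prefer not to bind to a privileged port, an alternative (a sketch assuming the same `ui` service) is to forward to an unprivileged local port and open `http://localhost:8080` instead:
+```bash
+kubectl port-forward svc/ui --address=0.0.0.0 8080:80
+```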
+
+Starting the **full** helm chart option is almost the same:
+```bash
+helm install streampipes ./ --set deployment=full
+```
+
+**Deleting** the current helm chart deployment:
+```bash
+helm del streampipes
+```

[incubator-streampipes-website] 01/04: [STREAMPIPES-197] Add documentation for notification view, add more detailed documentation to dashboard and data explorer view


commit caa2cedcee92b8745bd5a53c66e16695a21bfd04
Author: Dominik Riemer <ri...@fzi.de>
AuthorDate: Sun May 2 21:54:08 2021 +0200

    [STREAMPIPES-197] Add documentation for notification view, add more detailed documentation to dashboard and data explorer view
---
 documentation/docs/03_use-dashboard.md              |   5 +++++
 documentation/docs/03_use-data-explorer.md          |   4 ++++
 documentation/docs/03_use-notifications.md          |  20 +++++++++++++++++++-
 .../01_notifications-overview.png                   | Bin 0 -> 111204 bytes
 4 files changed, 28 insertions(+), 1 deletion(-)

diff --git a/documentation/docs/03_use-dashboard.md b/documentation/docs/03_use-dashboard.md
index 494bb42..3bc2ed0 100644
--- a/documentation/docs/03_use-dashboard.md
+++ b/documentation/docs/03_use-dashboard.md
@@ -9,6 +9,11 @@ The entry page of the live dashboard lists all created dashboards as in the scre
 
 <img class="docs-image" src="/docs/img/03_use-dashboard/01_dashboard-overview.png" alt="StreamPipes Dashboard Overview">
 
+## Visualizing Data Streams
+
+To visualize data streams in the live dashboard, a pipeline must be created that makes use of the so-called **Dashboard Sink**.
+Any data stream or data processor can serve as an input to the dashboard sink. Switch to the pipeline editor, create a pipeline and configure the dashboard sink. The visualization name is used to identify the sink in case multiple dashboard sinks are used within a single pipeline.
+
 ## Managing Dashboards
 Multiple dashboards can be created, e.g., to organize different assets in a single dashboard view.
 
diff --git a/documentation/docs/03_use-data-explorer.md b/documentation/docs/03_use-data-explorer.md
index dfed4c1..1ff9053 100644
--- a/documentation/docs/03_use-data-explorer.md
+++ b/documentation/docs/03_use-data-explorer.md
@@ -12,3 +12,7 @@ It provides a canvas where various visualizations from multiple pipelines can be
 
 The data explorer is currently available as an early beta version and features are still subject to change. A more detailed documentation will be available once the data explorer is available in a stable version.
 
+## Using the data explorer
+
+Any pipeline that uses the so-called **Data Lake** sink can be explored in the data explorer. Switch to the pipeline editor and add the data lake sink to a data processor or stream.
+The sink requires an index name as a configuration parameter, which is used as an identifier in the data explorer.
diff --git a/documentation/docs/03_use-notifications.md b/documentation/docs/03_use-notifications.md
index 6f144b3..245a0bb 100644
--- a/documentation/docs/03_use-notifications.md
+++ b/documentation/docs/03_use-notifications.md
@@ -4,4 +4,22 @@ title: Notifications
 sidebar_label: Notifications
 ---
 
-tbd
\ No newline at end of file
+The notification module can be used to create internal notifications.
+
+<img class="docs-image" src="/docs/img/03_use-notifications/01_notifications-overview.png" alt="StreamPipes Notifications">
+
+## Using notifications
+
+Any pipeline that includes the data sink **Notification** can trigger notifications that appear in the notification view. To configure a new notification, switch to the pipeline editor and append the notification sink to a data processor or data stream.
+The sink requires a title and message as configuration parameters.
+
+### Placeholders
+
+The notification message can include placeholders for fields, which are replaced with the actual field values at runtime.
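+
+For illustration only (the exact placeholder syntax may vary between StreamPipes versions, and the field names here are hypothetical), a notification message template could look like this:
+```
+Temperature alert: sensor #sensorId# reported a value of #temperature# degrees.
+```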
+
+## Managing notifications
+
+The notification view is split into two parts. The left side lists all pipelines which include a notification sink. When a pipeline is selected, its available notifications are shown in the right panel.
+By scrolling up, older notifications become visible. Notifications that have appeared in the detail view will be automatically marked as read, so that only new, unread notifications will appear in the left toolbar.
+
+
diff --git a/documentation/website/static/img/03_use-notifications/01_notifications-overview.png b/documentation/website/static/img/03_use-notifications/01_notifications-overview.png
new file mode 100644
index 0000000..688a987
Binary files /dev/null and b/documentation/website/static/img/03_use-notifications/01_notifications-overview.png differ

[incubator-streampipes-website] 03/04: [STREAMPIPES-197] Copy docker-compose documentation from Github readme


commit 444e3555c187ecfbe917064a850f3843320d9b75
Author: Dominik Riemer <ri...@fzi.de>
AuthorDate: Sun May 2 21:58:42 2021 +0200

    [STREAMPIPES-197] Copy docker-compose documentation from Github readme
---
 documentation/docs/05_deploy-docker.md | 68 +++++++++++++++++++++++++++++++++-
 1 file changed, 67 insertions(+), 1 deletion(-)

diff --git a/documentation/docs/05_deploy-docker.md b/documentation/docs/05_deploy-docker.md
index 4ba8127..16b6ce1 100644
--- a/documentation/docs/05_deploy-docker.md
+++ b/documentation/docs/05_deploy-docker.md
@@ -4,4 +4,70 @@ title: Docker Deployment
 sidebar_label: Docker Deployment
 ---
 
-tbd
+StreamPipes Compose is a simple collection of user-friendly `docker-compose` files that lets you easily gain first-hand experience with Apache StreamPipes.
+
+> **NOTE**: We recommend using StreamPipes Compose only for an initial try-out and testing. If you are a developer and want to develop new pipeline elements or core features, use the [StreamPipes CLI](../cli).
+
+#### TL;DR: A one-liner to rule them all :-)
+
+```bash
+docker-compose up -d
+```
+Go to http://localhost to finish the installation in the browser. Once finished, switch to the pipeline editor and start the interactive tour or check the [online tour](https://streampipes.apache.org/docs/docs/user-guide-tour/) to learn how to create your first pipeline!
+
+## Prerequisites
+* Docker >= 17.06.0
+* Docker-Compose >= 1.17.0 (Compose file format: 3.4)
+* Google Chrome (recommended), Mozilla Firefox, Microsoft Edge
+
+Tested on: **macOS, Linux, Windows 10** (CMD, PowerShell, GitBash)
+
+**macOS** and **Windows 10** (Pro, Enterprise, Education) users can easily get Docker and Docker-Compose on their systems by installing **Docker for Mac/Windows** (recommended).
+
+> **NOTE**: We deliberately disabled all port mappings except the http port **80**, which is used to access the StreamPipes UI, in order to minimize the surface for port conflicts.
+
+## Usage
+We provide two options to get you going:
+
+- **default**: a light-weight option with few pipeline elements, needs less memory
+- **full**:  contains more pipeline elements, requires **>16 GB RAM** (recommended)
+
+**Starting** the **default** option is as easy as running:
+> **NOTE**: Starting might take a while since `docker-compose up` also initially pulls all Docker images from Dockerhub.
+
+```bash
+docker-compose up -d
+# after all services are started, go to http://localhost
+```
+After all containers have started successfully, go to your browser and visit http://localhost to finish the installation. Once finished, switch to the pipeline editor and start the interactive tour or check the [online tour](https://streampipes.apache.org/docs/docs/user-guide-tour/) to learn how to create your first pipeline!
+
+**Stopping** the **default** option is similarly easy:
+```bash
+docker-compose down
+# if you want to remove mapped data volumes, run:
+# docker-compose down -v
+```
+
+Starting the **full** option is almost the same, just specify the `docker-compose.full.yml` file:
+```bash
+docker-compose -f docker-compose.full.yml up -d
+# after all services are started, go to http://localhost
+```
+Stopping the **full** option:
+```bash
+docker-compose -f docker-compose.full.yml down
+# docker-compose -f docker-compose.full.yml down -v
+```
+
+## Update services
+To actively pull the latest available Docker images use:
+```bash
+docker-compose pull
+# docker-compose -f docker-compose.full.yml pull
+```
+
+## Upgrade
+To upgrade to another StreamPipes version, simply edit the `SP_VERSION` in the `.env` file.
+```
+SP_VERSION=<VERSION>
+```
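+
+After editing the version tag, the running services can be updated by pulling the new images and recreating the containers (shown here for the default option):
+```bash
+docker-compose pull
+docker-compose up -d
+```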

[incubator-streampipes-website] 04/04: [STREAMPIPES-197] Copy CLI docs from Github


commit eb95a4b08035d17a269ca15777c955677a620891
Author: Dominik Riemer <ri...@fzi.de>
AuthorDate: Sun May 2 22:12:12 2021 +0200

    [STREAMPIPES-197] Copy CLI docs from Github
---
 documentation/docs/06_extend-archetypes.md |   2 +-
 documentation/docs/06_extend-cli.md        | 186 +++++++++++++++++++++++++++++
 documentation/docs/06_extend-setup.md      |  97 ++-------------
 documentation/website/i18n/en.json         |   4 +
 documentation/website/sidebars.json        |   1 +
 5 files changed, 200 insertions(+), 90 deletions(-)

diff --git a/documentation/docs/06_extend-archetypes.md b/documentation/docs/06_extend-archetypes.md
index df98143..71f7675 100644
--- a/documentation/docs/06_extend-archetypes.md
+++ b/documentation/docs/06_extend-archetypes.md
@@ -23,7 +23,7 @@ We use ``groupId``: ``org.example`` and ``artifactId``: ``ExampleProcessor``.
 You can keep the default values for the other settings, confirm them by hitting enter.
 Now, a new folder with the name ``ExampleProcessor`` is generated.
 
-The current {sp.version} is 0.66.0
+The current {sp.version} is 0.67.0
 
 ```bash
 mvn archetype:generate                              	 	     \
diff --git a/documentation/docs/06_extend-cli.md b/documentation/docs/06_extend-cli.md
new file mode 100644
index 0000000..546df8d
--- /dev/null
+++ b/documentation/docs/06_extend-cli.md
@@ -0,0 +1,186 @@
+---
+id: extend-cli
+title: StreamPipes CLI
+sidebar_label: StreamPipes CLI
+---
+
+The StreamPipes command-line interface (CLI) is aimed at developers and provides an easy entrypoint for setting up a suitable dev environment, whether you plan on developing
+
+* new extensions such as **connect adapters, processors, sinks** or,
+* new core features for **backend** and **ui**.
+
+## TL;DR
+
+```bash
+streampipes env --list
+[INFO] Available StreamPipes environment templates:
+pipeline-element
+...
+streampipes env --set pipeline-element
+streampipes up -d
+```
+> **NOTE**: use `./streampipes` if you haven't added it to the PATH and sourced it (see the section "Run `streampipes` from anywhere?").
+
+## Prerequisites
+The CLI is basically a wrapper around multiple `docker` and `docker-compose` commands plus some additional sugar.
+
+* Docker >= 17.06.0
+* Docker-Compose >= 1.26.0 (Compose file format: 3.4)
+* Google Chrome (recommended), Mozilla Firefox, Microsoft Edge
+* For Windows Developer: GitBash only
+
+
+Tested on: **macOS**, **Linux**, **Windows** (see note below)
+
+> **NOTE**: On Windows, the CLI only works in combination with GitBash - CMD and PowerShell won't work.
+
+
+## CLI commands overview
+
+```
+StreamPipes CLI - Manage your StreamPipes environment with ease
+
+Usage: streampipes COMMAND [OPTIONS]
+
+Options:
+  --help, -h      show help
+  --version, -v   show version
+
+Commands:
+  clean       Remove StreamPipes data volumes, dangling images and network
+  down        Stop and remove StreamPipes containers
+  env         Inspect and select StreamPipes environments
+  info        Get information
+  logs        Get container logs for specific container
+  ps          List all StreamPipes container for running environment
+  pull        Download latest images from Dockerhub
+  restart     Restart StreamPipes environment
+  up          Create and start StreamPipes container environment
+
+Run 'streampipes COMMAND --help' for more info on a command.
+```
+
+## Usage: Along dev life-cycle
+
+**List** available environment templates.
+```bash
+streampipes env --list
+```
+
+**Inspect** an available environment to see which services it is composed of.
+```bash
+streampipes env --inspect pipeline-element
+```
+
+**Set** environment, e.g. `pipeline-element`, if you want to write a new pipeline element.
+```bash
+streampipes env --set pipeline-element
+```
+
+**Start** environment (default: `dev` mode). Here the service definition in the selected environment is used to start the multi-container landscape.
+> **NOTE**: `dev` mode is enabled by default since we rely on open ports to core services such as `consul`, `couchdb`, `kafka` etc. so they can be reached from the IDE when developing. If you don't want to map ports (except the UI port), use the `--no-ports` flag.
+
+```bash
+streampipes up -d
+# start in production mode with unmapped ports
+# streampipes up -d --no-ports
+```
+Now you're good to go to write your new pipeline element :tada: :tada: :tada:
+
+> **HINT for extensions**: Use our [Maven archetypes](https://streampipes.apache.org/docs/docs/dev-guide-archetype/) to set up a project skeleton and use your IDE of choice for development. However, we do recommend using IntelliJ.
+
+> **HINT for core**: To work on `backend` or `ui` features you need to set the template to `backend` and clone the core repository [incubator-streampipes](https://github.com/apache/incubator-streampipes) - check the prerequisites there for more information.
+
+**Stop** environment and remove docker container
+```bash
+streampipes down
+# want to also clean docker data volumes when stopping the environment?
+# streampipes down -v
+```
+
+## Additional useful commands
+
+**Start individual services only?** We got you! You chose a template that suits your needs and now you only want to start individual services from it, e.g. only Kafka and Consul.
+
+> **NOTE**: the service names need to be present and match your current `.spenv` environment.
+
+```bash
+streampipes up -d kafka consul
+```
+
+**Get current environment** (if previously set using `streampipes env --set <environment>`).
+```bash
+streampipes env
+```
+
+**Get logs** of a specific service and use the optional `--follow` flag to stay attached to the logs.
+```bash
+streampipes logs --follow backend
+```
+
+**Update** all services of current environment
+```bash
+streampipes pull
+```
+
+**Restart** all services of current environment or specific services
+```bash
+streampipes restart
+# restart backend & consul
+# streampipes restart backend consul
+```
+
+**Clean** your system and remove created StreamPipes Docker volumes, the StreamPipes Docker network and dangling StreamPipes images from old image layers.
+```bash
+streampipes clean
+# remove volumes, network and dangling images
+# streampipes clean --volumes
+```
+
+## Modify/Create an environment template
+As of now, this step has to be done **manually**. All environments are located in `environments/`.
+
+```bash
+├── adapter               # developing a new connect adapter
+├── backend               # developing core backend features
+├── basic                 # wanna run core, UI, connect etc from the IDE?
+├── full                  # full version containing more pipeline elements
+├── lite                  # few pipeline elements, less memory  
+├── pipeline-element      # developing new pipeline-elements
+└── ui                    # developing UI features
+```
+**Modifying an existing environment template**. To modify an existing template, you can simply add a `<YOUR_NEW_SERVICE>` to the template.
+> **NOTE**: You need to make sure that the service you are adding exists in `deploy/standalone/service/<YOUR_NEW_SERVICE>`. If you're adding a completely new service, take a look at existing ones, create a new service directory and include a `docker-compose.yml` and `docker-compose.dev.yml` file.
+
+```
+[environment:backend]
+activemq
+kafka
+...
+<YOUR_NEW_SERVICE>
+```
+
+**Creating a new** environment template. To create a new environment template, place a new file `environments/<YOUR_NEW_ENVIRONMENT>` in the template directory. Open the file and use the following schema.
+> **IMPORTANT**: Please make sure that the first line of your new template contains the `[environment:<YOUR_NEW_ENVIRONMENT>]` header, matching the name of the file. Use lowercase letters only.
+
+```
+[environment:<YOUR_NEW_ENVIRONMENT>]
+<SERVICE_1>
+<SERVICE_2>
+...
+```
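+
+As a purely illustrative example, a minimal template for adapter development might list only the services it needs (each name is hypothetical and must match a service directory in `deploy/standalone/service/`):
+```
+[environment:my-adapter]
+couchdb
+kafka
+zookeeper
+```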
+
+## Run `streampipes` from anywhere? No problem
+Simply add the path to this CLI directory to your `$PATH` variable (on macOS or Linux), e.g. in your `.bashrc` or `.zshrc`, or to `%PATH%` (on Windows).
+
+For **macOS**, or **Linux**:
+
+```bash
+export PATH="/path/to/incubator-streampipes-installer/cli:$PATH"
+```
+
+For **Windows 10**, e.g. check this [documentation](https://helpdeskgeek.com/windows-10/add-windows-path-environment-variable/).
+
+
+## Upgrade to new version
+To upgrade to a new version, simply edit the version tag `SP_VERSION` in the `.env` file.
diff --git a/documentation/docs/06_extend-setup.md b/documentation/docs/06_extend-setup.md
index eabaf81..75fecfe 100644
--- a/documentation/docs/06_extend-setup.md
+++ b/documentation/docs/06_extend-setup.md
@@ -4,18 +4,19 @@ title: Development Setup
 sidebar_label: Development Setup
 ---
 
+Pipeline elements in StreamPipes are provided as standalone microservices. New pipeline elements can be easily developed using the provided Maven archetypes and can be installed in StreamPipes at runtime.
+
 In this section, we describe our recommended minimum setup for locally setting up a development instance of StreamPipes needed to develop, run and test new pipeline elements.
 
 ## IDE & required dev tools
 StreamPipes does not have specific requirements on the IDE - so feel free to choose the IDE of your choice.
 The only requirements in terms of development tools are that you have Java 8 and Maven installed.
 
-## Docker-based local StreamPipes instance
+## StreamPipes CLI: Docker-based local StreamPipes instance
 In order to quickly test developed pipeline elements without needing to install all services required by StreamPipes, we provide a CLI tool that allows you to selectively start StreamPipes components.
-The CLI tool allows to switch to several templates (based on docker-compose) depending on the role. For instance, if you are developing a pipeline element, use the template ``pe-developer``. This will start backend and ui components in a Docker container, while you can easily test your pipeline element in your IDE.
-
-For now, we refer to the Github Readme for instructions on how to use the CLI tool: [https://github.com/apache/incubator-streampipes-installer](https://github.com/apache/incubator-streampipes-installer)
+The CLI tool allows you to switch between several templates (based on docker-compose) depending on your role. 
 
+The documentation on the usage of the CLI tool is available [here](extend-cli).
 ## Starter projects
 
 Now, once you've started the development instance, you are ready to develop your very first pipeline element.
@@ -23,90 +24,8 @@ Instead of starting from scratch, we recommend using our provided maven archetyp
 
 ### Maven archetypes
 
-Create the Maven archetype as described in the [Getting Started](/docs/dev-guide-archetype) guide.
-
-### Starting from scratch
-
-In order to develop a new pipeline element from scratch, you need to create a new Maven project and import the following dependencies:
-
-<details class="info">
-<summary>pom.xml</summary>
-```
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-container-standalone</artifactId>
-    <version>0.64.0</version>
-</dependency>
-
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-sdk</artifactId>
-    <version>0.64.0</version>
-</dependency>
-
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-vocabulary</artifactId>
-    <version>0.64.0</version>
-</dependency>
-
-<!-- This dependency needs to be imported if you plan to develop a new data processor or data sink using the Apache Flink wrapper -->
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-wrapper-flink</artifactId>
-    <version>0.64.0</version>
-</dependency>
-
-<!-- This dependency needs to be imported if you plan to develop a new data processor or data sink which is running directly on the JVM -->
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-wrapper-standalone</artifactId>
-    <version>0.64.0</version>
-</dependency>
-
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-dataformat-json</artifactId>
-    <version>0.64.0</version>
-</dependency>
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-dataformat-smile</artifactId>
-     <version>0.64.0</version>
-</dependency>
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-dataformat-cbor</artifactId>
-     <version>0.64.0</version>
-</dependency>
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-dataformat-fst</artifactId>
-     <version>0.64.0</version>
-</dependency>
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-messaging-jms</artifactId>
-     <version>0.64.0</version>
-</dependency>
-<dependency>
-    <groupId>org.streampipes</groupId>
-    <artifactId>streampipes-messaging-kafka</artifactId>
-     <version>0.64.0</version>
-</dependency>
-```
-</details>
-
-The following three dependencies are mandatory:
-
-* `streampipes-container-standalone`, defines that we are going to create a new pipeline element where the description will be accessible through an embedded web server.
-* `streampipes-sdk` imports the SDK which provides many convencience functions to create new pipeline elements.
-* `streampipes-vocabulary` imports various RDF vocabularies which are used by the SDK to auto-generate the semantic description of pipeline elements.
-
-The following three dependencies might be optional depending on the pipeline element type you plan to create:
-
-*  `streampipes-wrapper-flink` should be used in case you plan to connect a new data processor or data sink that uses Apache Flink for processing events at runtime.
-*  `streampipes-wrapper-standalone` should be used in case you plan to connect a new data processor or data sink that does not use an external processing engine. Events are directly processed in a single-host fashion.
+Create the Maven archetype as described in the [Getting Started](extend-archetypes) guide.
 
+### Examples
 
-Finally, this dependency will provide abstract classes to define data sources and streams.
+We provide several examples that explain the usage of some concepts in this [Github repo](https://github.com/apache/incubator-streampipes-examples). 
diff --git a/documentation/website/i18n/en.json b/documentation/website/i18n/en.json
index fc28944..0622388 100644
--- a/documentation/website/i18n/en.json
+++ b/documentation/website/i18n/en.json
@@ -77,6 +77,10 @@
         "title": "Maven Archetypes",
         "sidebar_label": "Maven Archetypes"
       },
+      "extend-cli": {
+        "title": "StreamPipes CLI",
+        "sidebar_label": "StreamPipes CLI"
+      },
       "extend-sdk-event-model": {
         "title": "SDK Guide: Event Model",
         "sidebar_label": "SDK: Event Model"
diff --git a/documentation/website/sidebars.json b/documentation/website/sidebars.json
index 559dbc7..1eaa5d3 100644
--- a/documentation/website/sidebars.json
+++ b/documentation/website/sidebars.json
@@ -163,6 +163,7 @@
     ],
     "\uD83D\uDCBB Extend StreamPipes": [
       "extend-setup",
+      "extend-cli",
       "extend-archetypes",
       "extend-tutorial-data-sources",
       "extend-tutorial-data-processors",