Posted to commits@streampipes.apache.org by ri...@apache.org on 2020/05/19 20:50:58 UTC

[incubator-streampipes-website] 05/05: Release new docs version 0.66.0

This is an automated email from the ASF dual-hosted git repository.

riemer pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-streampipes-website.git

commit bde93ee5119f93b5d20d2832d21d7d330c7e37a7
Author: Dominik Riemer <ri...@fzi.de>
AuthorDate: Tue May 19 22:50:20 2020 +0200

    Release new docs version 0.66.0
---
 documentation/website/i18n/en.json                 |  36 +++
 .../version-0.66.0/dev-guide-archetype.md          | 144 +++++++++
 .../version-0.66.0/dev-guide-configuration.md      |  59 ++++
 .../version-0.66.0/dev-guide-environment.md        | 113 +++++++
 .../version-0.66.0/dev-guide-output-strategies.md  | 347 +++++++++++++++++++++
 .../version-0.66.0/dev-guide-static-properties.md  | 265 ++++++++++++++++
 .../dev-guide-stream-requirements.md               | 179 +++++++++++
 .../version-0.66.0/dev-guide-tutorial-sources.md   | 283 +++++++++++++++++
 .../version-0.66.0/pipeline-elements.md            |   8 +
 .../version-0.66.0/user-guide-installation.md      | 140 +++++++++
 documentation/website/versions.json                |   1 +
 11 files changed, 1575 insertions(+)

diff --git a/documentation/website/i18n/en.json b/documentation/website/i18n/en.json
index f0eb150..4e7d31f 100644
--- a/documentation/website/i18n/en.json
+++ b/documentation/website/i18n/en.json
@@ -1143,6 +1143,42 @@
       "version-0.65.0-pre-asf/version-0.65.0-pre-asf-user-guide-tour": {
         "title": "Tour",
         "sidebar_label": "Tour"
+      },
+      "version-0.66.0/version-0.66.0-dev-guide-archetype": {
+        "title": "Start Developing",
+        "sidebar_label": "Start Developing"
+      },
+      "version-0.66.0/version-0.66.0-dev-guide-configuration": {
+        "title": "Configuration",
+        "sidebar_label": "Configuration"
+      },
+      "version-0.66.0/version-0.66.0-dev-guide-development-environment": {
+        "title": "Development Environment",
+        "sidebar_label": "Development Environment"
+      },
+      "version-0.66.0/version-0.66.0-dev-guide-output-strategies": {
+        "title": "SDK Guide: Output Strategies",
+        "sidebar_label": "Output Strategies"
+      },
+      "version-0.66.0/version-0.66.0-dev-guide-static-properties": {
+        "title": "SDK Guide: Static Properties",
+        "sidebar_label": "Static Properties"
+      },
+      "version-0.66.0/version-0.66.0-dev-guide-stream-requirements": {
+        "title": "SDK Guide: Stream Requirements",
+        "sidebar_label": "Stream Requirements"
+      },
+      "version-0.66.0/version-0.66.0-dev-guide-tutorial-sources": {
+        "title": "Tutorial: Data Sources",
+        "sidebar_label": "Tutorial: Data Sources"
+      },
+      "version-0.66.0/version-0.66.0-pipeline-elements": {
+        "title": "Overview",
+        "sidebar_label": "Overview"
+      },
+      "version-0.66.0/version-0.66.0-user-guide-installation": {
+        "title": "Installation",
+        "sidebar_label": "Installation"
       }
     },
     "links": {
diff --git a/documentation/website/versioned_docs/version-0.66.0/dev-guide-archetype.md b/documentation/website/versioned_docs/version-0.66.0/dev-guide-archetype.md
new file mode 100644
index 0000000..064adb5
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/dev-guide-archetype.md
@@ -0,0 +1,144 @@
+---
+id: version-0.66.0-dev-guide-archetype
+title: Start Developing
+sidebar_label: Start Developing
+original_id: dev-guide-archetype
+---
+
+In this tutorial we explain how you can use the Maven archetypes to develop your own StreamPipes processors and sinks.
+We use IntelliJ in this tutorial, but it works with any IDE of your choice.
+
+## Prerequisites
+You need to have Maven installed, and you need a running StreamPipes installation on your development computer.
+To ease the configuration of environment variables, we use the IntelliJ [EnvFile plugin](https://plugins.jetbrains.com/plugin/7861-envfile).
+Install it in IntelliJ. Development also works without the plugin; in that case, you have to set the environment variables manually instead of using the env configuration file.
+
+## Create Project
+To create a new project, we provide multiple Maven archetypes.
+Currently, we have archetypes for the JVM and Flink wrappers, each for processors and sinks.
+The commands required to create a new pipeline element project can be found below. Make sure that you select a version compatible with your StreamPipes installation.
+Copy the command into your terminal to create a new project.
+The project will be created in the current folder.
+First, the ``groupId`` of the resulting Maven artifact must be set.
+We use ``groupId``: ``org.example`` and ``artifactId``: ``ExampleProcessor``.
+You can keep the default values for the other settings; confirm them by hitting enter.
+Now, a new folder with the name ``ExampleProcessor`` is generated.
+
+The current ``{sp.version}`` is 0.66.0.
+
+```bash
+mvn archetype:generate \
+  -DarchetypeGroupId=org.apache.streampipes \
+  -DarchetypeArtifactId=streampipes-archetype-pe-processors-jvm \
+  -DarchetypeVersion={sp.version}
+```
+<details class="info">
+    <summary>Select: [Processors / Sinks] [JVM / Flink]</summary>
+
+## Processors JVM
+```bash
+mvn archetype:generate \
+  -DarchetypeGroupId=org.apache.streampipes \
+  -DarchetypeArtifactId=streampipes-archetype-pe-processors-jvm \
+  -DarchetypeVersion={sp.version}
+```
+
+## Processors Flink
+```bash
+mvn archetype:generate \
+  -DarchetypeGroupId=org.apache.streampipes \
+  -DarchetypeArtifactId=streampipes-archetype-pe-processors-flink \
+  -DarchetypeVersion={sp.version}
+```
+
+## Sinks JVM
+```bash
+mvn archetype:generate \
+  -DarchetypeGroupId=org.apache.streampipes \
+  -DarchetypeArtifactId=streampipes-archetype-pe-sinks-jvm \
+  -DarchetypeVersion={sp.version}
+```
+
+## Sinks Flink
+```bash
+mvn archetype:generate \
+  -DarchetypeGroupId=org.apache.streampipes \
+  -DarchetypeArtifactId=streampipes-archetype-pe-sinks-flink \
+  -DarchetypeVersion={sp.version}
+```
+</details>
+
+
+## Edit Processor
+Open the project in your IDE.
+If everything worked, the structure should look similar to the following image.
+The *config* package contains all the configuration parameters of your processors / sinks.
+The *main* package defines which processors / sinks you want to activate, and the *pe.processor.example* package contains three classes with the application logic.
+For details, have a look at the other parts of the Developer Guide, where these classes are explained in more depth.
+
+<img src="/docs/img/archetype/project_structure.png" width="30%" alt="Project Structure">
+
+Open the class *Example* and edit the ``onEvent`` method so that it logs the incoming event to the console and sends it to the next component without changing it.
+
+```java
+@Override
+public void onEvent(Event event, SpOutputCollector collector) {
+    // Print the incoming event on the console
+    System.out.println(event);
+
+    // Hand the incoming event to the output collector without changing it.
+    collector.collect(event);
+}
+```
+
+## Start Processor
+Before the processor can be started, you need to edit the *env* file in the *development* folder.
+Replace all occurrences of ``localhost`` in this file with the IP address or DNS name of your computer.
+This is required so that the services running in Docker can communicate with your component running in the local IDE.
+Once the file is updated, it is used by the EnvFile plugin to provide configuration parameters to the pipeline element.
+Alternatively, the environment variables can also be set on your host or in your IDE.
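+
+For illustration, an edited env file might look like this (the keys below are hypothetical examples - check the env file generated by the archetype for the actual ones):
+
+```bash
+# hypothetical keys - replace localhost with your IP or DNS name
+SP_HOST=192.168.178.10
+SP_PORT=6666
+SP_KAFKA_HOST=192.168.178.10
+```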
+Now start the project by clicking on **(Run -> Edit Configuration)**.
+Add a new configuration in the Configuration menu by clicking on the + sign and select **Application**.
+Name the configuration *ExampleProcessor* and select the *Init* class as the 'Main class'.
+Then set *ExampleProcessor* in 'Use classpath of module'.
+
+
+As the last step, switch to the *EnvFile* tab and load the env file.
+Click on 'Enable EnvFile' to activate it and add the env file you just edited by clicking on the + sign.
+Save all the changes by clicking *Apply*.
+Now you can start the processor.
+
+<div class="my-carousel">
+    <img src="/docs/img/archetype/run_configuration.png" alt="Configuration View">
+    <img src="/docs/img/archetype/run_env_configuration.png" alt="Environment Configuration View">
+</div>
+
+To check if the service is up and running, open the browser on *'localhost:6666'*. The machine-readable description of the processor should be visible as shown below.
+
+<img src="/docs/img/archetype/endpoint.png" width="90%" alt="Project Structure">
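+
+Alternatively, you can fetch the description from the command line (assuming the default port):
+
+```bash
+curl http://localhost:6666
+```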
+
+
+<div class="admonition error">
+<div class="admonition-title">Common Problems</div>
+<p>
+If the service description is not shown on 'localhost:6666', you might have to change the port.
+This can be done in the configuration of your service, as explained in the configuration section of the developer guide.
+
+If the service does not show up in the StreamPipes installation menu, click on 'MANAGE ENDPOINTS' and add 'http://<span></span>YOUR_IP_OR_DNS_NAME:6666'.
+Use the IP or DNS name you provided in the env file.
+After adding the endpoint, a new processor with the name *Example* should show up.
+</p>
+</div>
+
+Now you can go to StreamPipes.
+Your new processor *'Example'* should show up in the installation menu.
+Install it, then switch to the pipeline view and create a simple pipeline that makes use of your newly created processor.
+If you opened the StreamPipes installation for the first time, the processor should have been installed automatically during the setup process.
+
+<img src="/docs/img/archetype/example_pipeline.png" width="80%" alt="Project Structure">
+
+Start this pipeline.
+Now you should see logging messages in your console and, once you've created a visualization, you can also see the resulting events of your component in StreamPipes.
+
+Congratulations, you have just created your first processor!
+From here on, you can start experimenting and implementing your own algorithms.
diff --git a/documentation/website/versioned_docs/version-0.66.0/dev-guide-configuration.md b/documentation/website/versioned_docs/version-0.66.0/dev-guide-configuration.md
new file mode 100644
index 0000000..18bc915
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/dev-guide-configuration.md
@@ -0,0 +1,59 @@
+---
+id: version-0.66.0-dev-guide-configuration
+title: Configuration
+sidebar_label: Configuration
+original_id: dev-guide-configuration
+---
+
+On this page we explain how the StreamPipes configuration works.
+StreamPipes allows the individual services (pipeline element containers and third-party services) to store configuration parameters in a distributed key-value store.
+This has the advantage that individual services do not need to store any configuration on the local file system, enabling us to run containers anywhere.
+As the key-value store we use [Consul](https://www.consul.io/), which is a central component that all our services depend on.
+
+<img src="/docs/img/configuration/consul.png" width="50%" alt="Semantic description of data processor">
+
+
+## Edit Configurations
+All services in StreamPipes can have configuration parameters.
+You can either change them in the Consul user interface (which by default runs on port 8500) or directly on the StreamPipes Configurations page.
+Once a new pipeline element container is started, it is registered in Consul and its parameters can be edited on the configuration page, as shown below.
+To store changes in Consul, the update button must be clicked.
+
+<div class="my-carousel">
+    <img src="/docs/img/configuration/configuration_1.png" alt="Configuration View">
+</div>
+
+## Configuration for Developers
+We provide a Configurations API for the use of configuration parameters in your services.
+Each processing element project has a “config” package [[Example]](https://github.com/apache/incubator-streampipes-extensions/tree/dev/streampipes-sinks-internal-jvm/src/main/java/org/streampipes/sinks/internal/jvm/config).
+This package usually contains two classes:
+one containing unique keys for the configuration values, and one containing the getter and setter methods to access these values.
+For the naming of configuration keys, we recommend using “SP” as a prefix.
+As we explain later, default configurations can be set as environment variables; this prefix makes them unique on your server.
+A configuration entry needs a unique config key. For this key, a value can be specified, such as the port number of the service.
+For each configuration, a description explaining the parameter can be provided; furthermore, the data type must be specified, as well as whether the value is a password or not.
+Below, the schema of a configuration item is shown on the left and an example of a port configuration on the right.
+
+<img src="/docs/img/configuration/config_key.png" width="80%" alt="Semantic description of data processor">
+
+As a developer, you can add as many new configurations to services as you wish, but there are some that are required for all processing element containers.
+Those are **the host**, **the port**, and **the name** of the service.
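+
+A minimal sketch of such a config class is shown below. It mirrors the pattern of the example project linked above and assumes the ``SpConfig`` API used there; the class name and keys are hypothetical, so consult that project for the exact interface:
+
+```java
+import org.streampipes.config.SpConfig;
+
+public enum ExampleConfig {
+  INSTANCE;
+
+  private final SpConfig config;
+
+  // SP-prefixed keys keep the corresponding environment variables unique on your server
+  private static final String HOST = "SP_EXAMPLE_HOST";
+  private static final String PORT = "SP_EXAMPLE_PORT";
+
+  ExampleConfig() {
+    config = SpConfig.getSpConfig("pe/org.example.processor");
+    // register each key with a default value and a description
+    config.register(HOST, "localhost", "The hostname of the example service");
+    config.register(PORT, 8090, "The port of the example service");
+  }
+
+  public String getHost() {
+    return config.getString(HOST);
+  }
+
+  public int getPort() {
+    return config.getInteger(PORT);
+  }
+}
+```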
+
+## Default Values
+You can provide default values for the configurations, which are used when a configuration is read for the first time.
+The first option is to register a configuration parameter in the Config class.
+This is a fallback value, which is used if nothing else is defined.
+Since this value is static, we offer a second option.
+It is possible to provide a default value by setting an environment variable.
+In this case, the convention is that the key of a configuration parameter must be used as the environment variable.
+Now, this value is used instead of the value defined in the Config class.
+During development, the configuration values often need to be changed for debugging purposes; therefore, we provide an .env file in all processing element projects and archetypes.
+This file can be used by your IDE to set the environment variables (e.g., via the [IntelliJ EnvFile plugin](https://plugins.jetbrains.com/plugin/7861-envfile)).
+When you need to change a value at runtime, you can do this in the StreamPipes configurations as explained before.
+Those changes take effect immediately, without requiring a container restart.
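+
+For example, to provide a default for the hypothetical key from the sketch above before starting the service from a shell:
+
+```bash
+# the environment variable name equals the configuration key
+export SP_EXAMPLE_PORT=8091
+```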
+
+<div class="admonition warning">
+<div class="admonition-title">Installed pipeline elements</div>
+<p>Be cautious: when a configuration value is used in the semantic description of a processing element that is already installed in StreamPipes, you have to reload this element in StreamPipes (My Elements -> Reload).
+   In addition, changes might affect already running pipelines.</p>
+</div>
diff --git a/documentation/website/versioned_docs/version-0.66.0/dev-guide-environment.md b/documentation/website/versioned_docs/version-0.66.0/dev-guide-environment.md
new file mode 100644
index 0000000..79205e1
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/dev-guide-environment.md
@@ -0,0 +1,113 @@
+---
+id: version-0.66.0-dev-guide-development-environment
+title: Development Environment
+sidebar_label: Development Environment
+original_id: dev-guide-development-environment
+---
+
+In this section, we describe our recommended minimum setup for a local development instance of StreamPipes, which is needed to develop, run, and test new pipeline elements.
+
+## IDE & required dev tools
+StreamPipes does not have specific requirements on the IDE - feel free to use the IDE of your choice.
+The only requirements in terms of development tools are Java 8 and Maven.
+
+## Docker-based local StreamPipes instance
+In order to quickly test developed pipeline elements without needing to install all services required by StreamPipes, we provide a CLI tool that allows you to selectively start StreamPipes components.
+The CLI tool allows you to switch between several templates (based on docker-compose) depending on your role. For instance, if you are developing a pipeline element, use the template ``pe-developer``. This will start the backend and UI components in Docker containers, while you can easily test your pipeline element in your IDE.
+
+For now, we refer to the GitHub README for instructions on how to use the CLI tool: [https://github.com/apache/incubator-streampipes-installer](https://github.com/apache/incubator-streampipes-installer)
+
+## Starter projects
+
+Now, once you've started the development instance, you are ready to develop your very first pipeline element.
+Instead of starting from scratch, we recommend using our provided Maven archetypes:
+
+### Maven archetypes
+
+Create the Maven archetype as described in the [Getting Started](/docs/dev-guide-archetype) guide.
+
+### Starting from scratch
+
+In order to develop a new pipeline element from scratch, you need to create a new Maven project and import the following dependencies:
+
+<details class="info">
+<summary>pom.xml</summary>
+```xml
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-container-standalone</artifactId>
+    <version>0.64.0</version>
+</dependency>
+
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-sdk</artifactId>
+    <version>0.64.0</version>
+</dependency>
+
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-vocabulary</artifactId>
+    <version>0.64.0</version>
+</dependency>
+
+<!-- This dependency needs to be imported if you plan to develop a new data processor or data sink using the Apache Flink wrapper -->
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-wrapper-flink</artifactId>
+    <version>0.64.0</version>
+</dependency>
+
+<!-- This dependency needs to be imported if you plan to develop a new data processor or data sink which is running directly on the JVM -->
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-wrapper-standalone</artifactId>
+    <version>0.64.0</version>
+</dependency>
+
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-dataformat-json</artifactId>
+    <version>0.64.0</version>
+</dependency>
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-dataformat-smile</artifactId>
+     <version>0.64.0</version>
+</dependency>
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-dataformat-cbor</artifactId>
+     <version>0.64.0</version>
+</dependency>
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-dataformat-fst</artifactId>
+     <version>0.64.0</version>
+</dependency>
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-messaging-jms</artifactId>
+     <version>0.64.0</version>
+</dependency>
+<dependency>
+    <groupId>org.streampipes</groupId>
+    <artifactId>streampipes-messaging-kafka</artifactId>
+     <version>0.64.0</version>
+</dependency>
+```
+</details>
+
+The following three dependencies are mandatory:
+
+* `streampipes-container-standalone` defines that we are going to create a new pipeline element whose description will be accessible through an embedded web server.
+* `streampipes-sdk` imports the SDK, which provides many convenience functions to create new pipeline elements.
+* `streampipes-vocabulary` imports various RDF vocabularies which are used by the SDK to auto-generate the semantic description of pipeline elements.
+
+The following two dependencies are optional, depending on the pipeline element type you plan to create:
+
+*  `streampipes-wrapper-flink` should be used in case you plan to connect a new data processor or data sink that uses Apache Flink for processing events at runtime.
+*  `streampipes-wrapper-standalone` should be used in case you plan to connect a new data processor or data sink that does not use an external processing engine. Events are directly processed in a single-host fashion.
+
+
+Finally, the ``streampipes-dataformat-*`` and ``streampipes-messaging-*`` dependencies provide the supported message formats (JSON, Smile, CBOR, FST) and transport protocols (JMS, Kafka) that pipeline elements use to exchange events at runtime.
diff --git a/documentation/website/versioned_docs/version-0.66.0/dev-guide-output-strategies.md b/documentation/website/versioned_docs/version-0.66.0/dev-guide-output-strategies.md
new file mode 100644
index 0000000..2e7922a
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/dev-guide-output-strategies.md
@@ -0,0 +1,347 @@
+---
+id: version-0.66.0-dev-guide-output-strategies
+title: SDK Guide: Output Strategies
+sidebar_label: Output Strategies
+original_id: dev-guide-output-strategies
+---
+
+## Introduction
+In StreamPipes, output strategies determine the output of a data processor.
+As the exact input schema of a processor is usually not yet known at development time (as processors can be connected with any stream that matches their requirements), output strategies are a concept to define how an input data stream is transformed to an output data stream.
+
+The following reference describes how output strategies can be defined using the SDK.
+
+<div class="admonition tip">
+<div class="admonition-title">Code on Github</div>
+<p>For all examples, the code can be found on <a href="https://www.github.com/apache/incubator-streampipes-examples/tree/dev/streampipes-pipeline-elements-examples-processors-jvm/src/main/java/org/streampipes/pe/examples/jvm/outputstrategy/">Github</a>.</p>
+</div>
+
+## Reference
+
+The methods described below to create output strategies are available in the ``ProcessingElementBuilder`` class and are usually used in the ``declareModel`` method of the controller class.
+
+In the following, we use this example event to explain how output strategies define the output of a data processor:
+
+```json
+{
+    "timestamp" : 1234556,
+    "temperature" : 37.0,
+    "deviceId" : "1"
+}
+```
+
+### Keep Output
+
+A ``KeepOutputStrategy`` declares that the output event schema will be equal to the input event schema.
+In other terms, the processor does not change the schema, but might change the values of event properties.
+
+A keep output strategy can be defined as follows:
+
+```java
+
+@Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.outputstrategy" +
+            ".keep", "Keep output example", "")
+            .requiredStream(StreamRequirementsBuilder.
+                    create()
+                    .requiredProperty(EpRequirements.anyProperty())
+                    .build())
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+
+            // declaring a keep output strategy
+            .outputStrategy(OutputStrategies.keep())
+
+            .build();
+  }
+
+```
+
+According to the example above, the expected output event schema of the example input event would be:
+
+```json
+{
+    "timestamp" : 1234556,
+    "temperature" : 37.0,
+    "deviceId" : "1"
+}
+```
+
+Data processors that perform filter operations (e.g., filtering temperature values that are above a given threshold) are a common example of keep output strategies.
+
+
+### Fixed Output
+
+A ``FixedOutputStrategy`` declares that the data processor itself provides the event schema. The output schema does not depend on the input event.
+
+Fixed output strategies need to provide the event schema they produce at development time:
+
+```java
+
+  @Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.outputstrategy" +
+            ".fixed", "Fixed output example", "")
+            .requiredStream(StreamRequirementsBuilder.
+                    create()
+                    .requiredProperty(EpRequirements.anyProperty())
+                    .build())
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+
+            // the fixed output strategy provides the schema
+            .outputStrategy(OutputStrategies.fixed(EpProperties.timestampProperty("timestamp"),
+                    EpProperties.doubleEp(Labels.from("avg", "Average value", ""), "avg", SO.Number)))
+
+            .build();
+  }
+
+```
+
+In this example, we declare that the output schema always consists of two fields (``timestamp`` and ``avg``).
+
+Therefore, an output event should look like:
+
+```json
+{
+    "timestamp" : 1234556,
+    "avg" : 36.0
+}
+```
+
+
+### Append Output
+
+An ``AppendOutputStrategy`` appends additional fields to a schema of an incoming event stream. For instance, data processors that perform enrichment operations usually make use of append output strategies.
+
+Similar to the fixed output strategy, the additional fields must be provided at development time in the controller method as follows:
+
+```java
+  @Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.outputstrategy" +
+            ".append", "Append output example", "")
+
+            // boilerplate code not relevant here, see above
+
+            // declaring an append output
+            .outputStrategy(OutputStrategies.append(EpProperties.integerEp(Labels.from("avg",
+                    "The average value", ""), "avg", SO.Number)))
+
+            .build();
+  }
+```
+
+In this case, the output event would have an additional field ``avg``:
+
+```json
+{
+    "timestamp" : 1234556,
+    "temperature" : 37.0,
+    "deviceId" : "1",
+    "avg" : 123.0
+}
+```
+
+### Custom Output
+
+In some cases, pipeline developers using the StreamPipes UI should be able to manually select fields from an input event schema. For such use cases, a ``CustomOutputStrategy`` can be used:
+
+```java
+
+@Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.outputstrategy" +
+            ".custom", "Custom output example", "")
+
+            // boilerplate code not relevant here, see above
+
+            // declaring a custom output
+            .outputStrategy(OutputStrategies.custom())
+
+            .build();
+  }
+
+```
+
+If a data processor defines a custom output strategy, the customization dialog in the pipeline editor will show a dialog to let users select the fields to keep:
+
+<img src="/docs/img/dev-guide-output-strategies/os-custom.png" width="80%" alt="Number Parameter">
+
+Taking our example, and assuming that the user selects both the ``timestamp`` and the ``temperature`` field, the expected output event should look like this:
+
+```json
+{
+    "timestamp" : 1234556,
+    "temperature" : 37.0
+}
+```
+
+How do we know which fields were selected once the data processor is invoked? Use the proper method from the extractor in the ``onInvocation`` method:
+
+```java
+@Override
+  public ConfiguredEventProcessor<DummyParameters> onInvocation(DataProcessorInvocation graph, ProcessingElementParameterExtractor extractor) {
+
+    List<String> outputSelectors = extractor.outputKeySelectors();
+
+    return new ConfiguredEventProcessor<>(new DummyParameters(graph), DummyEngine::new);
+  }
+```
+
+### Transform Output
+
+A ``TransformOutputStrategy`` declares that one or more fields of an incoming event stream are transformed. Transformations can be applied to the datatype of the property, the runtime name of the property, or any other schema-related declaration such as measurement units.
+
+#### Static Transform Operations
+
+Static transform operations do not depend on any user input (at pipeline development time) in order to know how to transform a field of an incoming event schema.
+
+Let's say our data processor transforms string fields (that actually contain numbers) to a number datatype. In this case, we can use a static transform output strategy:
+
+```java
+
+  @Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.outputstrategy" +
+            ".transform", "Transform output example", "")
+            .requiredStream(StreamRequirementsBuilder.
+                    create()
+                    .requiredPropertyWithUnaryMapping(EpRequirements.stringReq(), Labels.from
+                            ("str", "The date property as a string", ""), PropertyScope.NONE)
+                    .build())
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+
+            // static transform operation
+            .outputStrategy(OutputStrategies.transform(TransformOperations
+                    .staticDatatypeTransformation("str", Datatypes.Long)))
+
+            .build();
+  }
+
+```
+
+Note the mapping property that we use to determine which field of the input event should be transformed.
+
+The expected output event would look like this:
+
+```json
+{
+    "timestamp" : 1234556,
+    "temperature" : 37.0,
+    "deviceId" : 1
+}
+```
+
+#### Dynamic Transform Operations
+
+Sometimes, the exact transform operation depends on user input provided at pipeline development time. Let's take a field renaming processor as an example, which lets the user rename a field from an input event schema to another field name.
+For such use cases, we can use a ``DynamicTransformOperation``:
+
+```java
+
+  @Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.outputstrategy" +
+            ".transform", "Transform output example", "")
+            .requiredStream(StreamRequirementsBuilder.
+                    create()
+                    .requiredPropertyWithUnaryMapping(EpRequirements.stringReq(), Labels.from
+                            ("str", "The date property as a string", ""), PropertyScope.NONE)
+                    .build())
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+
+            // the text input to enter the new runtime name
+            .requiredTextParameter(Labels.from("new-runtime-name", "New Runtime Name", ""))
+
+            // dynamic transform operation
+            .outputStrategy(OutputStrategies.transform(TransformOperations
+                    .dynamicRuntimeNameTransformation("str", "new-runtime-name")))
+
+            .build();
+  }
+
+```
+
+For dynamic transform operations, an additional identifier that links to another static property can be assigned and later be fetched in the ``onInvocation`` method.
+
+Assuming we want to rename the field ``temperature`` to ``temp``, the resulting output event should look like this:
+
+```json
+{
+    "timestamp" : 1234556,
+    "temp" : 37.0,
+    "deviceId" : 1
+}
+```
+
+### Custom Transform Output
+
+Finally, in some cases the output schema cannot be described at pipeline development time. For these (usually rare) cases, a ``CustomTransformOutput`` strategy can be used.
+
+In this case, a callback function will be invoked in the controller class just after a user has filled in any static properties and clicks on ``Save`` in the pipeline editor.
+
+To define a custom transform output, we need to implement an interface in the controller class:
+
+```java
+public class CustomTransformOutputController extends
+        StandaloneEventProcessingDeclarer<DummyParameters> implements
+        ResolvesContainerProvidedOutputStrategy<DataProcessorInvocation, ProcessingElementParameterExtractor> {
+
+
+@Override
+  public EventSchema resolveOutputStrategy(DataProcessorInvocation processingElement, ProcessingElementParameterExtractor parameterExtractor) throws SpRuntimeException {
+    // compute and return the output schema here (see the example below)
+  }
+}
+```
+
+In addition, the output strategy must be declared in the ``declareModel`` method:
+
+```java
+
+@Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.outputstrategy" +
+            ".customtransform", "Custom transform output example", "")
+            .requiredStream(StreamRequirementsBuilder.
+                    create()
+                    .requiredPropertyWithUnaryMapping(EpRequirements.stringReq(), Labels.from
+                            ("str", "The date property as a string", ""), PropertyScope.NONE)
+                    .build())
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+
+            // declare a custom transform output
+            .outputStrategy(OutputStrategies.customTransformation())
+
+            .build();
+  }
+
+```
+
+Once a new pipeline using this data processor is created and the configuration is saved, the ``resolveOutputStrategy`` method will be called, so that an event schema can be provided based on the given configuration. An extractor instance (see the guide on static properties) is available to extract the selected static properties and the connected event stream.
+
+```java
+@Override
+  public EventSchema resolveOutputStrategy(DataProcessorInvocation processingElement, ProcessingElementParameterExtractor parameterExtractor) throws SpRuntimeException {
+    return new EventSchema(Arrays
+            .asList(EpProperties
+                    .stringEp(Labels.from("runtime", "I was added at runtime", ""), "runtime", SO.Text)));
+  }
+```
+
+In this example, the output event schema should look like this:
+
+```json
+{
+    "runtime" : "Hello world!"
+}
+```
+
diff --git a/documentation/website/versioned_docs/version-0.66.0/dev-guide-static-properties.md b/documentation/website/versioned_docs/version-0.66.0/dev-guide-static-properties.md
new file mode 100644
index 0000000..5cba2bf
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/dev-guide-static-properties.md
@@ -0,0 +1,265 @@
+---
+id: version-0.66.0-dev-guide-static-properties
+title: SDK Guide: Static Properties
+sidebar_label: Static Properties
+original_id: dev-guide-static-properties
+---
+
+## Introduction
+Static properties represent user-facing parameters that are provided by pipeline developers.
+Processing elements can specify required static properties, which will render different UI views in the pipeline editor.
+
+The following reference describes how static properties can be defined using the SDK.
+
+<div class="admonition tip">
+<div class="admonition-title">Code on Github</div>
+<p>For all examples, the code can be found on <a href="https://github.com/apache/incubator-streampipes-examples/tree/dev/streampipes-pipeline-elements-examples-processors-jvm/src/main/java/org/streampipes/pe/examples/jvm/staticproperty">Github</a>.</p>
+</div>
+
+## Reference
+
+The methods described below to create static properties are available in the ``ProcessingElementBuilder`` and ``DataSinkBuilder`` classes and are usually used in the ``declareModel`` method of the controller class.
+
+### Mapping property
+
+In StreamPipes, processing elements usually operate on fields of an event stream. For instance, a filter processor operates on a specific field from an input stream (e.g., a field measuring the temperature).
+Typically, pipeline developers should select the exact field where the operations is applied upon by themselves.
+As this field is not yet known at pipeline element development time (as it is defined by the pipeline developer in the pipeline editor), mapping properties serve to map a stream requirement to a specific field from the actual input event stream.
+
+### Unary mapping property
+
+A unary mapping property maps a stream requirement to an actual field of an event stream. Therefore, the ``StreamRequirementsBuilder`` provides the opportunity to directly add a mapping property along with a property requirement:
+
+```java
+.requiredStream(StreamRequirementsBuilder.
+    create()
+    .requiredPropertyWithUnaryMapping(EpRequirements.numberReq(),
+            Labels.from("mp-key", "My Mapping", ""),
+            PropertyScope.NONE)
+    .build())
+```
+
+This leads to a selection dialog in the pipeline element customization view, which lets the user choose from all event properties (fields) of the input stream that match the specified property requirement:
+
+<img src="/docs/img/dev-guide-static-properties/sp-mapping-unary.png" width="80%" alt="Text">
+
+At invocation time, the value can be extracted in the ``onInvocation`` method as follows:
+
+```java
+// Extract the mapping property value
+String mappingPropertySelector = extractor.mappingPropertyValue("mp-key");
+```
+
+Note that this method returns a ``PropertySelector``, which can be used by the event model to extract the actual value of this field.
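+
+For example, inside the ``onEvent`` method, the selector can be used to read the field's value from the event (a sketch; the exact getter depends on the field's datatype):
+
+```java
+// resolve the selected field and read its value as a float
+Float value = event.getFieldBySelector(mappingPropertySelector)
+        .getAsPrimitive()
+        .getAsFloat();
+```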
+
+### N-ary mapping property
+
+N-ary mapping properties work similarly to unary mapping properties, but allow the mapping of one requirement to multiple event properties matching the requirement:
+
+```java
+.requiredStream(StreamRequirementsBuilder.
+    create()
+    .requiredPropertyWithNaryMapping(EpRequirements.numberReq(),
+            Labels.from("mp-key", "My Mapping", ""),
+            PropertyScope.NONE)
+    .build())
+```
+
+This renders the following selection, where users can select more than one matching event property:
+
+<img src="/docs/img/dev-guide-static-properties/sp-mapping-nary.png" width="80%" alt="Text">
+
+The following snippet returns a list containing the property selectors of all event properties that have been selected:
+
+```java
+// Extract the mapping property value
+List<String> mappingPropertySelectors = extractor.mappingPropertyValues("mp-key");
+```
+
+### Free-Text Parameters
+
+A free-text parameter requires the pipeline developer to enter a single value - which can be a string or another primitive data type.
+The input of free-text parameters can be restricted to specific value ranges or can be linked to the value set of a connected input data stream.
+
+#### Text Parameters
+
+A text parameter lets the user enter a string value. The following code line in the controller class
+
+```java
+.requiredTextParameter(Labels.from(SP_KEY, "Example Name", "Example Description"))
+```
+
+leads to the following input dialog in the pipeline editor:
+
+<img src="/docs/img/dev-guide-static-properties/sp-text-parameter.png" width="80%" alt="Text">
+
+Users can enter any value, which will be converted to a string datatype. To receive the entered value in the ``onInvocation`` method, use the following method from the ``ParameterExtractor``:
+
+```java
+String textParameter = extractor.singleValueParameter(SP_KEY, String.class);
+```
+
+#### Number parameters
+
+A number parameter lets the user enter a number value, either a floating-point number or an integer:
+
+```java
+// create an integer parameter
+.requiredIntegerParameter(Labels.from(SP_KEY, "Integer Parameter", "Example Description"))
+
+// create a float parameter
+.requiredFloatParameter(Labels.from("float-key", "Float Parameter", "Example Description"))
+
+```
+
+lead to the following input dialogs in the pipeline editor, which only accept number values:
+
+<img src="/docs/img/dev-guide-static-properties/sp-number-parameter.png" width="80%" alt="Number Parameter">
+
+The pipeline editor performs type validation and ensures that only numbers can be entered by the user. To receive the entered values in the ``onInvocation`` method, use the following methods from the ``ParameterExtractor``:
+
+```java
+// Extract the integer parameter value
+Integer integerParameter = extractor.singleValueParameter(SP_KEY, Integer.class);
+
+// Extract the float parameter value
+Float floatParameter = extractor.singleValueParameter("float-key", Float.class);
+
+```
+
+#### Numbers with value specification
+
+You can also specify the value range of a number-based free text parameter:
+
+```java
+// create an integer parameter with value range
+.requiredIntegerParameter(Labels.from(SP_KEY, "Integer Parameter", "Example Description"), 0, 100, 1)
+
+```
+
+which renders the following input field:
+
+<img src="/docs/img/dev-guide-static-properties/sp-number-parameter-with-range.png" width="80%" alt="Number Parameter">
+
+Receive the entered value in the same way as a standard number parameter.
+
+#### Free-text parameters linked to an event property
+
+
+### Single-Value Selections
+
+Single-value selections let the user select from a pre-defined list of options.
+A single-value selection requires the user to select exactly one option.
+
+```java
+.requiredSingleValueSelection(Labels.from("id", "Example Name", "Example Description"),
+    Options.from("Option A", "Option B", "Option C"))
+
+```
+
+Single-value selections will be rendered as a set of radio buttons in the pipeline editor:
+
+<img src="/docs/img/dev-guide-static-properties/sp-single-selection.png" width="80%" alt="Number Parameter">
+
+To extract the selected value, use the following method from the parameter extractor:
+
+```java
+// Extract the selected value
+String selectedSingleValue = extractor.selectedSingleValue("id", String.class);
+```
+
+<div class="admonition tip">
+<div class="admonition-title">Declaring options</div>
+<p>Sometimes, you may want to use an internal name that differs from the display name of an option.
+For that, you can use the method Options.from(Tuple2<String, String>) and the extractor method selectedSingleValueInternalName.</p>
+</div>
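+
+A sketch of this variant is shown below. The argument order of ``Tuple2`` (display name first, internal name second) is an assumption - check the SDK's Javadoc for the exact signature:
+
+```java
+.requiredSingleValueSelection(Labels.from("id", "Example Name", "Example Description"),
+    Options.from(new Tuple2<>("Option A", "internal-a"),
+                 new Tuple2<>("Option B", "internal-b")))
+```
+
+```java
+// extract the internal name of the selected option
+String internalName = extractor.selectedSingleValueInternalName("id", String.class);
+```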
+
+
+
+### Multi-Value Selections
+
+Multi-value selections let the user select from a pre-defined list of options, where multiple options or none may be selected.
+
+```java
+.requiredMultiValueSelection(Labels.from("id", "Example Name", "Example Description"),
+    Options.from("Option A", "Option B", "Option C"))
+
+```
+
+Multi-value selections will be rendered as a set of checkboxes in the pipeline editor:
+
+<img src="/docs/img/dev-guide-static-properties/sp-multi-selection.png" width="80%" alt="Number Parameter">
+
+To extract the selected value, use the following method from the parameter extractor:
+
+```java
+// Extract the selected value
+List<String> selectedMultiValue = extractor.selectedMultiValues("id", String.class);
+```
+
+### Domain Concepts
+
+(coming soon...)
+
+### Collections
+
+You can also define collections based on other static properties.
+
+```java
+// create a collection parameter
+.requiredParameterAsCollection(Labels.from("collection", "Example Name", "Example " +
+        "Description"), StaticProperties.stringFreeTextProperty(Labels
+        .from("text-property","Text","")))
+```
+
+While the items of the collection are defined in the same way as the underlying static property, the UI provides buttons to add items to and remove items from the collection.
+
+<img src="/docs/img/dev-guide-static-properties/sp-collection.png" width="80%" alt="Number Parameter">
+
+To extract the selected values from the collection, use the following method from the parameter extractor:
+
+```java
+// Extract the text parameter value
+List<String> textParameters = extractor.singleValueParameterFromCollection("collection", String.class);
+```
+
+### Runtime-resolvable selections
+
+In some cases, the options of selection parameters are not static, but depend on other values or might change at runtime. In this case, you can use runtime-resolvable selections.
+
+First, let your controller class implement ``ResolvesContainerProvidedOptions``:
+
+```java
+public class RuntimeResolvableSingleValue extends
+     StandaloneEventProcessingDeclarer<DummyParameters> implements ResolvesContainerProvidedOptions { ... }
+```
+
+Next, define the parameter in the ``declareModel`` method:
+
+```java
+// create a single value selection parameter that is resolved at runtime
+    .requiredSingleValueSelectionFromContainer(Labels.from("id", "Example Name", "Example " +
+            "Description"))
+```
+
+Finally, implement the method ``resolveOptions``, which will be called at runtime once the processor is used:
+
+```java
+  @Override
+  public List<RuntimeOptions> resolveOptions(String requestId, EventProperty linkedEventProperty) {
+    return Arrays.asList(new RuntimeOptions("I was defined at runtime", ""));
+  }
+```
+
+The UI will render a single-value parameter based on the options provided at runtime:
+
+<img src="/docs/img/dev-guide-static-properties/sp-single-selection-remote.png" width="80%" alt="Number Parameter">
+
+The parameter extraction does not differ from the extraction of static single-value parameters.
+
+<div class="admonition info">
+<div class="admonition-title">Multi-value selections</div>
+<p>Although this example shows the usage of runtime-resolvable selections using single value selections, the same also works for multi-value selections!</p>
+</div>
+
+
diff --git a/documentation/website/versioned_docs/version-0.66.0/dev-guide-stream-requirements.md b/documentation/website/versioned_docs/version-0.66.0/dev-guide-stream-requirements.md
new file mode 100644
index 0000000..c588b11
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/dev-guide-stream-requirements.md
@@ -0,0 +1,179 @@
+---
+id: version-0.66.0-dev-guide-stream-requirements
+title: SDK Guide: Stream Requirements
+sidebar_label: Stream Requirements
+original_id: dev-guide-stream-requirements
+---
+
+## Introduction
+
+Data processors and data sinks can define ``StreamRequirements``. Stream requirements allow pipeline elements to express requirements on an incoming event stream that are needed for the element to work properly.
+Once users create pipelines in the StreamPipes Pipeline Editor, these requirements are verified against the connected event stream.
+By using this feature, StreamPipes ensures that pipeline elements can only be connected if the connection is syntactically and semantically valid.
+
+This guide covers the creation of stream requirements. Before reading this section, we recommend that you make yourself familiar with the SDK guide on [data processors](dev-guide-processor-sdk.md) and [data sinks](dev-guide-sink-sdk.md).
+
+<div class="admonition tip">
+<div class="admonition-title">Code on Github</div>
+<p>For all examples, the code can be found on <a href="https://www.github.com/apache/incubator-streampipes-examples/tree/dev/streampipes-pipeline-elements-examples-processors-jvm/src/main/java/org/streampipes/pe/examples/jvm/requirements/">Github</a>.</p>
+</div>
+
+## The StreamRequirementsBuilder
+
+Stream requirements can be defined in the ``Controller`` class of the pipeline element. Start with a method body like this:
+
+```java
+
+@Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create(ID, PIPELINE_ELEMENT_NAME, DESCRIPTION)
+            .requiredStream(StreamRequirementsBuilder.
+                    create()
+
+                    .build())
+
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+            .outputStrategy(OutputStrategies.keep())
+
+            .build();
+  }
+```
+
+The ``StreamRequirementsBuilder`` class provides methods to add stream requirements to a pipeline element.
+
+## Requirements on primitive fields
+
+As a very first example, let's assume we would like to create a data processor that filters numerical values that are above a given threshold.
+Consequently, any data stream that is connected to the filter processor needs to provide a numerical value.
+
+The stream requirement would be assigned as follows:
+
+```java
+@Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create(ID, PIPELINE_ELEMENT_NAME, DESCRIPTION)
+            .requiredStream(StreamRequirementsBuilder
+                    .create()
+                    .requiredProperty(EpRequirements.numberReq())
+                    .build())
+
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+            .outputStrategy(OutputStrategies.keep())
+
+            .build();
+  }
+```
+
+Note the line starting with ``requiredProperty``, which requires any connected stream to provide a field of type ``number``.
+
+In many cases, you'll want to let the user select a specific field from all available fields of a data stream that match the specified requirement. For that, simply use the method ``requiredPropertyWithUnaryMapping`` as follows:
+
+```java
+@Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create(ID, PIPELINE_ELEMENT_NAME, DESCRIPTION)
+            .requiredStream(StreamRequirementsBuilder
+                    .create()
+                    .requiredPropertyWithUnaryMapping(EpRequirements.numberReq(),
+                    Labels.from("number-mapping", "The value that should be filtered", ""), PropertyScope.NONE)
+                    .build())
+
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+            .outputStrategy(OutputStrategies.keep())
+
+            .build();
+  }
+```
+
+See also the developer guide on [static properties](dev-guide-static-properties.md) to better understand the usage of ``MappingProperties``.
+
+Requirements on primitive fields can be specified for all common datatypes:
+
+```java
+ @Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.requirements" +
+            ".simple", "Simple requirements specification examples", "")
+            .requiredStream(StreamRequirementsBuilder.
+                    create()
+                    .requiredProperty(EpRequirements.numberReq()) // any number
+                    .requiredProperty(EpRequirements.doubleReq()) // any field of type double
+                    .requiredProperty(EpRequirements.booleanReq()) // any field of type boolean
+                    .requiredProperty(EpRequirements.integerReq()) // any field of type integer
+                    .requiredProperty(EpRequirements.stringReq()) // any field of type string
+
+                    .requiredProperty(EpRequirements.anyProperty()) // any field allowed (no restriction)
+                    .requiredProperty(EpRequirements.timestampReq())  // any timestamp field
+                    .build())
+
+
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+            .outputStrategy(OutputStrategies.keep())
+
+            .build();
+  }
+```
+
+### Specifying semantics
+
+For some algorithms, only specifying the datatype is not sufficient. Let's consider a geofencing algorithm that detects the presence of a geospatial coordinate (e.g., from a vehicle) within a given location.
+
+You could specify something like this:
+
+```java
+    StreamRequirementsBuilder
+    .create()
+    .requiredPropertyWithUnaryMapping(EpRequirements.doubleReq(), Labels.from("mapping-latitude", "Latitude", ""), PropertyScope.NONE)
+    .requiredPropertyWithUnaryMapping(EpRequirements.doubleReq(), Labels.from("mapping-longitude", "Longitude", ""), PropertyScope.NONE)
+    .build()
+```
+
+However, this would allow users to create strange pipelines, as any stream containing a double value could be connected to our geofencing algorithm.
+To avoid such situations, you can also specify requirements based on the semantics of a field:
+
+```java
+    StreamRequirementsBuilder
+    .create()
+    .requiredPropertyWithUnaryMapping(EpRequirements.domainPropertyReq(SO.Latitude), Labels.from("mapping-latitude", "Latitude", ""), PropertyScope.NONE)
+    .requiredPropertyWithUnaryMapping(EpRequirements.domainPropertyReq(SO.Longitude), Labels.from("mapping-longitude", "Longitude", ""), PropertyScope.NONE)
+    .build()
+```
+
+Note that in this case, we make use of Schema.org's ``Latitude`` concept ([https://schema.org/latitude](https://schema.org/latitude)). StreamPipes already includes popular vocabularies for specifying semantics. You are also free to use your own vocabularies.
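+
+If you maintain your own vocabulary, the same requirement can be expressed with a custom URI (using the hypothetical ``http://my.company/plateNumber`` property as an example):
+
+```java
+.requiredPropertyWithUnaryMapping(
+    EpRequirements.domainPropertyReq("http://my.company/plateNumber"),
+    Labels.from("mapping-plate", "Plate Number", ""),
+    PropertyScope.NONE)
+```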
+
+
+## Requirements on lists
+
+Similarly to primitive requirements, you can define processors that require data streams with list fields; see the following example:
+
+```java
+@Override
+  public DataProcessorDescription declareModel() {
+    return ProcessingElementBuilder.create("org.streampipes.examples.requirements" +
+            ".list", "List requirements specification examples", "")
+            .requiredStream(StreamRequirementsBuilder.
+                    create()
+                    .requiredProperty(EpRequirements.listRequirement(Datatypes.Integer))
+                    .requiredProperty(EpRequirements.listRequirement(Datatypes.Double))
+                    .requiredProperty(EpRequirements.listRequirement(Datatypes.Boolean))
+                    .requiredProperty(EpRequirements.listRequirement(Datatypes.String))
+                    .build())
+
+
+            .supportedProtocols(SupportedProtocols.kafka())
+            .supportedFormats(SupportedFormats.jsonFormat())
+            .outputStrategy(OutputStrategies.keep())
+
+            .build();
+  }
+```
+
+## Requirements on nested properties
+
+(coming soon, see the Javadoc for now)
+
+
+
diff --git a/documentation/website/versioned_docs/version-0.66.0/dev-guide-tutorial-sources.md b/documentation/website/versioned_docs/version-0.66.0/dev-guide-tutorial-sources.md
new file mode 100644
index 0000000..cb91af1
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/dev-guide-tutorial-sources.md
@@ -0,0 +1,283 @@
+---
+id: version-0.66.0-dev-guide-tutorial-sources
+title: Tutorial: Data Sources
+sidebar_label: Tutorial: Data Sources
+original_id: dev-guide-tutorial-sources
+---
+
+In this tutorial, we will add a new data source consisting of a single data stream. The source will be provided as a standalone component (i.e., the description will be accessible through an integrated web server).
+
+## Objective
+
+We are going to create a new data stream that is produced by a GPS sensor installed in a delivery vehicle.
+The sensor produces a continuous stream of events that contain the current timestamp, the current lat/lng position of the vehicle and the plate number of the vehicle.
+Events are published in a JSON format as follows:
+```json
+{
+  "timestamp" : 145838399,
+  "latitude" : 37.04,
+  "longitude" : 17.04,
+  "plateNumber" : "KA-AB 123"
+}
+```
+
+These events are published to a Kafka broker using the topic `org.streampipes.tutorial.vehicle`.
+
+In the following section, we show how to describe this stream in a form that allows you to import and use it in StreamPipes.
+
+## Project setup
+
+Instead of creating a new project from scratch, we recommend using the Maven archetype to create a new project skeleton.
+Enter the following command in a command line of your choice (Apache Maven needs to be installed):
+
+```bash
+mvn archetype:generate \
+-DarchetypeGroupId=org.streampipes -DarchetypeArtifactId=streampipes-archetype-pe-sources \
+-DarchetypeVersion=0.65.0 -DgroupId=my.groupId \
+-DartifactId=my-source -DclassNamePrefix=MySource -DpackageName=mypackagename
+```
+
+Configure the variables ``artifactId`` (which will be the Maven artifactId), ``classNamePrefix`` (which will be the class name of your data stream) and ``packageName``.
+
+For this tutorial, use ``Vehicle`` as ``classNamePrefix``.
+
+Your project will look as follows:
+
+<img src="/docs/img/tutorial-sources/project-structure.PNG" alt="Project Structure">
+
+That's it! Continue with the next section to learn how to create your first data stream.
+
+<div class="admonition tip">
+<div class="admonition-title">Tip</div>
+<p>Besides the basic project skeleton, the sample project also includes an example Dockerfile you can use to package your application into a Docker container.
+</p>
+</div>
+
+## Adding a data stream description
+
+Now we will add a new data stream definition.
+First, open the class `VehicleStream` which should look as follows:
+
+```java
+
+package my.groupId.pe.mypackagename;
+
+import org.streampipes.model.SpDataStream;
+import org.streampipes.model.graph.DataSourceDescription;
+import org.streampipes.sdk.builder.DataStreamBuilder;
+import org.streampipes.sdk.helpers.EpProperties;
+import org.streampipes.sdk.helpers.Formats;
+import org.streampipes.sdk.helpers.Protocols;
+import org.streampipes.sources.AbstractAdapterIncludedStream;
+
+
+public class MySourceStream extends AbstractAdapterIncludedStream {
+
+  @Override
+  public SpDataStream declareModel(DataSourceDescription sep) {
+    return DataStreamBuilder.create("my.groupId-mypackagename", "MySource", "")
+            .property(EpProperties.timestampProperty("timestamp"))
+
+            // configure your stream here
+
+            .format(Formats.jsonFormat())
+            .protocol(Protocols.kafka("localhost", 9092, "TOPIC_SHOULD_BE_CHANGED"))
+            .build();
+  }
+
+  @Override
+  public void executeStream() {
+
+  }
+}
+```
+
+This class extends the class ``AbstractAdapterIncludedStream``, which indicates that this source continuously produces data (configured in the ``executeStream()`` method).
+In contrast, the class `AbstractAlreadyExistingStream` indicates that we only want to describe an already existing stream (e.g., a stream that already sends data to an existing Kafka broker).
+
+Next, we will add the definition of the data stream. Add the following code inside of the `declareModel` method:
+```java
+return DataStreamBuilder.create("org.streampipes.tutorial.vehicle.position", "Vehicle Position", "An event stream " +
+          "that produces current vehicle positions")
+```
+
+This line creates a new instance of the SDK's `DataStreamBuilder` by providing three basic parameters:
+The first parameter must be a unique identifier of your data stream.
+The second and third parameters indicate a label and a description of your stream.
+These values will later be used in the StreamPipes UI to display stream details in a human-readable manner.
+
+Next, we will add the properties as stated above to the stream definition by adding the following lines:
+```java
+.property(EpProperties.timestampProperty("timestamp"))
+.property(EpProperties.stringEp(Labels.from("plate-number", "Plate Number", "Denotes the plate number of the vehicle"), "plateNumber", "http://my.company/plateNumber"))
+.property(EpProperties.doubleEp(Labels.from("latitude", "Latitude", "Denotes the latitude value of the vehicle's position"), "latitude", Geo.lat))
+.property(EpProperties.doubleEp(Labels.from("longitude", "Longitude", "Denotes the longitude value of the vehicle's position"), "longitude", Geo.lng))
+```
+These four _event properties_ compose our _event schema_. An event property must, at least, provide the following attributes:
+
+* **Runtime Name**. The runtime name indicates the key of the property at runtime, e.g., if our JSON message contains a structure such as `{"plateNumber" : "KA-F 123"}`, the runtime name must be `plateNumber`.
+* **Runtime Type**. An event property must have a primitive type (we will later see how to model more complex properties such as lists and nested properties).
+The type must be one of the `XMLSchema` primitive types; the SDK provides convenience methods to set the property type.
+* **Domain Property**. The domain property indicates the semantics of the event property. For instance, the `latitude` property is linked to the `http://www.w3.org/2003/01/geo/wgs84_pos#lat` property of the WGS84 vocabulary.
+The domain property should be a URI that is part of an existing or domain-specific vocabulary. The SDK provides convenience methods for popular vocabularies (e.g., Schema.org, Dolce or WGS84).
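+
+For example, the ``latitude`` property defined above combines all three attributes in a single convenience call:
+
+```java
+// Runtime name "latitude" matches the key in the JSON events,
+// doubleEp sets the runtime type to the XMLSchema double type,
+// and Geo.lat links the property to the WGS84 latitude concept.
+.property(EpProperties.doubleEp(
+        Labels.from("latitude", "Latitude", "Denotes the latitude value of the vehicle's position"),
+        "latitude",
+        Geo.lat))
+```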
+
+In order to complete the minimum required specification of an event stream, we need to provide information on the transport format and protocol of the data stream at runtime.
+
+This can be achieved by extending the builder with the respective properties (which should already have been auto-generated):
+```java
+.format(Formats.jsonFormat())
+.protocol(Protocols.kafka("localhost", 9092, "TOPIC_SHOULD_BE_CHANGED"))
+.build();
+```
+
+Set ``org.streampipes.tutorial.vehicle`` as your new topic by replacing the term ``TOPIC_SHOULD_BE_CHANGED``.
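+
+The protocol definition for this tutorial then looks as follows:
+
+```java
+.protocol(Protocols.kafka("localhost", 9092, "org.streampipes.tutorial.vehicle"))
+```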
+
+In this example, we defined that the data stream consists of events in a JSON format and that Kafka is used as a message broker to transmit events.
+The final ``build()`` method call triggers the construction of the RDF-based data stream definition.
+
+That's it! In the next section, we will connect the data stream to a source and inspect the generated RDF description.
+
+## Creating some dummy data
+
+Let's assume our stream should produce some random values that are sent to StreamPipes. We'll add a very simple data simulator to the ``executeStream`` method as follows:
+
+```java
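+// Assumed imports for this snippet (package names follow the 0.66.0 SDK; verify them in your IDE):
+// import org.streampipes.messaging.kafka.SpKafkaProducer;
+// import com.google.gson.JsonObject;
+// import java.util.Random;
+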
+@Override
+  public void executeStream() {
+
+    SpKafkaProducer producer = new SpKafkaProducer("localhost:9092", "TOPIC_SHOULD_BE_CHANGED");
+    Random random = new Random();
+    Runnable runnable = new Runnable() {
+      @Override
+      public void run() {
+        for (;;) {
+          JsonObject jsonObject = new JsonObject();
+          jsonObject.addProperty("timestamp", System.currentTimeMillis());
+          jsonObject.addProperty("plateNumber", "KA-FZ 1");
+          jsonObject.addProperty("latitude", random.nextDouble());
+          jsonObject.addProperty("longitude", random.nextDouble());
+
+          producer.publish(jsonObject.toString());
+
+          try {
+            Thread.sleep(1000);
+          } catch (InterruptedException e) {
+            e.printStackTrace();
+          }
+
+        }
+      }
+    };
+
+    new Thread(runnable).start();
+  }
+```
+
+Change the topic and the URL of your Kafka broker so that they match the values defined in the stream declaration above.
+
+## Adding a source description
+
+A data source can be seen as a container for a set of data streams. Usually, a data source groups event streams that are logically or physically connected.
+For instance, in our example we would add other streams produced by vehicle sensors (such as fuel consumption) to the same data source description.
+
+Open the class `DataSource` which should look as follows:
+```java
+
+package my.groupId.pe.mypackagename;
+
+import org.streampipes.container.declarer.DataStreamDeclarer;
+import org.streampipes.container.declarer.SemanticEventProducerDeclarer;
+import org.streampipes.model.graph.DataSourceDescription;
+import org.streampipes.sdk.builder.DataSourceBuilder;
+
+import java.util.Arrays;
+import java.util.List;
+
+
+public class DataSource implements SemanticEventProducerDeclarer {
+
+  public DataSourceDescription declareModel() {
+    return DataSourceBuilder.create("my.groupId.mypackagename.source", "MySource " +
+        "Source", "")
+            .build();
+  }
+
+  public List<DataStreamDeclarer> getEventStreams() {
+    return Arrays.asList(new MySourceStream());
+  }
+}
+```
+First, we need to define the source. Similar to data streams, a source consists of an id, a human-readable name and a description.
+Replace the content defined in the `declareModel` method with the following code:
+```java
+return DataSourceBuilder.create("org.streampipes.tutorial.source.vehicle", "Vehicle Source", "A data source that " +
+    "holds event streams produced by vehicles.")
+    .build();
+```
+
+## Preparing the container
+
+The final step is to define the deployment type of our new data source. In this tutorial, we will create a so-called `StandaloneModelSubmitter`.
+This client will start an embedded web server that provides the description of our data source.
+
+Go to the class `Init`, which extends `StandaloneModelSubmitter` and should look as follows:
+```java
+package my.groupId.main;
+
+import org.streampipes.container.init.DeclarersSingleton;
+import org.streampipes.container.standalone.init.StandaloneModelSubmitter;
+import my.groupId.config.Config;
+import my.groupId.pe.mypackagename.DataSource;
+
+public class Init extends StandaloneModelSubmitter {
+
+  public static void main(String[] args) throws Exception {
+    DeclarersSingleton.getInstance()
+            .add(new DataSource());
+
+    new Init().init(Config.INSTANCE);
+
+  }
+}
+```
+This code registers the `DataSource` class we have just created. Finally, the `init` method is called,
+which triggers the generation of the corresponding RDF description and the startup of the web server.
+
+<div class="admonition info">
+<div class="admonition-title">Info</div>
+<p>In the example above, we make use of a class `Config`.
+       This class contains both mandatory and additional configuration parameters required by a pipeline element container.
+       These values are stored in the Consul-based key-value store of your StreamPipes installation.
+       The SDK guide contains a detailed manual on managing container configurations.</p>
+</div>
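+
+To illustrate, the following is a condensed sketch of what such a `Config` class typically looks like (an assumption based on the archetype-generated code, not a verbatim copy; method, key and package names may differ in your project):
+
+```java
+import org.streampipes.config.SpConfig;
+import org.streampipes.container.model.PeConfig;
+
+public enum Config implements PeConfig {
+  INSTANCE;
+
+  private SpConfig config;
+
+  Config() {
+    // Registers default values in the Consul-based key-value store of StreamPipes
+    config = SpConfig.getSpConfig("pe/my.groupId");
+    config.register("SP_HOST", "localhost", "Hostname of the pipeline element container");
+    config.register("SP_PORT", 8090, "Port of the pipeline element container");
+    config.register("SP_ID", "pe/my.groupId", "Id of the pipeline element container");
+    config.register("SP_NAME", "My Source", "Name of the pipeline element container");
+  }
+
+  @Override
+  public String getHost() {
+    return config.getString("SP_HOST");
+  }
+
+  @Override
+  public int getPort() {
+    return config.getInteger("SP_PORT");
+  }
+
+  @Override
+  public String getId() {
+    return config.getString("SP_ID");
+  }
+
+  @Override
+  public String getName() {
+    return config.getString("SP_NAME");
+  }
+}
+```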
+
+## Starting the container
+
+<div class="admonition tip">
+<div class="admonition-title">Tip</div>
+<p>By default, the container registers itself using the hostname later used by the Docker container, leading to a 404 error when you try to access an RDF description.
+       For local development, we provide an environment file in the ``development`` folder. You can add your hostname here, which will override settings from the Config class.
+       For instance, use the IntelliJ ``EnvFile`` plugin to automatically provide the environment variables upon start.
+</p>
+</div>
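+
+For example, such an env file could contain the following entries (the variable names are assumptions based on the generated project; check the file in your ``development`` folder):
+
+```
+SP_HOST=localhost
+SP_PORT=8090
+```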
+
+Now we are ready to start our first container!
+
+Execute the main method in the class `Init` we've just created, open a web browser and navigate to http://localhost:8090 (or change the port according to the value of the ``SP_PORT`` variable in the env file).
+
+You should see something as follows:
+
+<img src="/docs/img/tutorial-sources/pe-overview.PNG" alt="Pipeline Element Container Overview">
+
+Click on the link of the data source to see the RDF description of the pipeline element.
+
+<img src="/docs/img/tutorial-sources/pe-rdf.PNG" alt="Pipeline Element RDF description">
+
+The container automatically registers itself in the Consul installation of StreamPipes.
+To install the just created element, open the StreamPipes UI and follow the manual provided in the [user guide](user-guide-introduction).
+
+## Read more
+
+Congratulations! You've just created your first pipeline element for StreamPipes.
+There are many more things to explore and data sources can be defined in much more detail.
+Follow our [SDK guide](dev-guide-source-sdk) to see what's possible!
\ No newline at end of file
diff --git a/documentation/website/versioned_docs/version-0.66.0/pipeline-elements.md b/documentation/website/versioned_docs/version-0.66.0/pipeline-elements.md
new file mode 100644
index 0000000..2d2f8f8
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/pipeline-elements.md
@@ -0,0 +1,8 @@
+---
+id: version-0.66.0-pipeline-elements
+title: Overview
+sidebar_label: Overview
+original_id: pipeline-elements
+---
+
+<div class="pe-grid-container"><div class="pe-container-item pe-container-item-processor"><div class="pe-container-item-header"><div class="pe-container-item-icon pe-icon-processor"><img class="pe-icon" src="/docs/img/pipeline-elements/org.apache.streampipes.processors.aggregation.flink.aggregation/icon.png"></div><div class="pe-container-item-header-pe"><div class="pe-container-item-label pe-container-item-label-processor">Data Processor</div><div class="pe-container-item-label-name">Ag [...]
\ No newline at end of file
diff --git a/documentation/website/versioned_docs/version-0.66.0/user-guide-installation.md b/documentation/website/versioned_docs/version-0.66.0/user-guide-installation.md
new file mode 100644
index 0000000..e0b1dd4
--- /dev/null
+++ b/documentation/website/versioned_docs/version-0.66.0/user-guide-installation.md
@@ -0,0 +1,140 @@
+---
+id: version-0.66.0-user-guide-installation
+title: Installation
+sidebar_label: Installation
+original_id: user-guide-installation
+---
+## Prerequisites
+
+### Software
+
+* Docker (latest version, see instructions below)
+* Docker Compose (latest version, see instructions below)
+
+### Supported operating systems
+We rely on Docker and support three operating systems for the StreamPipes system:
+
+* Linux
+* OSX
+* Windows 10
+    * Please note that older Windows versions are not compatible with Docker. Linux VMs running under Windows might also not work, due to networking problems with Docker.
+
+### Web Browser
+StreamPipes is a modern web application; therefore, you need a recent version of Chrome (recommended), Firefox or Edge.
+
+### Docker
+You need to have Docker installed on your system before you continue with the installation guide.
+
+
+<div class="admonition info">
+<div class="admonition-title">Install Docker</div>
+<p>Go to https://docs.docker.com/installation/ and follow the instructions to install Docker for your OS. Make sure Docker can be started as a non-root user (described in the installation manual; don't forget to log out and in again) and check that Docker is installed correctly by executing <code>docker run hello-world</code>.</p>
+</div>
+
+<div class="admonition info">
+<div class="admonition-title">Configure Docker</div>
+<p>By default, Docker uses only a limited number of CPU cores and memory.
+       If you run StreamPipes on Windows or on a Mac, you need to adjust the default settings.
+       To do that, click on the Docker icon in your menu bar (macOS) or system tray (Windows) and open the preferences.
+       Go to the advanced preferences and set the <b>number of CPUs to 6</b> (recommended) and the <b>memory to 4 GB</b>.
+       After changing the settings, Docker needs to be restarted.</p></div>
+
+
+## Install StreamPipes
+
+<div class="tab-content" id="myTabContent">
+    <div class="tab-pane fade show active" id="linux" role="tabpanel" aria-labelledby="linux-tab">
+        <ul style="padding-left:0">
+            <li class="installation-step">
+                <div class="wrapper-container" style="align-items: center;justify-content: center;">
+                    <div class="wrapper-step">
+                        <span class="fa-stack fa-2x">
+                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
+                             <strong class="fa-stack-1x" style="color:white;">1</strong>
+                        </span>
+                    </div>
+                    <div class="wrapper-instruction">
+                        <a href="https://www.apache.org/dyn/mirrors/mirrors.cgi?action=download&filename=incubator/streampipes/installer/0.66.0/apache-streampipes-installer-0.66.0-incubating-source-release.zip">Download</a>
+                        the latest Apache StreamPipes release and extract the zip file to a directory of your choice.
+                    </div>
+                </div>
+            </li>
+            <li class="installation-step">
+                <div class="wrapper-container" style="align-items: center;justify-content: center;">
+                    <div class="wrapper-step">
+                        <span class="fa-stack fa-2x">
+                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
+                             <strong class="fa-stack-1x" style="color:white;">2</strong>
+                        </span>
+                    </div>
+                    <div class="wrapper-instruction">
+                       <div style="margin-bottom:5px;"><b>Linux/Mac:</b> In a command prompt, open the folder <code>installer/osx_linux</code> and run <code>./streampipes
+                            start</code>.<br/>
+                        </div>
+                        <div style="margin-top:5px;">
+                        <b>Windows 10:</b> In a command prompt, open the folder <code>installer/windows10</code> and run <code>streampipes.bat
+                                                    start</code>.<br/>
+                        </div>
+                    </div>
+                </div>
+            </li>
+            <li class="installation-step">
+                <div class="wrapper-container" style="align-items: center;justify-content: center;">
+                    <div class="wrapper-step">
+                        <span class="fa-stack fa-2x">
+                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
+                             <strong class="fa-stack-1x" style="color:white;">3</strong>
+                        </span>
+                    </div>
+                    <div class="wrapper-instruction">
+                       When asked, enter the version (full or lite).
+                    </div>
+                </div>
+            </li>
+            <li class="installation-step">
+                <div class="wrapper-container" style="align-items: center;justify-content: center;">
+                    <div class="wrapper-step">
+                        <span class="fa-stack fa-2x">
+                             <i class="fas fa-circle fa-stack-2x sp-color-green"></i>
+                             <strong class="fa-stack-1x" style="color:white;">4</strong>
+                        </span>
+                    </div>
+                    <div class="wrapper-instruction">
+                        Open your browser, navigate to http://localhost:80 (or the domain name of your server) and finish the setup according to the instructions below.
+                    </div>
+                </div>
+            </li>
+        </ul>
+        </div>
+    </div>
+
+## Setup StreamPipes
+
+Once you've opened the browser at the URL given above, you should see the StreamPipes application as shown below.
+To set up the system, enter an email address and a password and click on install.
+At this point, it is not necessary to change anything in the advanced settings menu.
+The installation might take some time; once all components are successfully configured, continue by clicking on "Go to login page".
+
+
+On the login page, enter your credentials, then you should be forwarded to the home page.
+
+Congratulations! You've successfully managed to install StreamPipes. Now we're ready to build our first pipeline!
+
+<div class="my-carousel">
+    <img src="/docs/img/quickstart/setup/01_register_user.png" alt="Set Up User">
+    <img src="/docs/img/quickstart/setup/02_user_set_up.png" alt="SetUp StreamPipes Components">
+    <img src="/docs/img/quickstart/setup/03_login.png" alt="Go to login page">
+    <img src="/docs/img/quickstart/setup/04_home.png" alt="Home page">
+</div>
+
+<div class="admonition error">
+<div class="admonition-title">Errors during the installation process</div>
+<p>In most cases, errors during the installation are due to an under-powered system.<br/>
+If there is a problem with any of the components, please restart the whole system and delete the "config" directory on the server.
+   This directory is in the same folder as the docker-compose.yml file.<br/>
+   Please also make sure that your system meets the resource requirements (CPU cores and memory) mentioned in the Configure Docker section above.</p>
+</div>
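+
+A possible recovery sequence could look as follows (a sketch only; it assumes the installer script also offers a ``stop`` command and is run from the installer folder containing the docker-compose.yml file):
+
+```
+./streampipes stop      # Windows 10: streampipes.bat stop
+rm -rf ./config         # removes the generated configuration
+./streampipes start
+```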
+
+## Next Steps
+
+Now you can continue with the tutorial on page [First steps](user-guide-first-steps).
diff --git a/documentation/website/versions.json b/documentation/website/versions.json
index dc16fdc..e132de1 100644
--- a/documentation/website/versions.json
+++ b/documentation/website/versions.json
@@ -1,4 +1,5 @@
 [
+  "0.66.0",
   "0.65.0-pre-asf",
   "0.64.0-pre-asf",
   "0.63.0-pre-asf",