Posted to commits@devlake.apache.org by yu...@apache.org on 2022/08/26 06:35:01 UTC

[incubator-devlake-website] branch main updated (86978e9c -> 0efe1466)

This is an automated email from the ASF dual-hosted git repository.

yumeng pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/incubator-devlake-website.git


    from 86978e9c docs: use config-ui url instead of devlake to avoid confusion
     new 2627440d issue-172: proof-read webpages and make adjustments
     new d93a40f3 DB migrations extra docs added
     new 3da9a0d3 enhanced e2e test guide
     new 503f7f57 enhancements to plugin impl doc
     new d90f70be enhancement to Dev setup doc
     new 0efe1466 enhanced GettingStarted manuals

The 6 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.


Summary of changes:
 docs/DeveloperManuals/DBMigration.md          | 16 +++++++++++++
 docs/DeveloperManuals/DeveloperSetup.md       | 12 +++++++---
 docs/DeveloperManuals/E2E-Test-Guide.md       | 31 +++++++++++++++++--------
 docs/DeveloperManuals/PluginImplementation.md | 33 +++++++++++++++------------
 docs/GettingStarted/DockerComposeSetup.md     |  4 ++--
 docs/GettingStarted/HelmSetup.md              |  1 +
 docs/GettingStarted/KubernetesSetup.md        |  1 +
 docs/GettingStarted/TemporalSetup.md          |  4 ++--
 docs/UserManuals/ConfigUI/GitHub.md           |  2 +-
 9 files changed, 72 insertions(+), 32 deletions(-)


[incubator-devlake-website] 05/06: enhancement to Dev setup doc


commit d90f70be78bf1535212361d8f9d6ef2eecc4793b
Author: Keon Amini <ke...@skael.com>
AuthorDate: Thu Aug 25 19:57:54 2022 -0500

    enhancement to Dev setup doc
---
 docs/DeveloperManuals/DeveloperSetup.md | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/docs/DeveloperManuals/DeveloperSetup.md b/docs/DeveloperManuals/DeveloperSetup.md
index 88e9470d..ef7ffa2a 100644
--- a/docs/DeveloperManuals/DeveloperSetup.md
+++ b/docs/DeveloperManuals/DeveloperSetup.md
@@ -16,6 +16,10 @@ sidebar_position: 1
   - Ubuntu: `sudo apt-get install build-essential libssl-dev`
 
 ## How to setup dev environment
+
+This guide walks you through running local config-ui and devlake servers against dockerized
+MySQL and Grafana containers.
+
 1. Navigate to where you would like to install this project and clone the repository:
 
    ```sh
@@ -66,11 +70,13 @@ sidebar_position: 1
 
     Q: I got an error saying: `libgit2.so.1.3: cannot open share object file: No such file or directory`
 
-    A: Make sure your program can find `libgit2.so.1.3`. `LD_LIBRARY_PATH` can be assigned like this if your `libgit2.so.1.3` is located at `/usr/local/lib`:
+    A: This library is needed by the git-extractor plugin. Make sure your program can find `libgit2.so.1.3`. `LD_LIBRARY_PATH` can be assigned like this if your `libgit2.so.1.3` is located at `/usr/local/lib`:
 
     ```sh
     export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
     ```
+   
+    Note that the version has to be pinned to 1.3.0. If you don't have it, you may need to build it manually with CMake from [source](https://github.com/libgit2/libgit2/releases/tag/v1.3.0).
 
 8. Visit config UI at `localhost:4000` to configure data connections.
     - Please follow the [tutorial](UserManuals/ConfigUI/Tutorial.md)


[incubator-devlake-website] 04/06: enhancements to plugin impl doc


commit 503f7f573f11f171667f4ba1af00df1b1508049f
Author: Keon Amini <ke...@skael.com>
AuthorDate: Thu Aug 25 19:44:33 2022 -0500

    enhancements to plugin impl doc
---
 docs/DeveloperManuals/PluginImplementation.md | 33 +++++++++++++++------------
 1 file changed, 18 insertions(+), 15 deletions(-)

diff --git a/docs/DeveloperManuals/PluginImplementation.md b/docs/DeveloperManuals/PluginImplementation.md
index 7fc58b05..bcc6599f 100644
--- a/docs/DeveloperManuals/PluginImplementation.md
+++ b/docs/DeveloperManuals/PluginImplementation.md
@@ -128,7 +128,7 @@ Before we start, it is helpful to know how collection task is executed:
 > ```
 > More info at: https://devlake.apache.org/blog/how-apache-devlake-runs/
 
-#### Step 2.1 Create a sub-task(Collector) for data collection
+#### Step 2.1: Create a sub-task(Collector) for data collection
 
 Let's run `go run generator/main.go create-collector icla committer` and confirm it. This sub-task is activated by registering in `plugin_main.go/SubTaskMetas` automatically.
 
@@ -137,7 +137,7 @@ Let's run `go run generator/main.go create-collector icla committer` and confirm
 > - Collector will collect data from HTTP or other data sources, and save the data into the raw layer. 
 > - Inside the func `SubTaskEntryPoint` of `Collector`, we use `helper.NewApiCollector` to create an object of [ApiCollector](https://github.com/apache/incubator-devlake/blob/main/generator/template/plugin/tasks/api_collector.go-template), then call `execute()` to do the job. 
 
-Now you can notice `data.ApiClient` is inited in `plugin_main.go/PrepareTaskData.ApiClient`. `PrepareTaskData` create a new `ApiClient`, and it's a tool Apache DevLake suggests to request data from HTTP Apis. This tool support some valuable features for HttpApi, like rateLimit, proxy and retry. Of course, if you like, you may use the lib `http` instead, but it will be more tedious.
+Notice that `data.ApiClient` is initialized in `plugin_main.go/PrepareTaskData.ApiClient`. `PrepareTaskData` creates a new `ApiClient`, which is the tool Apache DevLake recommends for requesting data from HTTP APIs. It supports valuable features such as rate limiting, proxies and retries. Of course, you may use the standard `http` lib instead, but it will be more tedious.
 
 Let's move forward to use it.
 
@@ -146,7 +146,7 @@ we have filled `https://people.apache.org/` into `tasks/api_client.go/ENDPOINT`
 
 ![](https://i.imgur.com/q8Zltnl.png)
 
-2. And fill `public/icla-info.json` into `UrlTemplate`, delete unnecessary iterator and add `println("receive data:", res)` in `ResponseParser` to see if collection was successful.
+2. Fill `public/icla-info.json` into `UrlTemplate`, delete the unnecessary iterator and add `println("receive data:", res)` in `ResponseParser` to see if collection was successful.
 
 ![](https://i.imgur.com/ToLMclH.png)
 
@@ -191,7 +191,7 @@ receive data: 272956 /* <- the number means 272956 models received */
 
 ![](https://i.imgur.com/aVYNMRr.png)
 
-#### Step 2.2 Create a sub-task(Extractor) to extract data from the raw layer
+#### Step 2.2: Create a sub-task(Extractor) to extract data from the raw layer
 
 > - Extractor will extract data from raw layer and save it into tool db table.
 > - Except for some pre-processing, the main flow is similar to the collector.
@@ -230,7 +230,7 @@ Next, let's run `go run generator/main.go create-extractor icla committer` and t
 
 ![](https://i.imgur.com/UyDP9Um.png)
 
-Let's look at the function `extract` in `committer_extractor.go` created just now, and some codes need to be written here. It's obviously `resData.data` is raw data, so we could decode them by json and add new `IclaCommitter` to save them.
+Let's look at the function `extract` in `committer_extractor.go` created just now, and the code that needs to be written here. It's obvious that `resData.data` is the raw data, so we can json-decode each row, create a new `IclaCommitter` for each and save them.
 ```go
 Extract: func(resData *helper.RawData) ([]interface{}, error) {
     names := &map[string]string{}
@@ -268,14 +268,17 @@ receive data: 272956
 Now committer data have been saved in _tool_icla_committer.
 ![](https://i.imgur.com/6svX0N2.png)
 
-#### Step 2.3 Convertor
+#### Step 2.3: Convertor
 
-Notes: There are two ways here (open source or using it yourself). It is unnecessary, but we encourage it because convertors and the domain layer will significantly help build dashboards. More info about the domain layer at: https://devlake.apache.org/docs/DataModels/DevLakeDomainLayerSchema/
+Notes: The goal of Converters is to create a vendor-agnostic model out of the vendor-dependent ones created by the Extractors. 
+They are not strictly necessary, but we encourage writing them because converters and the domain layer significantly help with building dashboards. More info about the domain layer [here](https://devlake.apache.org/docs/DataModels/DevLakeDomainLayerSchema).
+
+In short:
 
 > - Convertor will convert data from the tool layer and save it into the domain layer.
 > - We use `helper.NewDataConverter` to create an object of DataConvertor, then call `execute()`. 
 
-#### Step 2.4 Let's try it
+#### Step 2.4: Let's try it
 Sometimes OpenApi will be protected by token or other auth types, and we need to log in to gain a token to visit it. For example, only after logging in `private@apahce.com` could we gather the data about contributors signing ICLA. Here we briefly introduce how to authorize DevLake to collect data.
 
 Let's look at `api_client.go`. `NewIclaApiClient` load config `ICLA_TOKEN` by `.env`, so we can add `ICLA_TOKEN=XXXXXX` in `.env` and use it in `apiClient.SetHeaders()` to mock the login status. Code as below:
@@ -285,12 +288,12 @@ Of course, we can use `username/password` to get a token after login mockery. Ju
 
 Look for more related details at https://github.com/apache/incubator-devlake
 
-#### Step 2.5 Implement the GetTablesInfo() method of the PluginModel interface
+#### Step 2.5: Implement the GetTablesInfo() method of the PluginModel interface
 
-As shown in the following gitlab plugin example
-Add all models that need to be accessed by external plugins to the return value.
+As shown in the following gitlab plugin example,
+add all models that need to be accessed by external plugins to the return value.
 
-```golang
+```go
 var _ core.PluginModel = (*Gitlab)(nil)
 
 func (plugin Gitlab) GetTablesInfo() []core.Tabler {
@@ -315,9 +318,9 @@ func (plugin Gitlab) GetTablesInfo() []core.Tabler {
 }
 ```
 
-You can use it as follow
+You can use it as follows:
 
-```
+```go
 if pm, ok := plugin.(core.PluginModel); ok {
     tables := pm.GetTablesInfo()
     for _, table := range tables {
@@ -328,7 +331,7 @@ if pm, ok := plugin.(core.PluginModel); ok {
 ```
 
 #### Final step: Submit the code as open source code
-Good ideas and we encourage contributions~ Let's learn about migration scripts and domain layers to write normative and platform-neutral codes. More info at https://devlake.apache.org/docs/DataModels/DevLakeDomainLayerSchema or contact us for ebullient help.
+We encourage ideas and contributions ~ Let's use migration scripts, domain layers and the other concepts discussed to write normative and platform-neutral code. More info [here](https://devlake.apache.org/docs/DataModels/DevLakeDomainLayerSchema), or contact us for help.
 
 
 ## Done!
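The Convertor step (2.3) is described above only in prose. As a rough, framework-free sketch of the underlying idea (mapping a vendor-specific tool-layer record to a vendor-agnostic domain-layer record), consider the following; the type and field names are illustrative, not DevLake's actual structs, and a real plugin would wire this logic into `helper.NewDataConverter` instead:

```go
package main

import "fmt"

// Tool-layer model (vendor-specific), as produced by the extractor.
// Illustrative stand-in for the plugin's real struct.
type IclaCommitter struct {
	UserName string
	Name     string
}

// Domain-layer model (vendor-agnostic), suitable for dashboards.
type Account struct {
	ID       string
	FullName string
}

// convertCommitter maps one tool-layer row into the domain layer; this
// mapping is the essence of what a converter's callback does.
func convertCommitter(c IclaCommitter) Account {
	return Account{ID: "icla:" + c.UserName, FullName: c.Name}
}

func main() {
	rows := []IclaCommitter{{UserName: "alice", Name: "Alice A."}}
	for _, row := range rows {
		fmt.Printf("%+v\n", convertCommitter(row))
	}
}
```

The real converter additionally reads rows from the tool-layer table and persists the converted records into the domain-layer table via `execute()`.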


[incubator-devlake-website] 06/06: enhanced GettingStarted manuals


commit 0efe1466e2469ba0b8ceb9d48508ff4fe27acd48
Author: Keon Amini <ke...@skael.com>
AuthorDate: Thu Aug 25 20:16:38 2022 -0500

    enhanced GettingStarted manuals
---
 docs/GettingStarted/DockerComposeSetup.md | 4 ++--
 docs/GettingStarted/HelmSetup.md          | 1 +
 docs/GettingStarted/KubernetesSetup.md    | 1 +
 docs/GettingStarted/TemporalSetup.md      | 4 ++--
 docs/UserManuals/ConfigUI/GitHub.md       | 2 +-
 5 files changed, 7 insertions(+), 5 deletions(-)

diff --git a/docs/GettingStarted/DockerComposeSetup.md b/docs/GettingStarted/DockerComposeSetup.md
index bde5053e..c17bb9f5 100644
--- a/docs/GettingStarted/DockerComposeSetup.md
+++ b/docs/GettingStarted/DockerComposeSetup.md
@@ -16,13 +16,13 @@ sidebar_position: 1
 - Commands written `like this` are to be run in your terminal.
 
 1. Download `docker-compose.yml` and `env.example` from [latest release page](https://github.com/apache/incubator-devlake/releases/latest) into a folder.
-2. Rename `env.example` to `.env`. For Mac/Linux users, please run `mv env.example .env` in the terminal.
+2. Rename `env.example` to `.env`. For Mac/Linux users, please run `mv env.example .env` in the terminal. This file contains the environment variables that the DevLake server will use. Additional ones can be found in the compose file(s).
 3. Run `docker-compose up -d` to launch DevLake.
 
 ## Configure and collect data
 
 1. Visit `config-ui` at `http://localhost:4000` in your browser to configure and collect data.
-   - Please follow the [turorial](UserManuals/ConfigUI/Tutorial.md)
+   - Please follow the [tutorial](UserManuals/ConfigUI/Tutorial.md)
    - `devlake` takes a while to fully boot up. if `config-ui` complaining about api being unreachable, please wait a few seconds and try refreshing the page.
 2. Click *View Dashboards* button in the top left when done, or visit `localhost:3002` (username: `admin`, password: `admin`).
    - We use [Grafana](https://grafana.com/) as a visualization tool to build charts for the [data](../SupportedDataSources.md) stored in our database.
diff --git a/docs/GettingStarted/HelmSetup.md b/docs/GettingStarted/HelmSetup.md
index c8747f05..e8920b4a 100644
--- a/docs/GettingStarted/HelmSetup.md
+++ b/docs/GettingStarted/HelmSetup.md
@@ -99,6 +99,7 @@ Some useful parameters for the chart, you could also check them in values.yaml
 | lake.image.repository  | repository for lake's image | mericodev/lake  |
 | lake.image.tag  | image tag for lake's image | latest  |
 | lake.image.pullPolicy  | pullPolicy for lake's image | Always  |
+| lake.loggingDir | the root logging directory of DevLake | /app/logs | 
 | ui.image.repository  | repository for ui's image | mericodev/config-ui  |
 | ui.image.tag  | image tag for ui's image | latest  |
 | ui.image.pullPolicy  | pullPolicy for ui's image | Always  |
diff --git a/docs/GettingStarted/KubernetesSetup.md b/docs/GettingStarted/KubernetesSetup.md
index b074dfa0..f87d5ac1 100644
--- a/docs/GettingStarted/KubernetesSetup.md
+++ b/docs/GettingStarted/KubernetesSetup.md
@@ -27,6 +27,7 @@ We provide a sample [k8s-deploy.yaml](https://github.com/apache/incubator-devlak
      * `ADMIN_USER`/`ADMIN_PASS`: Not required, but highly recommended
    - Settings used by `devlake`:
      * `DB_URL`: update this value if  `MYSQL_USER`, `MYSQL_PASSWORD` or `MYSQL_DATABASE` were changed
+     * `LOGGING_DIR`: the log directory for DevLake - you likely don't need to change it.
 3. The `devlake` deployment store its configuration in `/app/.env`. In our sample yaml, we use `hostPath` volume, so please make sure directory `/var/lib/devlake` exists on your k8s workers, or employ other techniques to persist `/app/.env` file. Please do NOT mount the entire `/app` directory, because plugins are located in `/app/bin` folder.
 4. Finally, execute the following command and DevLake should be up and running:
    ```sh
diff --git a/docs/GettingStarted/TemporalSetup.md b/docs/GettingStarted/TemporalSetup.md
index c5b91c61..58132999 100644
--- a/docs/GettingStarted/TemporalSetup.md
+++ b/docs/GettingStarted/TemporalSetup.md
@@ -31,5 +31,5 @@ But, be careful, many API services like JIRA/GITHUB have a request rate limit me
 
 ### How to setup
 
-1. Clone and fire up  [temporalio](https://temporal.io/) services
-2. Clone this repo, and fire up DevLake with command `docker-compose -f docker-compose-temporal.yml up -d`
\ No newline at end of file
+1. Clone and fire up the [temporalio](https://temporal.io/) services
+2. Clone this repo, and fire up DevLake with the command `docker-compose -f deployment/temporal/docker-compose-temporal.yml up -d`
\ No newline at end of file
diff --git a/docs/UserManuals/ConfigUI/GitHub.md b/docs/UserManuals/ConfigUI/GitHub.md
index d775f7b1..aaae0da2 100644
--- a/docs/UserManuals/ConfigUI/GitHub.md
+++ b/docs/UserManuals/ConfigUI/GitHub.md
@@ -33,7 +33,7 @@ Click `Test Connection`, if the connection is successful, click `Save Connection
 Enter the GitHub repos to collect. If you want to collect more than 1 repo, please separate repos with comma. For example, "apache/incubator-devlake,apache/incubator-devlake-website".
 
 #### Data Entities
-Usually, you don't have to modify this part. However, if you don't want to collect certain GitHub entities, you can unselect some entities to accerlerate the collection speed.
+Usually, you don't have to modify this part. However, if you don't want to collect certain GitHub entities, you can unselect some entities to accelerate the collection speed.
 - Issue Tracking: GitHub issues, issue comments, issue labels, etc.
 - Source Code Management: GitHub repos, refs, commits, etc.
 - Code Review: GitHub PRs, PR comments and reviews, etc.


[incubator-devlake-website] 03/06: enhanced e2e test guide


commit 3da9a0d3d83f79c3841c045e6fbba5d1bae63dd5
Author: Keon Amini <ke...@skael.com>
AuthorDate: Thu Aug 25 19:10:55 2022 -0500

    enhanced e2e test guide
---
 docs/DeveloperManuals/E2E-Test-Guide.md | 31 ++++++++++++++++++++++---------
 1 file changed, 22 insertions(+), 9 deletions(-)

diff --git a/docs/DeveloperManuals/E2E-Test-Guide.md b/docs/DeveloperManuals/E2E-Test-Guide.md
index 5aaa653e..9e28fef1 100644
--- a/docs/DeveloperManuals/E2E-Test-Guide.md
+++ b/docs/DeveloperManuals/E2E-Test-Guide.md
@@ -9,7 +9,8 @@ description: >
 ## Why write E2E tests
 
 E2E testing, as a part of automated testing, generally refers to black-box testing at the file and module level or unit testing that allows the use of some external services such as databases. The purpose of writing E2E tests is to shield some internal implementation logic and see whether the same external input can output the same result in terms of data aspects. In addition, compared to the black-box integration tests, it can avoid some chance problems caused by network and other facto [...]
-In DevLake, E2E testing consists of interface testing and input/output result validation for the plugin Extract/Convert subtask. This article only describes the process of writing the latter.
+In DevLake, E2E testing consists of interface testing and input/output result validation for the plugin Extract/Convert subtasks. This article only describes the process of writing the latter. Because the Collectors invoke external
services, we typically do not write E2E tests for them.
 
 ## Preparing data
 
@@ -17,8 +18,8 @@ Let's take a simple plugin - Feishu Meeting Hours Collection as an example here.
 ![image](https://user-images.githubusercontent.com/3294100/175061114-53404aac-16ca-45d1-a0ab-3f61d84922ca.png)
 Next, we will write the E2E tests of the sub-tasks.
 
-The first step in writing the E2E test is to run the Collect task of the corresponding plugin to complete the data collection, that is, to have the corresponding data saved in the table starting with `_raw_feishu_` in the database.
-Here are the logs and database tables using the DirectRun (cmd) run method.
+The first step in writing the E2E test is to run the Collect task of the corresponding plugin to complete the data collection; that is, to have the corresponding data saved in the table starting with `_raw_feishu_` in the database.
+This data will be presumed to be the "source of truth" for our tests. Here are the logs and database tables using the DirectRun (cmd) run method.
 ```
 $ go run plugins/feishu/main.go --numOfDaysToCollect 2 --connectionId 1 (Note: command may change with version upgrade)
 [2022-06-22 23:03:29] INFO failed to create dir logs: mkdir logs: file exists
@@ -40,9 +41,9 @@ press `c` to send cancel signal
 <img width="993" alt="image" src="https://user-images.githubusercontent.com/3294100/175064505-bc2f98d6-3f2e-4ccf-be68-a1cab1e46401.png"/>
 Ok, the data has now been saved to the `_raw_feishu_*` table, and the `data` column is the return information from the plugin. Here we only collected data for the last 2 days. The data information is not much, but it also covers a variety of situations. That is, the same person has data on different days.
 
-It is also worth mentioning that the plugin runs two tasks, `collectMeetingTopUserItem` and `extractMeetingTopUserItem`, the former is the task of collecting, which is needed to run this time, and the latter is the task of extracting data. It doesn't matter whether it runs in the prepared data session.
+It is also worth mentioning that the plugin runs two tasks, `collectMeetingTopUserItem` and `extractMeetingTopUserItem`. The former is the task of collecting, which is needed to run this time, and the latter is the task of extracting data. It doesn't matter whether the extractor runs in the prepared data session.
 
-Next, we need to export the data to .csv format. This step is a variety of options. You can show your skills, and I only introduce a few common methods here.
+Next, we need to export the data to .csv format. This step can be done in a variety of different ways - you can show your skills. I will only introduce a few common methods here.
 
 ### DevLake Code Generator Export
 
@@ -116,7 +117,7 @@ func TestMeetingDataFlow(t *testing.T) {
 ```
 The signature of the import function is as follows.
 ```func (t *DataFlowTester) ImportCsvIntoRawTable(csvRelPath string, rawTableName string)```
-He has a twin, with only slight differences in parameters.
+It has a twin, with only slight differences in parameters.
 ```func (t *DataFlowTester) ImportCsvIntoTabler(csvRelPath string, dst schema.Tabler)```
 The former is used to import tables in the raw layer. The latter is used to import arbitrary tables.
 **Note:** These two functions will delete the db table and use `gorm.AutoMigrate` to re-create a new table to clear data in it.
@@ -158,7 +159,6 @@ func TestMeetingDataFlow(t *testing.T) {
     dataflowTester.VerifyTable(
       models.FeishuMeetingTopUserItem{},
       "./snapshot_tables/_tool_feishu_meeting_top_user_items.csv",
-      []string{"connection_id", "start_time", "name"},
       []string{
         "meeting_count",
         "meeting_duration",
@@ -171,9 +171,22 @@ func TestMeetingDataFlow(t *testing.T) {
     )
 }
 ```
-Its purpose is to call `dataflowTester.VerifyTable` to complete the validation of the data results. The third parameter is the table's primary keys, and the fourth parameter is all the fields of the table that need to be verified. The data used for validation exists in `. /snapshot_tables/_tool_feishu_meeting_top_user_items.csv`, but of course, this file does not exist yet.
+Its purpose is to call `dataflowTester.VerifyTable` to complete the validation of the data results. The third parameter is all the fields of the table that need to be verified. 
+The data used for validation exists in `./snapshot_tables/_tool_feishu_meeting_top_user_items.csv`, but of course, this file does not exist yet.
 
-To facilitate the generation of the file mentioned above, DevLake has adopted a testing technique called `Snapshot`, which will automatically generate the file based on the run results when the `VerifyTable` file is called without the csv existing.
+There is a more generalized twin function that can be used instead:
+```go
+dataflowTester.VerifyTableWithOptions(models.FeishuMeetingTopUserItem{},
+    e2ehelper.TableOptions{
+        CSVRelPath: "./snapshot_tables/_tool_feishu_meeting_top_user_items.csv",
+    },
+)
+
+```
+The above usage will default to validating against all fields of the `models.FeishuMeetingTopUserItem` model. There are additional fields on `TableOptions` that can be specified
+to limit which fields of the model are validated.
+
+To facilitate the generation of the file mentioned above, DevLake has adopted a testing technique called `Snapshot`, which will automatically generate the file based on the run results when the `VerifyTable` or `VerifyTableWithOptions` functions are called without the csv existing.
 
 But note! Please do two things after the snapshot is created: 1. check if the file is generated correctly 2. re-run it to make sure there are no errors between the generated results and the re-run results.
These two operations are critical and directly related to the quality of test writing. We should treat the snapshot file in `.csv` format like a code file.


[incubator-devlake-website] 02/06: DB migrations extra docs added


commit d93a40f38eb5891908e36a004891585bdfb6a771
Author: Keon Amini <ke...@skael.com>
AuthorDate: Thu Aug 25 18:45:47 2022 -0500

    DB migrations extra docs added
---
 docs/DeveloperManuals/DBMigration.md | 16 ++++++++++++++++
 1 file changed, 16 insertions(+)

diff --git a/docs/DeveloperManuals/DBMigration.md b/docs/DeveloperManuals/DBMigration.md
index 95302379..53160498 100644
--- a/docs/DeveloperManuals/DBMigration.md
+++ b/docs/DeveloperManuals/DBMigration.md
@@ -18,17 +18,33 @@ When DevLake starts, scripts register themselves to the framework by invoking th
 
 ```go
 type Script interface {
+	// this function contains the business logic of the migration (e.g. DDL logic)
 	Up(ctx context.Context, db *gorm.DB) error
+	// the version number of the migration, typically in date format (YYYYMMDDHHMMSS), e.g. 20220728000001; migrations are executed sequentially based on this number
 	Version() uint64
+	// The name of this migration
 	Name() string
 }
 ```
 
+## Migration Model
+
+For each migration we define a "snapshot" datamodel of the model that we wish to perform the migration on.
+The fields on this model shall be identical to the actual model, but unlike the actual one, this one will
+never change in the future. The naming convention of these models is `<ModelName>YYYYMMDD` and they must implement
+the `func TableName() string` method; they are consumed by the `Script::Up` method.
+
 ## Table `migration_history`
 
 The table tracks migration scripts execution and schemas changes.
 From which, DevLake could figure out the current state of database schemas.
 
+## Execution
+
+Each plugin has a `migrationscripts` subpackage that lists all the migrations to be executed for that plugin. You
+will need to add your migration to that list for the framework to pick it up. Similarly, there is such a package
+for the framework-only migrations defined under the `models` package.
+
 
 ## How It Works
 1. Check `migration_history` table, calculate all the migration scripts need to be executed.
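Putting the conventions above together, here is a minimal, framework-free sketch of a snapshot model and version-ordered execution. All names are illustrative, and the interface is simplified: the real `Script` also has `Up(ctx context.Context, db *gorm.DB) error` containing the DDL logic:

```go
package main

import (
	"fmt"
	"sort"
)

// Snapshot model: a frozen copy of the real model at migration time,
// named <ModelName>YYYYMMDD and implementing TableName().
type User20220728 struct {
	ID   uint64
	Name string
}

func (User20220728) TableName() string { return "users" }

// Simplified stand-in for the framework's migration Script interface.
type script interface {
	Version() uint64
	Name() string
}

type addUserName struct{}

func (addUserName) Version() uint64 { return 20220728000001 }
func (addUserName) Name() string    { return "add name column to users" }

type seedUsers struct{}

func (seedUsers) Version() uint64 { return 20220728000002 }
func (seedUsers) Name() string    { return "seed users table" }

func main() {
	// A plugin's migrationscripts package would list these; the framework
	// then runs the pending ones in ascending Version() order.
	scripts := []script{seedUsers{}, addUserName{}}
	sort.Slice(scripts, func(i, j int) bool { return scripts[i].Version() < scripts[j].Version() })
	for _, s := range scripts {
		fmt.Printf("%d %s\n", s.Version(), s.Name())
	}
	fmt.Println(User20220728{}.TableName())
}
```

Because the snapshot model never changes, a migration written against `User20220728` stays valid even after the live `User` model evolves.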


[incubator-devlake-website] 01/06: issue-172: proof-read webpages and make adjustments


commit 2627440d36e3dacb5ccef70f08bec11fd21e7885
Author: Keon Amini <ke...@skael.com>
AuthorDate: Mon Aug 22 18:20:14 2022 -0500

    issue-172: proof-read webpages and make adjustments
---
 docs/DeveloperManuals/DeveloperSetup.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/DeveloperManuals/DeveloperSetup.md b/docs/DeveloperManuals/DeveloperSetup.md
index a3f56c57..88e9470d 100644
--- a/docs/DeveloperManuals/DeveloperSetup.md
+++ b/docs/DeveloperManuals/DeveloperSetup.md
@@ -1,7 +1,7 @@
 ---
 title: "Developer Setup"
 description: >
-  The steps to install DevLake in develper mode.
+  The steps to install DevLake in developer mode.
 sidebar_position: 1
 ---
 
@@ -9,7 +9,7 @@ sidebar_position: 1
 ## Requirements
 
 - <a href="https://docs.docker.com/get-docker" target="_blank">Docker v19.03.10+</a>
-- <a href="https://golang.org/doc/install" target="_blank">Golang v1.17+</a>
+- <a href="https://golang.org/doc/install" target="_blank">Golang v1.19+</a>
 - Make
   - Mac (Already installed)
   - Windows: [Download](http://gnuwin32.sourceforge.net/packages/make.htm)