Posted to commits@devlake.apache.org by "mindlesscloud (via GitHub)" <gi...@apache.org> on 2023/03/02 11:18:57 UTC

[GitHub] [incubator-devlake-website] mindlesscloud opened a new pull request, #453: doc: update the doc of plugin customize

mindlesscloud opened a new pull request, #453:
URL: https://github.com/apache/incubator-devlake-website/pull/453

   # Summary
   Update the documentation of the customize plugin.
   The documentation of the old API has been updated, and new documents have been added for the new APIs.
   ### Does this close any open issues?
   Closes [#3840](https://github.com/apache/incubator-devlake/issues/3840)
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@devlake.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [incubator-devlake-website] mindlesscloud commented on a diff in pull request #453: doc: update the doc of plugin customize

Posted by "mindlesscloud (via GitHub)" <gi...@apache.org>.
mindlesscloud commented on code in PR #453:
URL: https://github.com/apache/incubator-devlake-website/pull/453#discussion_r1124331141


##########
docs/Plugins/customize.md:
##########
@@ -11,10 +11,17 @@ description: >
 This plugin provides users the ability to create/delete columns and extract data from certain raw layer tables.

Review Comment:
   Not only the new columns but all columns can get new values in this way. I will change the text to:
   
   Insert values to certain columns with data extracted from some raw layer tables
   
   Does it sound good to you?
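   
   (For anyone trying this while reviewing: the columns that can receive values this way can be inspected with the list endpoint documented further down. A minimal curl sketch, assuming the API is reachable at the `http://localhost:8080` address used elsewhere in customize.md:)
   
   ```shell
   # List all columns of the domain-layer `issues` table, including any customized x_ columns
   curl 'http://localhost:8080/plugins/customize/issues/fields'
   ```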





[GitHub] [incubator-devlake-website] mindlesscloud merged pull request #453: doc: update the doc of plugin customize

Posted by "mindlesscloud (via GitHub)" <gi...@apache.org>.
mindlesscloud merged PR #453:
URL: https://github.com/apache/incubator-devlake-website/pull/453




[GitHub] [incubator-devlake-website] mindlesscloud commented on a diff in pull request #453: doc: update the doc of plugin customize

Posted by "mindlesscloud (via GitHub)" <gi...@apache.org>.
mindlesscloud commented on code in PR #453:
URL: https://github.com/apache/incubator-devlake-website/pull/453#discussion_r1124325634


##########
docs/Plugins/customize.md:
##########
@@ -75,25 +82,129 @@ curl 'http://localhost:8080/pipelines' \
 }
 '
 ```
-Get all extension columns(start with `x_`) of the table `issues`
+
+### List Columns
+Get all columns of the table `issues`
 > GET /plugins/customize/issues/fields
 
+**NOTE** some fields are omitted in the following example
 response
 ```json
 [
-    {
-        "columnName": "x_test",
-        "columnType": "VARCHAR(255)"
-    }
+  {
+    "columnName": "id",
+    "displayName": "",
+    "dataType": "varchar(255)",
+    "description": ""
+  },
+  {
+    "columnName": "created_at",
+    "displayName": "",
+    "dataType": "datetime(3)",
+    "description": ""
+  },
+  {
+    "columnName": "x_time",
+    "displayName": "time",
+    "dataType": "timestamp",
+    "description": "test for time"
+  },
+  {
+    "columnName": "x_int",
+    "displayName": "bigint",
+    "dataType": "bigint",
+    "description": "test for int"
+  },
+  {
+    "columnName": "x_float",
+    "displayName": "float",
+    "dataType": "float",
+    "description": "test for float"
+  },
+  {
+    "columnName": "x_text",
+    "displayName": "text",
+    "dataType": "text",
+    "description": "test for text"
+  },
+  {
+    "columnName": "x_varchar",
+    "displayName": "varchar",
+    "dataType": "varchar(255)",
+    "description": "test for varchar"
+  }
 ]
+
 ```
-Create extension column `x_test` for the table `issues`
+
+### Create A Customized Column
+Create a new column `x_abc` with datatype `varchar(255)` for the table `issues`.
+The value of `columnName` must start with `x_` and consist of no more than 50 alphanumerics and underscores
+The value of field `dataType` must be one of the following 5 types:
+- varchar(255)
+- text
+- bigint
+- float
+- timestamp 
 
 > POST /plugins/customize/issues/fields
 ```json
 {
-    "name": "x_test"
+  "columnName": "x_abc",
+  "displayName": "ABC",
+  "dataType": "varchar(255)",
+  "description": "test field"
 }
 ```
+
+### Drop A Column
 Drop the column `x_text` for the table `issues`
+
 > DELETE /plugins/customize/issues/fields/x_test
+
+### Upload `issues.csv` file
+
+> POST /plugins/customize/csvfiles/issues.csv
+
+The HTTP `Content-Type` must be  `multipart/form-data`, and the form should have three fields:
+
+- `file` the CSV file
+- `boardId` It will be written to the `id` field of the `boards` table, the `board_id` field of `board_issues`, and the `_raw_data_params` field of `issues`
+- `boardName` It will be written to the `name` field of the `boards` table
+
+Upload a CSV file and import it to the `issues` table via this API. There should be no extra fields in the file except the `labels` field, 
+and if the field value is `NULL`, it should be `NULL` in the CSV instead of the empty string.
+DevLake will parse the CSV file and store it in the `issues` table, where the `labels` are stored in the `issue_labels` table. 
+If the `boardId` does not appear, a new record will be created in the boards table. The `board_issues` table will be updated at the same time as the import.
+The following is an issues.CSV file sample:
+
+|id                           |created_at             |updated_at             |_raw_data_params     |_raw_data_table          |_raw_data_id|_raw_data_remark|url                                                                 |icon_url|issue_key|title        |description                      |epic_key|type |status|original_status|story_point|resolution_date|created_date                 |updated_date                 |parent_issue_id|priority|original_estimate_minutes|time_spent_minutes|time_remaining_minutes|creator_id                                           |creator_name|assignee_id                                          |assignee_name|severity|component|lead_time_minutes|original_project|original_type|x_int         |x_time             |x_varchar|x_float|labels              |

Review Comment:
   No, they are set by the code. Usually, though, the ORM framework sets them up automatically.





[GitHub] [incubator-devlake-website] Startrekzky commented on a diff in pull request #453: doc: update the doc of plugin customize

Posted by "Startrekzky (via GitHub)" <gi...@apache.org>.
Startrekzky commented on code in PR #453:
URL: https://github.com/apache/incubator-devlake-website/pull/453#discussion_r1124229993


##########
docs/Plugins/customize.md:
##########
@@ -11,10 +11,17 @@ description: >
 This plugin provides users the ability to create/delete columns and extract data from certain raw layer tables.
 The columns created with this plugin must be start with the prefix `x_`
 
-**NOTE:** All columns created by this plugin are of the datatype `VARCHAR(255)`
+**NOTE:** For now, only the following 5 types were supported:

Review Comment:
   Text change: For now, only the following five types are supported:
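   
   (For reference, a minimal curl sketch of the create call this note applies to, using one of the five supported types; it assumes the API is reachable at the `http://localhost:8080` address used elsewhere in customize.md, and the sample values are purely illustrative:)
   
   ```shell
   # Create a customized column `x_abc` of type varchar(255) on the `issues` table
   curl 'http://localhost:8080/plugins/customize/issues/fields' \
     -X POST \
     -H 'Content-Type: application/json' \
     -d '{
       "columnName": "x_abc",
       "displayName": "ABC",
       "dataType": "varchar(255)",
       "description": "test field"
     }'
   ```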



##########
docs/Plugins/customize.md:
##########
@@ -11,10 +11,17 @@ description: >
 This plugin provides users the ability to create/delete columns and extract data from certain raw layer tables.

Review Comment:
   This plugin provides users the ability to: 
   - Add/delete columns in domain layer tables[URL]
   - Insert values to the new columns with data extracted from certain raw layer tables



##########
docs/Plugins/customize.md:
##########
@@ -75,25 +82,129 @@ curl 'http://localhost:8080/pipelines' \
 }
 '
 ```
-Get all extension columns(start with `x_`) of the table `issues`
+
+### List Columns
+Get all columns of the table `issues`
 > GET /plugins/customize/issues/fields
 
+**NOTE** some fields are omitted in the following example
 response
 ```json
 [
-    {
-        "columnName": "x_test",
-        "columnType": "VARCHAR(255)"
-    }
+  {
+    "columnName": "id",
+    "displayName": "",
+    "dataType": "varchar(255)",
+    "description": ""
+  },
+  {
+    "columnName": "created_at",
+    "displayName": "",
+    "dataType": "datetime(3)",
+    "description": ""
+  },
+  {
+    "columnName": "x_time",
+    "displayName": "time",
+    "dataType": "timestamp",
+    "description": "test for time"
+  },
+  {
+    "columnName": "x_int",
+    "displayName": "bigint",
+    "dataType": "bigint",
+    "description": "test for int"
+  },
+  {
+    "columnName": "x_float",
+    "displayName": "float",
+    "dataType": "float",
+    "description": "test for float"
+  },
+  {
+    "columnName": "x_text",
+    "displayName": "text",
+    "dataType": "text",
+    "description": "test for text"
+  },
+  {
+    "columnName": "x_varchar",
+    "displayName": "varchar",
+    "dataType": "varchar(255)",
+    "description": "test for varchar"
+  }
 ]
+
 ```
-Create extension column `x_test` for the table `issues`
+
+### Create A Customized Column
+Create a new column `x_abc` with datatype `varchar(255)` for the table `issues`.
+The value of `columnName` must start with `x_` and consist of no more than 50 alphanumerics and underscores
+The value of field `dataType` must be one of the following 5 types:
+- varchar(255)
+- text
+- bigint
+- float
+- timestamp 
 
 > POST /plugins/customize/issues/fields
 ```json
 {
-    "name": "x_test"
+  "columnName": "x_abc",
+  "displayName": "ABC",
+  "dataType": "varchar(255)",
+  "description": "test field"
 }
 ```
+
+### Drop A Column
 Drop the column `x_text` for the table `issues`
+
 > DELETE /plugins/customize/issues/fields/x_test
+
+### Upload `issues.csv` file
+
+> POST /plugins/customize/csvfiles/issues.csv
+
+The HTTP `Content-Type` must be  `multipart/form-data`, and the form should have three fields:
+
+- `file` the CSV file
+- `boardId` It will be written to the `id` field of the `boards` table, the `board_id` field of `board_issues`, and the `_raw_data_params` field of `issues`
+- `boardName` It will be written to the `name` field of the `boards` table
+
+Upload a CSV file and import it to the `issues` table via this API. There should be no extra fields in the file except the `labels` field, 
+and if the field value is `NULL`, it should be `NULL` in the CSV instead of the empty string.
+DevLake will parse the CSV file and store it in the `issues` table, where the `labels` are stored in the `issue_labels` table. 
+If the `boardId` does not appear, a new record will be created in the boards table. The `board_issues` table will be updated at the same time as the import.
+The following is an issues.CSV file sample:
+
+|id                           |created_at             |updated_at             |_raw_data_params     |_raw_data_table          |_raw_data_id|_raw_data_remark|url                                                                 |icon_url|issue_key|title        |description                      |epic_key|type |status|original_status|story_point|resolution_date|created_date                 |updated_date                 |parent_issue_id|priority|original_estimate_minutes|time_spent_minutes|time_remaining_minutes|creator_id                                           |creator_name|assignee_id                                          |assignee_name|severity|component|lead_time_minutes|original_project|original_type|x_int         |x_time             |x_varchar|x_float|labels              |
+|-----------------------------|-----------------------|-----------------------|---------------------|-------------------------|------------|----------------|--------------------------------------------------------------------|--------|---------|-------------|---------------------------------|--------|-----|------|---------------|-----------|---------------|-----------------------------|-----------------------------|---------------|--------|-------------------------|------------------|----------------------|-----------------------------------------------------|------------|-----------------------------------------------------|-------------|--------|---------|-----------------|----------------|-------------|--------------|-------------------|---------|-------|--------------------|
+|bitbucket:BitbucketIssue:1:1 |2022-09-15 15:27:56.395|2022-09-15 15:27:56.395|board789             |_raw_bitbucket_api_issues|60          |                |https://api.bitbucket.org/2.0/repositories/thenicetgp/lake/issues/1 |        |1        |issue test   |bitbucket issues test for devlake|        |issue|TODO  |new            |0          |NULL           |2022-07-17 07:15:55.959+00:00|2022-07-17 09:11:42.656+00:00|               |major   |0                        |0                 |0                     |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp         |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp          |        |         |NULL             |NULL            |NULL         |10            |2022-09-15 15:27:56|world    |8      |NULL                |
+|bitbucket:BitbucketIssue:1:10|2022-09-15 15:27:56.395|2022-09-15 15:27:56.395|board789             |_raw_bitbucket_api_issues|52          |                |https://api.bitbucket.org/2.0/repositories/thenicetgp/lake/issues/10|        |10       |issue test007|issue test007                    |        |issue|TODO  |new            |0          |NULL           |2022-08-12 13:43:00.783+00:00|2022-08-12 13:43:00.783+00:00|               |trivial |0                        |0                 |0                     |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp         |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp          |        |         |NULL             |NULL            |NULL         |30            |2022-09-15 15:27:56|abc      |2456790|hello worlds        |
+|bitbucket:BitbucketIssue:1:13|2022-09-15 15:27:56.395|2022-09-15 15:27:56.395|board789             |_raw_bitbucket_api_issues|50          |                |https://api.bitbucket.org/2.0/repositories/thenicetgp/lake/issues/13|        |13       |issue test010|issue test010                    |        |issue|TODO  |new            |0          |NULL           |2022-08-12 13:44:46.508+00:00|2022-08-12 13:44:46.508+00:00|               |critical|0                        |0                 |0                     |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp         |                                                     |             |        |         |NULL             |NULL            |NULL         |1             |2022-09-15 15:27:56|NULL     |0.00014|NULL                |
+|bitbucket:BitbucketIssue:1:14|2022-09-15 15:27:56.395|2022-09-15 15:27:56.395|board789             |_raw_bitbucket_api_issues|49          |                |https://api.bitbucket.org/2.0/repositories/thenicetgp/lake/issues/14|        |14       |issue test011|issue test011                    |        |issue|TODO  |new            |0          |NULL           |2022-08-12 13:45:12.810+00:00|2022-08-12 13:45:12.810+00:00|               |blocker |0                        |0                 |0                     |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp         |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp          |        |         |NULL             |NULL            |NULL         |41534568464351|2022-09-15 15:27:56|NULL     |NULL   |label1,label2,label3|
+
+
+### Upload `issue_commitss.csv` file
+
+> POST /plugins/customize/csvfiles/issue_commits.csv
+
+The `Content-Type` should be  `multipart/form-data`, and the form should have three fields:
+
+- `file` the CSV file

Review Comment:
   These are not the three fields to be imported.



##########
docs/Plugins/customize.md:
##########
@@ -11,10 +11,17 @@ description: >
 This plugin provides users the ability to create/delete columns and extract data from certain raw layer tables.
 The columns created with this plugin must be start with the prefix `x_`

Review Comment:
   The names of columns added via this plugin must start with the prefix `x_`



##########
docs/Plugins/customize.md:
##########
@@ -75,25 +82,129 @@ curl 'http://localhost:8080/pipelines' \
 }
 '
 ```
-Get all extension columns(start with `x_`) of the table `issues`
+
+### List Columns
+Get all columns of the table `issues`
 > GET /plugins/customize/issues/fields
 
+**NOTE** some fields are omitted in the following example
 response
 ```json
 [
-    {
-        "columnName": "x_test",
-        "columnType": "VARCHAR(255)"
-    }
+  {
+    "columnName": "id",
+    "displayName": "",
+    "dataType": "varchar(255)",
+    "description": ""
+  },
+  {
+    "columnName": "created_at",
+    "displayName": "",
+    "dataType": "datetime(3)",
+    "description": ""
+  },
+  {
+    "columnName": "x_time",
+    "displayName": "time",
+    "dataType": "timestamp",
+    "description": "test for time"
+  },
+  {
+    "columnName": "x_int",
+    "displayName": "bigint",
+    "dataType": "bigint",
+    "description": "test for int"
+  },
+  {
+    "columnName": "x_float",
+    "displayName": "float",
+    "dataType": "float",
+    "description": "test for float"
+  },
+  {
+    "columnName": "x_text",
+    "displayName": "text",
+    "dataType": "text",
+    "description": "test for text"
+  },
+  {
+    "columnName": "x_varchar",
+    "displayName": "varchar",
+    "dataType": "varchar(255)",
+    "description": "test for varchar"
+  }
 ]
+
 ```
-Create extension column `x_test` for the table `issues`
+
+### Create A Customized Column
+Create a new column `x_abc` with datatype `varchar(255)` for the table `issues`.
+The value of `columnName` must start with `x_` and consist of no more than 50 alphanumerics and underscores
+The value of field `dataType` must be one of the following 5 types:
+- varchar(255)
+- text
+- bigint
+- float
+- timestamp 
 
 > POST /plugins/customize/issues/fields
 ```json
 {
-    "name": "x_test"
+  "columnName": "x_abc",
+  "displayName": "ABC",
+  "dataType": "varchar(255)",
+  "description": "test field"
 }
 ```
+
+### Drop A Column
 Drop the column `x_text` for the table `issues`
+
 > DELETE /plugins/customize/issues/fields/x_test
+
+### Upload `issues.csv` file
+
+> POST /plugins/customize/csvfiles/issues.csv
+
+The HTTP `Content-Type` must be  `multipart/form-data`, and the form should have three fields:
+
+- `file` the CSV file

Review Comment:
   `file`: The CSV file
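   
   (A minimal curl sketch of the upload described in this section, assuming the API is reachable at the `http://localhost:8080` address used elsewhere in customize.md and that a local `issues.csv` file exists; the `boardId`/`boardName` values are illustrative:)
   
   ```shell
   # Upload issues.csv as multipart/form-data; curl sets the multipart boundary automatically
   curl 'http://localhost:8080/plugins/customize/csvfiles/issues.csv' \
     -X POST \
     -F 'file=@issues.csv' \
     -F 'boardId=board789' \
     -F 'boardName=demo board'
   ```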



##########
docs/Plugins/customize.md:
##########
@@ -75,25 +82,129 @@ curl 'http://localhost:8080/pipelines' \
 }
 '
 ```
-Get all extension columns(start with `x_`) of the table `issues`
+
+### List Columns
+Get all columns of the table `issues`
 > GET /plugins/customize/issues/fields
 
+**NOTE** some fields are omitted in the following example
 response
 ```json
 [
-    {
-        "columnName": "x_test",
-        "columnType": "VARCHAR(255)"
-    }
+  {
+    "columnName": "id",
+    "displayName": "",
+    "dataType": "varchar(255)",
+    "description": ""
+  },
+  {
+    "columnName": "created_at",
+    "displayName": "",
+    "dataType": "datetime(3)",
+    "description": ""
+  },
+  {
+    "columnName": "x_time",
+    "displayName": "time",
+    "dataType": "timestamp",
+    "description": "test for time"
+  },
+  {
+    "columnName": "x_int",
+    "displayName": "bigint",
+    "dataType": "bigint",
+    "description": "test for int"
+  },
+  {
+    "columnName": "x_float",
+    "displayName": "float",
+    "dataType": "float",
+    "description": "test for float"
+  },
+  {
+    "columnName": "x_text",
+    "displayName": "text",
+    "dataType": "text",
+    "description": "test for text"
+  },
+  {
+    "columnName": "x_varchar",
+    "displayName": "varchar",
+    "dataType": "varchar(255)",
+    "description": "test for varchar"
+  }
 ]
+
 ```
-Create extension column `x_test` for the table `issues`
+
+### Create A Customized Column
+Create a new column `x_abc` with datatype `varchar(255)` for the table `issues`.
+The value of `columnName` must start with `x_` and consist of no more than 50 alphanumerics and underscores
+The value of field `dataType` must be one of the following 5 types:
+- varchar(255)
+- text
+- bigint
+- float
+- timestamp 
 
 > POST /plugins/customize/issues/fields
 ```json
 {
-    "name": "x_test"
+  "columnName": "x_abc",
+  "displayName": "ABC",
+  "dataType": "varchar(255)",
+  "description": "test field"
 }
 ```
+
+### Drop A Column
 Drop the column `x_text` for the table `issues`
+
 > DELETE /plugins/customize/issues/fields/x_test
+
+### Upload `issues.csv` file
+
+> POST /plugins/customize/csvfiles/issues.csv
+
+The HTTP `Content-Type` must be  `multipart/form-data`, and the form should have three fields:
+
+- `file` the CSV file
+- `boardId` It will be written to the `id` field of the `boards` table, the `board_id` field of `board_issues`, and the `_raw_data_params` field of `issues`
+- `boardName` It will be written to the `name` field of the `boards` table
+
+Upload a CSV file and import it to the `issues` table via this API. There should be no extra fields in the file except the `labels` field, 
+and if the field value is `NULL`, it should be `NULL` in the CSV instead of the empty string.
+DevLake will parse the CSV file and store it in the `issues` table, where the `labels` are stored in the `issue_labels` table. 
+If the `boardId` does not appear, a new record will be created in the boards table. The `board_issues` table will be updated at the same time as the import.
+The following is an issues.CSV file sample:
+
+|id                           |created_at             |updated_at             |_raw_data_params     |_raw_data_table          |_raw_data_id|_raw_data_remark|url                                                                 |icon_url|issue_key|title        |description                      |epic_key|type |status|original_status|story_point|resolution_date|created_date                 |updated_date                 |parent_issue_id|priority|original_estimate_minutes|time_spent_minutes|time_remaining_minutes|creator_id                                           |creator_name|assignee_id                                          |assignee_name|severity|component|lead_time_minutes|original_project|original_type|x_int         |x_time             |x_varchar|x_float|labels              |

Review Comment:
   I reckon that `created_at` and `updated_at` are system fields in MySQL; should they not be defined by users?
   ![image](https://user-images.githubusercontent.com/14050754/222691383-2db56b65-f57f-4b18-81ff-a751b49f566f.png)
   



##########
docs/Plugins/customize.md:
##########
@@ -75,25 +82,129 @@ curl 'http://localhost:8080/pipelines' \
 }
 '
 ```
-Get all extension columns(start with `x_`) of the table `issues`
+
+### List Columns
+Get all columns of the table `issues`
 > GET /plugins/customize/issues/fields
 
+**NOTE** some fields are omitted in the following example
 response
 ```json
 [
-    {
-        "columnName": "x_test",
-        "columnType": "VARCHAR(255)"
-    }
+  {
+    "columnName": "id",
+    "displayName": "",
+    "dataType": "varchar(255)",
+    "description": ""
+  },
+  {
+    "columnName": "created_at",
+    "displayName": "",
+    "dataType": "datetime(3)",
+    "description": ""
+  },
+  {
+    "columnName": "x_time",
+    "displayName": "time",
+    "dataType": "timestamp",
+    "description": "test for time"
+  },
+  {
+    "columnName": "x_int",
+    "displayName": "bigint",
+    "dataType": "bigint",
+    "description": "test for int"
+  },
+  {
+    "columnName": "x_float",
+    "displayName": "float",
+    "dataType": "float",
+    "description": "test for float"
+  },
+  {
+    "columnName": "x_text",
+    "displayName": "text",
+    "dataType": "text",
+    "description": "test for text"
+  },
+  {
+    "columnName": "x_varchar",
+    "displayName": "varchar",
+    "dataType": "varchar(255)",
+    "description": "test for varchar"
+  }
 ]
+
 ```
-Create extension column `x_test` for the table `issues`
+
+### Create A Customized Column
+Create a new column `x_abc` with datatype `varchar(255)` for the table `issues`.
+The value of `columnName` must start with `x_` and consist of no more than 50 alphanumerics and underscores
+The value of field `dataType` must be one of the following 5 types:
+- varchar(255)
+- text
+- bigint
+- float
+- timestamp 
 
 > POST /plugins/customize/issues/fields
 ```json
 {
-    "name": "x_test"
+  "columnName": "x_abc",
+  "displayName": "ABC",
+  "dataType": "varchar(255)",
+  "description": "test field"
 }
 ```
+
+### Drop A Column
 Drop the column `x_text` for the table `issues`
+
 > DELETE /plugins/customize/issues/fields/x_test
+
+### Upload `issues.csv` file
+
+> POST /plugins/customize/csvfiles/issues.csv
+
+The HTTP `Content-Type` must be  `multipart/form-data`, and the form should have three fields:
+
+- `file` the CSV file
+- `boardId` It will be written to the `id` field of the `boards` table, the `board_id` field of `board_issues`, and the `_raw_data_params` field of `issues`
+- `boardName` It will be written to the `name` field of the `boards` table
+
+Upload a CSV file and import it to the `issues` table via this API. There should be no extra fields in the file except the `labels` field, 
+and if the field value is `NULL`, it should be `NULL` in the CSV instead of the empty string.
+DevLake will parse the CSV file and store it in the `issues` table, where the `labels` are stored in the `issue_labels` table. 
+If the `boardId` does not appear, a new record will be created in the boards table. The `board_issues` table will be updated at the same time as the import.
+The following is an issues.CSV file sample:
+
+|id                           |created_at             |updated_at             |_raw_data_params     |_raw_data_table          |_raw_data_id|_raw_data_remark|url                                                                 |icon_url|issue_key|title        |description                      |epic_key|type |status|original_status|story_point|resolution_date|created_date                 |updated_date                 |parent_issue_id|priority|original_estimate_minutes|time_spent_minutes|time_remaining_minutes|creator_id                                           |creator_name|assignee_id                                          |assignee_name|severity|component|lead_time_minutes|original_project|original_type|x_int         |x_time             |x_varchar|x_float|labels              |
+|-----------------------------|-----------------------|-----------------------|---------------------|-------------------------|------------|----------------|--------------------------------------------------------------------|--------|---------|-------------|---------------------------------|--------|-----|------|---------------|-----------|---------------|-----------------------------|-----------------------------|---------------|--------|-------------------------|------------------|----------------------|-----------------------------------------------------|------------|-----------------------------------------------------|-------------|--------|---------|-----------------|----------------|-------------|--------------|-------------------|---------|-------|--------------------|
+|bitbucket:BitbucketIssue:1:1 |2022-09-15 15:27:56.395|2022-09-15 15:27:56.395|board789             |_raw_bitbucket_api_issues|60          |                |https://api.bitbucket.org/2.0/repositories/thenicetgp/lake/issues/1 |        |1        |issue test   |bitbucket issues test for devlake|        |issue|TODO  |new            |0          |NULL           |2022-07-17 07:15:55.959+00:00|2022-07-17 09:11:42.656+00:00|               |major   |0                        |0                 |0                     |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp         |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp          |        |         |NULL             |NULL            |NULL         |10            |2022-09-15 15:27:56|world    |8      |NULL                |
+|bitbucket:BitbucketIssue:1:10|2022-09-15 15:27:56.395|2022-09-15 15:27:56.395|board789             |_raw_bitbucket_api_issues|52          |                |https://api.bitbucket.org/2.0/repositories/thenicetgp/lake/issues/10|        |10       |issue test007|issue test007                    |        |issue|TODO  |new            |0          |NULL           |2022-08-12 13:43:00.783+00:00|2022-08-12 13:43:00.783+00:00|               |trivial |0                        |0                 |0                     |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp         |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp          |        |         |NULL             |NULL            |NULL         |30            |2022-09-15 15:27:56|abc      |2456790|hello worlds        |
+|bitbucket:BitbucketIssue:1:13|2022-09-15 15:27:56.395|2022-09-15 15:27:56.395|board789             |_raw_bitbucket_api_issues|50          |                |https://api.bitbucket.org/2.0/repositories/thenicetgp/lake/issues/13|        |13       |issue test010|issue test010                    |        |issue|TODO  |new            |0          |NULL           |2022-08-12 13:44:46.508+00:00|2022-08-12 13:44:46.508+00:00|               |critical|0                        |0                 |0                     |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp         |                                                     |             |        |         |NULL             |NULL            |NULL         |1             |2022-09-15 15:27:56|NULL     |0.00014|NULL                |
+|bitbucket:BitbucketIssue:1:14|2022-09-15 15:27:56.395|2022-09-15 15:27:56.395|board789             |_raw_bitbucket_api_issues|49          |                |https://api.bitbucket.org/2.0/repositories/thenicetgp/lake/issues/14|        |14       |issue test011|issue test011                    |        |issue|TODO  |new            |0          |NULL           |2022-08-12 13:45:12.810+00:00|2022-08-12 13:45:12.810+00:00|               |blocker |0                        |0                 |0                     |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp         |bitbucket:BitbucketAccount:1:62abf394192edb006fa0e8cf|tgp          |        |         |NULL             |NULL            |NULL         |41534568464351|2022-09-15 15:27:56|NULL     |NULL   |label1,label2,label3|
+
+
+### Upload `issue_commitss.csv` file

Review Comment:
   Typo: issue_commits.csv



##########
docs/Plugins/customize.md:
##########
@@ -75,25 +82,129 @@ curl 'http://localhost:8080/pipelines' \
 }
 '
 ```
-Get all extension columns(start with `x_`) of the table `issues`
+
+### List Columns
+Get all columns of the table `issues`
 > GET /plugins/customize/issues/fields
 
+**NOTE** some fields are omitted in the following example
 response
 ```json
 [
-    {
-        "columnName": "x_test",
-        "columnType": "VARCHAR(255)"
-    }
+  {
+    "columnName": "id",
+    "displayName": "",
+    "dataType": "varchar(255)",
+    "description": ""
+  },
+  {
+    "columnName": "created_at",
+    "displayName": "",
+    "dataType": "datetime(3)",
+    "description": ""
+  },
+  {
+    "columnName": "x_time",
+    "displayName": "time",
+    "dataType": "timestamp",
+    "description": "test for time"
+  },
+  {
+    "columnName": "x_int",
+    "displayName": "bigint",
+    "dataType": "bigint",
+    "description": "test for int"
+  },
+  {
+    "columnName": "x_float",
+    "displayName": "float",
+    "dataType": "float",
+    "description": "test for float"
+  },
+  {
+    "columnName": "x_text",
+    "displayName": "text",
+    "dataType": "text",
+    "description": "test for text"
+  },
+  {
+    "columnName": "x_varchar",
+    "displayName": "varchar",
+    "dataType": "varchar(255)",
+    "description": "test for varchar"
+  }
 ]
+
 ```
-Create extension column `x_test` for the table `issues`
+
+### Create A Customized Column

Review Comment:
   “Create a Customized Column”. Use lowercase for articles like a, an, the.
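   
   (For completeness while reviewing this section and the "Drop A Column" section next to it, a minimal curl sketch of the delete call, again assuming the `http://localhost:8080` address used elsewhere in customize.md:)
   
   ```shell
   # Drop the customized column `x_test` from the `issues` table
   curl 'http://localhost:8080/plugins/customize/issues/fields/x_test' -X DELETE
   ```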


