Posted to commits@iotdb.apache.org by jf...@apache.org on 2021/03/26 09:16:57 UTC

[iotdb] branch feature/restrucutre-python-module updated (5f058f1 -> c35fb87)

This is an automated email from the ASF dual-hosted git repository.

jfeinauer pushed a change to branch feature/restrucutre-python-module
in repository https://gitbox.apache.org/repos/asf/iotdb.git.


 discard 5f058f1  [CLIENT-PY] Changed directory layout to be more pythonic.
     add c351b32  fix add "-" in thread name (#2884)
     add ec70703  [IOTDB-1214] Add Flink-IoTDB documents to the website (#2813)
     add b214511  [IOTDB-1251] optimize route intervals (#2886)
     add 155df75  [IOTDB-1207]Open time partition causes stackoverflow in Cluster Module (#2809)
     add 77cdc38  [IOTDB-1240]support show node command in cluster version (#2866)
     add ee4f073  Fix_IoTDBNewTsFileCompactionIT_problem (#2890)
     add 3781cc2  fix spelling mistake (#2889)
     add 64c11bc  [IOTDB-1255] refactor cluster package and jar name (#2892)
     add 5455d3b  [IOTDB-1252] optimize test mult-times perform (#2887)
     add 4f4ee90  Update cluster setup example after changing artifactId in cluster module  (#2899)
     add 85604b4  [IOTDB-1250] fix pull schema bug that the pullTimeSeriesSchema and  pullMeasurementSchema is misused(#2894)
     add 8fc8de3  [IOTDB-854] Limit the memory foorprint of the committed log cache (#2879)
     add ec5b756  fix Flink TsFile doc url error caused by PR2813 (#2893)
     add 9b22439  IOTDB-1241 support redirect query for cluster (#2867)
     add 3bdde72  Open IoTDBRemovePartitionIT ci log for diagnosis; fix System.out is closed in IoTDBCheckConfigIT (#2812)
     add 4377d18  [IOTDB-1256] upgrade Jackson to 2.11.0 because of loopholes CVE-2020-25649 (#2896)
     add 952a180  Add the method size() to Row in UDF (#2897)
     add 910b184  [IOTDB-1117][Distributed]Batched creation and fetch of RemoteSeriesReader(#2875)
     add be1d6a6  Fix alter context for max_degree_of_index_node default-value. (#2910)
     add 1c66316  fix Chinese Flink TsFile doc name
     add e67b770  [IOTDB-1236] Improve jdbc performance for creating timeseries in cluster module (#2872)
     add c5be711  Improve Python Client (#2898)
     add 876a576  Fix Add Operation bug introduced by #2872 (#2918)
     add 1fd2e19  [IOTDB-1248] forward the pull schema request to the leader (#2902)
     new 899a2ec  [CLIENT-PY] Changed directory layout to be more pythonic.
     new 5c9e579  Added first test.
     add 8a7aa94  [CLIENT-PY] Added support for pandas dataframe to make data science easier.
     new a2b104e  Merge branch 'feature/client-py-pandas' into feature/restrucutre-python-module
     new c35fb87  Added automatic test. Added flake8 and config. Added black and config.

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (5f058f1)
            \
             N -- N -- N   refs/heads/feature/restrucutre-python-module (c35fb87)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 4 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .gitignore                                         |   2 +-
 client-py/.flake8                                  |  16 +
 client-py/README.md                                |  56 +++-
 client-py/SessionExample.py                        | 134 +++++---
 client-py/SessionTest.py                           | 207 +++++++++----
 client-py/iotdb/Session.py                         | 343 ++++++++++++++++-----
 client-py/iotdb/TestContainer.py                   |  51 +++
 client-py/iotdb/__init__.py                        |   1 -
 client-py/iotdb/utils/Field.py                     |  10 +-
 client-py/iotdb/utils/IoTDBRpcDataSet.py           |  90 ++++--
 client-py/iotdb/utils/RowRecord.py                 |   5 +-
 client-py/iotdb/utils/SessionDataSet.py            | 109 +++++--
 client-py/iotdb/utils/Tablet.py                    |  16 +-
 client-py/iotdb/utils/__init__.py                  |   1 -
 client-py/pyproject.toml                           |  27 ++
 client-py/requirements.txt                         |   5 +-
 client-py/requirements_dev.txt                     |   7 +-
 client-py/setup.py                                 |  17 +-
 client-py/tests/__init__.py                        |   0
 client-py/tests/test_dataframe.py                  |  26 +-
 cluster/pom.xml                                    |   2 +-
 .../resources/conf/iotdb-cluster.properties        |   7 +
 .../cluster/client/sync/SyncClientAdaptor.java     |  17 +
 .../apache/iotdb/cluster/config/ClusterConfig.java |  21 ++
 .../iotdb/cluster/config/ClusterDescriptor.java    |  11 +
 .../java/org/apache/iotdb/cluster/log/Log.java     |  10 +
 .../apache/iotdb/cluster/log/LogDispatcher.java    |   8 +-
 .../cluster/log/manage/CommittedEntryManager.java  |  34 ++
 .../iotdb/cluster/log/manage/RaftLogManager.java   |  37 ++-
 .../apache/iotdb/cluster/metadata/CMManager.java   |  28 +-
 .../apache/iotdb/cluster/metadata/MetaPuller.java  |   3 +-
 .../cluster/query/ClusterDataQueryExecutor.java    | 213 ++++++++++++-
 .../iotdb/cluster/query/ClusterPlanExecutor.java   |  86 ++++++
 .../iotdb/cluster/query/LocalQueryExecutor.java    | 143 +++++++++
 .../query/aggregate/ClusterAggregateExecutor.java  |   2 +-
 .../groupby/ClusterGroupByVFilterDataSet.java      |   2 +-
 .../cluster/query/reader/ClusterReaderFactory.java | 321 ++++++++++++++++++-
 .../cluster/query/reader/ClusterTimeGenerator.java | 145 ++++++++-
 .../query/reader/mult/AbstractMultPointReader.java |  70 +++++
 .../AssignPathManagedMergeReader.java}             |  12 +-
 .../reader/mult/AssignPathPriorityMergeReader.java |  66 ++++
 .../query/reader/mult/IMultBatchReader.java        |  11 +-
 .../cluster/query/reader/mult/MultBatchReader.java |  73 +++++
 .../MultDataSourceInfo.java}                       | 114 +++----
 .../cluster/query/reader/mult/MultEmptyReader.java |  30 +-
 .../reader/mult/MultSeriesRawDataPointReader.java  |  55 ++++
 .../query/reader/mult/RemoteMultSeriesReader.java  | 222 +++++++++++++
 .../apache/iotdb/cluster/server/ClientServer.java  |   2 +-
 .../iotdb/cluster/server/DataClusterServer.java    |  50 +++
 ...dler.java => GetChildNodeNextLevelHandler.java} |   9 +-
 .../cluster/server/member/MetaGroupMember.java     |  53 ++--
 .../iotdb/cluster/server/member/RaftMember.java    |  11 +-
 .../cluster/server/service/DataAsyncService.java   | 119 +++++--
 .../cluster/server/service/DataSyncService.java    | 133 +++++---
 .../cluster/client/sync/SyncClientAdaptorTest.java |   9 +
 .../iotdb/cluster/common/TestAsyncDataClient.java  |  39 ++-
 .../org/apache/iotdb/cluster/common/TestUtils.java |   1 +
 .../cluster/log/manage/RaftLogManagerTest.java     |  35 +++
 .../apache/iotdb/cluster/query/BaseQueryTest.java  |   4 +-
 .../query/ClusterDataQueryExecutorTest.java        |  83 +++++
 .../query/groupby/RemoteGroupByExecutorTest.java   |   4 +-
 .../query/reader/ClusterTimeGeneratorTest.java     |   2 +-
 .../mult/AssignPathManagedMergeReaderTest.java     | 190 ++++++++++++
 .../mult/MultSeriesRawDataPointReaderTest.java     |  67 ++++
 .../reader/mult/RemoteMultSeriesReaderTest.java    | 286 +++++++++++++++++
 .../member/{MemberTest.java => BaseMember.java}    | 210 +------------
 .../cluster/server/member/DataGroupMemberTest.java |  66 +++-
 .../cluster/server/member/MetaGroupMemberTest.java |  35 ++-
 .../cluster/server/member/RaftMemberTest.java      | 244 +++++++++++++++
 docs/SystemDesign/TsFile/Format.md                 |   2 +-
 .../UserGuide/Ecosystem Integration/Flink IoTDB.md |  67 ++--
 .../Ecosystem Integration/Flink TsFile.md          |  17 +-
 docs/UserGuide/Server/Cluster Setup.md             |   6 +-
 docs/zh/SystemDesign/StorageEngine/FileLists.md    |   2 +-
 docs/zh/SystemDesign/TsFile/Format.md              |   2 +-
 .../UserGuide/Ecosystem Integration/Flink IoTDB.md | 122 ++++++++
 .../Ecosystem Integration/Flink TsFile.md          |  38 +--
 docs/zh/UserGuide/Server/Cluster Setup.md          |   6 +-
 .../main/java/org/apache/iotdb/SessionExample.java | 102 +++++-
 pom.xml                                            |   2 +-
 .../apache/iotdb/db/cost/statistic/Operation.java  |   1 +
 .../org/apache/iotdb/db/engine/StorageEngine.java  |   4 +-
 .../engine/compaction/utils/CompactionUtils.java   |   2 +-
 .../iotdb/db/qp/physical/crud/QueryPlan.java       |  10 +
 .../db/query/dataset/AlignByDeviceDataSet.java     |  11 +
 .../dataset/RawQueryDataSetWithoutValueFilter.java |  11 +
 .../db/query/executor/RawDataQueryExecutor.java    |  25 ++
 .../universal/CachedPriorityMergeReader.java       |   3 +-
 .../reader/universal/DescPriorityMergeReader.java  |   4 +-
 .../iotdb/db/query/reader/universal/Element.java   |  72 +++++
 .../reader/universal/PriorityMergeReader.java      |  39 +--
 .../apache/iotdb/db/query/udf/api/access/Row.java  |   7 +
 .../iotdb/db/query/udf/core/access/RowImpl.java    |   5 +
 .../org/apache/iotdb/db/service/TSServiceImpl.java | 202 +++++++++++-
 .../iotdb/db/integration/IoTDBCheckConfigIT.java   |   1 +
 .../iotdb/db/integration/IoTDBExecuteBatchIT.java  | 118 +++++++
 .../db/integration/IoTDBNewTsFileCompactionIT.java |   2 +-
 .../db/integration/IoTDBRemovePartitionIT.java     |  10 +
 server/src/test/resources/logback.xml              |   8 +-
 .../org/apache/iotdb/rpc/RedirectException.java    |   3 +-
 .../java/org/apache/iotdb/session/Session.java     | 103 ++++++-
 .../apache/iotdb/session/SessionConnection.java    |  32 +-
 site/src/main/.vuepress/config.js                  |   8 +-
 thrift/src/main/thrift/cluster.thrift              |  32 ++
 thrift/src/main/thrift/rpc.thrift                  |   3 +
 .../tsfile/read/query/dataset/QueryDataSet.java    |  48 +++
 .../apache/iotdb/tsfile/write/schema/Schema.java   |   2 +-
 107 files changed, 4798 insertions(+), 852 deletions(-)
 create mode 100644 client-py/.flake8
 create mode 100644 client-py/iotdb/TestContainer.py
 create mode 100644 client-py/pyproject.toml
 create mode 100644 client-py/tests/__init__.py
 create mode 100644 cluster/src/main/java/org/apache/iotdb/cluster/query/reader/mult/AbstractMultPointReader.java
 copy cluster/src/main/java/org/apache/iotdb/cluster/query/reader/{ManagedMergeReader.java => mult/AssignPathManagedMergeReader.java} (88%)
 create mode 100644 cluster/src/main/java/org/apache/iotdb/cluster/query/reader/mult/AssignPathPriorityMergeReader.java
 copy tsfile/src/main/java/org/apache/iotdb/tsfile/read/reader/IBatchReader.java => cluster/src/main/java/org/apache/iotdb/cluster/query/reader/mult/IMultBatchReader.java (75%)
 create mode 100644 cluster/src/main/java/org/apache/iotdb/cluster/query/reader/mult/MultBatchReader.java
 copy cluster/src/main/java/org/apache/iotdb/cluster/query/reader/{DataSourceInfo.java => mult/MultDataSourceInfo.java} (70%)
 copy server/src/main/java/org/apache/iotdb/db/query/externalsort/serialize/IExternalSortFileDeserializer.java => cluster/src/main/java/org/apache/iotdb/cluster/query/reader/mult/MultEmptyReader.java (60%)
 create mode 100644 cluster/src/main/java/org/apache/iotdb/cluster/query/reader/mult/MultSeriesRawDataPointReader.java
 create mode 100644 cluster/src/main/java/org/apache/iotdb/cluster/query/reader/mult/RemoteMultSeriesReader.java
 copy cluster/src/main/java/org/apache/iotdb/cluster/server/handlers/caller/{GetChildNodeNextLevelPathHandler.java => GetChildNodeNextLevelHandler.java} (80%)
 create mode 100644 cluster/src/test/java/org/apache/iotdb/cluster/query/reader/mult/AssignPathManagedMergeReaderTest.java
 create mode 100644 cluster/src/test/java/org/apache/iotdb/cluster/query/reader/mult/MultSeriesRawDataPointReaderTest.java
 create mode 100644 cluster/src/test/java/org/apache/iotdb/cluster/query/reader/mult/RemoteMultSeriesReaderTest.java
 rename cluster/src/test/java/org/apache/iotdb/cluster/server/member/{MemberTest.java => BaseMember.java} (59%)
 create mode 100644 cluster/src/test/java/org/apache/iotdb/cluster/server/member/RaftMemberTest.java
 copy example/flink/src/main/java/org/apache/iotdb/flink/FlinkIoTDBSink.java => docs/UserGuide/Ecosystem Integration/Flink IoTDB.md (59%)
 copy flink-tsfile-connector/README.md => docs/UserGuide/Ecosystem Integration/Flink TsFile.md (97%)
 create mode 100644 docs/zh/UserGuide/Ecosystem Integration/Flink IoTDB.md
 copy flink-tsfile-connector/README.md => docs/zh/UserGuide/Ecosystem Integration/Flink TsFile.md (86%)
 create mode 100644 server/src/main/java/org/apache/iotdb/db/query/reader/universal/Element.java
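
For orientation: after this branch update the Python client modules live directly under `client-py/iotdb` (no `src/` layer) and a query result can be turned into a pandas DataFrame via `todf()`. Below is a minimal sketch, not part of the commits themselves, of the imports and the dataframe call this layout exposes; it assumes `client-py` is on `PYTHONPATH` (or the package is installed) and an IoTDB server is reachable at the placeholder address.

```python
# Imports as exposed by the restructured client-py layout (client-py/iotdb/...).
from iotdb.Session import Session
from iotdb.utils.IoTDBConstants import TSDataType, TSEncoding, Compressor
from iotdb.utils.Tablet import Tablet

session = Session("127.0.0.1", "6667", "root", "root")  # placeholder connection details
session.open(False)
result = session.execute_query_statement("SELECT * FROM root.*")
df = result.todf()  # SessionDataSet -> pandas DataFrame (added by the pandas support commit)
session.close()
```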

[iotdb] 01/04: [CLIENT-PY] Changed directory layout to be more pythonic.

Posted by jf...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

jfeinauer pushed a commit to branch feature/restrucutre-python-module
in repository https://gitbox.apache.org/repos/asf/iotdb.git

commit 899a2ecc926a603d5232f0b04949e63249ce9893
Author: Julian Feinauer <j....@pragmaticminds.de>
AuthorDate: Fri Mar 26 09:34:06 2021 +0100

    [CLIENT-PY] Changed directory layout to be more pythonic.
---
 client-py/.gitignore                               |   1 +
 client-py/LICENSE                                  | 236 +++++++++++++++++++++
 client-py/{pypi => }/README.md                     |  90 +++++++-
 client-py/{src => }/SessionExample.py              |   0
 client-py/{src => }/SessionTest.py                 |   0
 client-py/{src => }/iotdb/Session.py               |   0
 client-py/{src/iotdb/utils => iotdb}/__init__.py   |   0
 client-py/{src => }/iotdb/utils/Field.py           |   0
 client-py/{src => }/iotdb/utils/IoTDBConstants.py  |   0
 client-py/{src => }/iotdb/utils/IoTDBRpcDataSet.py |   0
 client-py/{src => }/iotdb/utils/RowRecord.py       |   0
 client-py/{src => }/iotdb/utils/SessionDataSet.py  |   0
 client-py/{src => }/iotdb/utils/Tablet.py          |   0
 client-py/{src/iotdb => iotdb/utils}/__init__.py   |   0
 client-py/pom.xml                                  |  84 +++-----
 client-py/readme.md                                |  80 -------
 client-py/requirements.txt                         |   1 +
 client-py/requirements_dev.txt                     |   1 +
 client-py/{pypi => }/setup.py                      |   1 +
 client-py/tests/test_dataframe.py                  |   5 +
 20 files changed, 361 insertions(+), 138 deletions(-)

diff --git a/client-py/.gitignore b/client-py/.gitignore
new file mode 100644
index 0000000..ff6532c
--- /dev/null
+++ b/client-py/.gitignore
@@ -0,0 +1 @@
+/iotdb/thrift/
diff --git a/client-py/LICENSE b/client-py/LICENSE
new file mode 100644
index 0000000..d84f2bd
--- /dev/null
+++ b/client-py/LICENSE
@@ -0,0 +1,236 @@
+                                 Apache License
+                           Version 2.0, January 2004
+                        http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+      "License" shall mean the terms and conditions for use, reproduction,
+      and distribution as defined by Sections 1 through 9 of this document.
+
+      "Licensor" shall mean the copyright owner or entity authorized by
+      the copyright owner that is granting the License.
+
+      "Legal Entity" shall mean the union of the acting entity and all
+      other entities that control, are controlled by, or are under common
+      control with that entity. For the purposes of this definition,
+      "control" means (i) the power, direct or indirect, to cause the
+      direction or management of such entity, whether by contract or
+      otherwise, or (ii) ownership of fifty percent (50%) or more of the
+      outstanding shares, or (iii) beneficial ownership of such entity.
+
+      "You" (or "Your") shall mean an individual or Legal Entity
+      exercising permissions granted by this License.
+
+      "Source" form shall mean the preferred form for making modifications,
+      including but not limited to software source code, documentation
+      source, and configuration files.
+
+      "Object" form shall mean any form resulting from mechanical
+      transformation or translation of a Source form, including but
+      not limited to compiled object code, generated documentation,
+      and conversions to other media types.
+
+      "Work" shall mean the work of authorship, whether in Source or
+      Object form, made available under the License, as indicated by a
+      copyright notice that is included in or attached to the work
+      (an example is provided in the Appendix below).
+
+      "Derivative Works" shall mean any work, whether in Source or Object
+      form, that is based on (or derived from) the Work and for which the
+      editorial revisions, annotations, elaborations, or other modifications
+      represent, as a whole, an original work of authorship. For the purposes
+      of this License, Derivative Works shall not include works that remain
+      separable from, or merely link (or bind by name) to the interfaces of,
+      the Work and Derivative Works thereof.
+
+      "Contribution" shall mean any work of authorship, including
+      the original version of the Work and any modifications or additions
+      to that Work or Derivative Works thereof, that is intentionally
+      submitted to Licensor for inclusion in the Work by the copyright owner
+      or by an individual or Legal Entity authorized to submit on behalf of
+      the copyright owner. For the purposes of this definition, "submitted"
+      means any form of electronic, verbal, or written communication sent
+      to the Licensor or its representatives, including but not limited to
+      communication on electronic mailing lists, source code control systems,
+      and issue tracking systems that are managed by, or on behalf of, the
+      Licensor for the purpose of discussing and improving the Work, but
+      excluding communication that is conspicuously marked or otherwise
+      designated in writing by the copyright owner as "Not a Contribution."
+
+      "Contributor" shall mean Licensor and any individual or Legal Entity
+      on behalf of whom a Contribution has been received by Licensor and
+      subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      copyright license to reproduce, prepare Derivative Works of,
+      publicly display, publicly perform, sublicense, and distribute the
+      Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      (except as stated in this section) patent license to make, have made,
+      use, offer to sell, sell, import, and otherwise transfer the Work,
+      where such license applies only to those patent claims licensable
+      by such Contributor that are necessarily infringed by their
+      Contribution(s) alone or by combination of their Contribution(s)
+      with the Work to which such Contribution(s) was submitted. If You
+      institute patent litigation against any entity (including a
+      cross-claim or counterclaim in a lawsuit) alleging that the Work
+      or a Contribution incorporated within the Work constitutes direct
+      or contributory patent infringement, then any patent licenses
+      granted to You under this License for that Work shall terminate
+      as of the date such litigation is filed.
+
+   4. Redistribution. You may reproduce and distribute copies of the
+      Work or Derivative Works thereof in any medium, with or without
+      modifications, and in Source or Object form, provided that You
+      meet the following conditions:
+
+      (a) You must give any other recipients of the Work or
+          Derivative Works a copy of this License; and
+
+      (b) You must cause any modified files to carry prominent notices
+          stating that You changed the files; and
+
+      (c) You must retain, in the Source form of any Derivative Works
+          that You distribute, all copyright, patent, trademark, and
+          attribution notices from the Source form of the Work,
+          excluding those notices that do not pertain to any part of
+          the Derivative Works; and
+
+      (d) If the Work includes a "NOTICE" text file as part of its
+          distribution, then any Derivative Works that You distribute must
+          include a readable copy of the attribution notices contained
+          within such NOTICE file, excluding those notices that do not
+          pertain to any part of the Derivative Works, in at least one
+          of the following places: within a NOTICE text file distributed
+          as part of the Derivative Works; within the Source form or
+          documentation, if provided along with the Derivative Works; or,
+          within a display generated by the Derivative Works, if and
+          wherever such third-party notices normally appear. The contents
+          of the NOTICE file are for informational purposes only and
+          do not modify the License. You may add Your own attribution
+          notices within Derivative Works that You distribute, alongside
+          or as an addendum to the NOTICE text from the Work, provided
+          that such additional attribution notices cannot be construed
+          as modifying the License.
+
+      You may add Your own copyright statement to Your modifications and
+      may provide additional or different license terms and conditions
+      for use, reproduction, or distribution of Your modifications, or
+      for any such Derivative Works as a whole, provided Your use,
+      reproduction, and distribution of the Work otherwise complies with
+      the conditions stated in this License.
+
+   5. Submission of Contributions. Unless You explicitly state otherwise,
+      any Contribution intentionally submitted for inclusion in the Work
+      by You to the Licensor shall be under the terms and conditions of
+      this License, without any additional terms or conditions.
+      Notwithstanding the above, nothing herein shall supersede or modify
+      the terms of any separate license agreement you may have executed
+      with Licensor regarding such Contributions.
+
+   6. Trademarks. This License does not grant permission to use the trade
+      names, trademarks, service marks, or product names of the Licensor,
+      except as required for reasonable and customary use in describing the
+      origin of the Work and reproducing the content of the NOTICE file.
+
+   7. Disclaimer of Warranty. Unless required by applicable law or
+      agreed to in writing, Licensor provides the Work (and each
+      Contributor provides its Contributions) on an "AS IS" BASIS,
+      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+      implied, including, without limitation, any warranties or conditions
+      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+      PARTICULAR PURPOSE. You are solely responsible for determining the
+      appropriateness of using or redistributing the Work and assume any
+      risks associated with Your exercise of permissions under this License.
+
+   8. Limitation of Liability. In no event and under no legal theory,
+      whether in tort (including negligence), contract, or otherwise,
+      unless required by applicable law (such as deliberate and grossly
+      negligent acts) or agreed to in writing, shall any Contributor be
+      liable to You for damages, including any direct, indirect, special,
+      incidental, or consequential damages of any character arising as a
+      result of this License or out of the use or inability to use the
+      Work (including but not limited to damages for loss of goodwill,
+      work stoppage, computer failure or malfunction, or any and all
+      other commercial damages or losses), even if such Contributor
+      has been advised of the possibility of such damages.
+
+   9. Accepting Warranty or Additional Liability. While redistributing
+      the Work or Derivative Works thereof, You may choose to offer,
+      and charge a fee for, acceptance of support, warranty, indemnity,
+      or other liability obligations and/or rights consistent with this
+      License. However, in accepting such obligations, You may act only
+      on Your own behalf and on Your sole responsibility, not on behalf
+      of any other Contributor, and only if You agree to indemnify,
+      defend, and hold each Contributor harmless for any liability
+      incurred by, or claims asserted against, such Contributor by reason
+      of your accepting any such warranty or additional liability.
+
+   END OF TERMS AND CONDITIONS
+
+   APPENDIX: How to apply the Apache License to your work.
+
+      To apply the Apache License to your work, attach the following
+      boilerplate notice, with the fields enclosed by brackets "[]"
+      replaced with your own identifying information. (Don't include
+      the brackets!)  The text should be enclosed in the appropriate
+      comment syntax for the file format. We also recommend that a
+      file or class name and description of purpose be included on the
+      same "printed page" as the copyright notice for easier
+      identification within third-party archives.
+
+   Copyright [yyyy] [name of copyright owner]
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+--------------------------------------------------------------------------------
+                           APACHE IOTDB SUBCOMPONENTS
+--------------------------------------------------------------------------------
+
+The following class is copied from maven-wrapper (https://github.com/takari/maven-wrapper),
+which is under Apache License 2.0:
+
+./.mvn/wrapper/MavenWrapperDownloader.java
+
+--------------------------------------------------------------------------------
+
+The following class is modified from Apache commons-collections
+
+./tsfile/src/main/java/org/apache/iotdb/tsfile/utils/Murmur128Hash.java
+Relevant pr is: https://github.com/apache/commons-collections/pull/83/
+
+--------------------------------------------------------------------------------
+
+The following files include code modified from Michael Burman's gorilla-tsc project.
+
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/encoder/GorillaEncoderV2.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/encoder/IntGorillaEncoder.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/encoder/LongGorillaEncoder.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/encoder/SinglePrecisionEncoderV2.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/encoder/DoublePrecisionEncoderV2.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/decoder/GorillaDecoderV2.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/decoder/IntGorillaDecoder.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/decoder/LongGorillaDecoder.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/decoder/SinglePrecisionDecoderV2.java
+./tsfile/src/main/java/org/apache/iotdb/tsfile/encoding/decoder/DoublePrecisionDecoderV2.java
+
+Copyright: 2016-2018 Michael Burman and/or other contributors
+Project page: https://github.com/burmanm/gorilla-tsc
+License: http://www.apache.org/licenses/LICENSE-2.0
diff --git a/client-py/pypi/README.md b/client-py/README.md
similarity index 58%
rename from client-py/pypi/README.md
rename to client-py/README.md
index 257c1d8..6ae309b 100644
--- a/client-py/pypi/README.md
+++ b/client-py/README.md
@@ -89,4 +89,92 @@ class MyTestCase(unittest.TestCase):
             session.close()
 ```
 
-by default it will load the image `apache/iotdb:latest`, if you want a specific version just pass it like e.g. `IoTDBContainer("apache/iotdb:0.10.0")` to get version `0.10.0` running.
\ No newline at end of file
+by default it will load the image `apache/iotdb:latest`, if you want a specific version just pass it like e.g. `IoTDBContainer("apache/iotdb:0.10.0")` to get version `0.10.0` running.
+
+## Pandas Support
+
+To easily transform a query result to a [Pandas Dataframe](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.html)
+the SessionDataSet has a method `.todf()` which consumes the dataset and transforms it to a pandas dataframe.
+
+Example:
+
+```python
+
+from iotdb.Session import Session
+
+ip = "127.0.0.1"
+port_ = "6667"
+username_ = 'root'
+password_ = 'root'
+session = Session(ip, port_, username_, password_)
+session.open(False)
+result = session.execute_query_statement("SELECT * FROM root.*")
+
+# Transform to Pandas Dataset
+df = result.todf()
+
+session.close()
+
+# Now you can work with the dataframe
+df = ...
+```
+
+## Developers
+
+### Introduction
+
+This is an example of how to connect to IoTDB with python, using the thrift rpc interfaces. Things
+are almost the same on Windows or Linux, but pay attention to the difference like path separator.
+
+### Prerequisites
+
+python3.7 or later is preferred.
+
+You have to install Thrift (0.11.0 or later) to compile our thrift file into python code. Below is the official
+tutorial of installation, eventually, you should have a thrift executable.
+
+```
+http://thrift.apache.org/docs/install/
+```
+
+### Compile the thrift library and Debug
+
+In the root of IoTDB's source code folder,  run `mvn generate-sources -pl client-py -am`.
+
+Then a complete project will be generated at `client-py/target/pypi` folder.
+But !BE CAUTIOUS!
+All your modifications in `client-py/target/pypi` must be copied manually to `client-py/src/` folder.
+Otherwise once you run `mvn clean`, you will lose all your effort.
+
+Or, you can also copy `client-py/target/pypi/iotdb/thrift` folder to `client-py/src/thrift`, then the
+`src` folder will become also a complete python project.
+But !BE CAUTIOUS!
+Do not upload `client-py/src/thrift` to the git repo.
+
+
+### Session Client & Example
+
+We packed up the Thrift interface in `client-py/src/iotdb/Session.py` (similar with its Java counterpart), also provided
+an example file `client-py/src/SessionExample.py` of how to use the session module. please read it carefully.
+
+
+Or, another simple example:
+
+```python
+from iotdb.Session import Session
+
+ip = "127.0.0.1"
+port_ = "6667"
+username_ = 'root'
+password_ = 'root'
+session = Session(ip, port_, username_, password_)
+session.open(False)
+zone = session.get_time_zone()
+session.close()
+```
+
+### test file
+
+You can use `client-py/src/SessionTest.py` to test python session, if the test has been passed, it will return 0. Otherwise it will return 1. You can use the printed message to locate failed operations and the reason of them.
+
+Notice: you should start IoTDB server firstly and then run the test.
diff --git a/client-py/src/SessionExample.py b/client-py/SessionExample.py
similarity index 100%
rename from client-py/src/SessionExample.py
rename to client-py/SessionExample.py
diff --git a/client-py/src/SessionTest.py b/client-py/SessionTest.py
similarity index 100%
rename from client-py/src/SessionTest.py
rename to client-py/SessionTest.py
diff --git a/client-py/src/iotdb/Session.py b/client-py/iotdb/Session.py
similarity index 100%
rename from client-py/src/iotdb/Session.py
rename to client-py/iotdb/Session.py
diff --git a/client-py/src/iotdb/utils/__init__.py b/client-py/iotdb/__init__.py
similarity index 100%
rename from client-py/src/iotdb/utils/__init__.py
rename to client-py/iotdb/__init__.py
diff --git a/client-py/src/iotdb/utils/Field.py b/client-py/iotdb/utils/Field.py
similarity index 100%
rename from client-py/src/iotdb/utils/Field.py
rename to client-py/iotdb/utils/Field.py
diff --git a/client-py/src/iotdb/utils/IoTDBConstants.py b/client-py/iotdb/utils/IoTDBConstants.py
similarity index 100%
rename from client-py/src/iotdb/utils/IoTDBConstants.py
rename to client-py/iotdb/utils/IoTDBConstants.py
diff --git a/client-py/src/iotdb/utils/IoTDBRpcDataSet.py b/client-py/iotdb/utils/IoTDBRpcDataSet.py
similarity index 100%
rename from client-py/src/iotdb/utils/IoTDBRpcDataSet.py
rename to client-py/iotdb/utils/IoTDBRpcDataSet.py
diff --git a/client-py/src/iotdb/utils/RowRecord.py b/client-py/iotdb/utils/RowRecord.py
similarity index 100%
rename from client-py/src/iotdb/utils/RowRecord.py
rename to client-py/iotdb/utils/RowRecord.py
diff --git a/client-py/src/iotdb/utils/SessionDataSet.py b/client-py/iotdb/utils/SessionDataSet.py
similarity index 100%
rename from client-py/src/iotdb/utils/SessionDataSet.py
rename to client-py/iotdb/utils/SessionDataSet.py
diff --git a/client-py/src/iotdb/utils/Tablet.py b/client-py/iotdb/utils/Tablet.py
similarity index 100%
rename from client-py/src/iotdb/utils/Tablet.py
rename to client-py/iotdb/utils/Tablet.py
diff --git a/client-py/src/iotdb/__init__.py b/client-py/iotdb/utils/__init__.py
similarity index 100%
rename from client-py/src/iotdb/__init__.py
rename to client-py/iotdb/utils/__init__.py
diff --git a/client-py/pom.xml b/client-py/pom.xml
index 564d765..4fb7ab6 100644
--- a/client-py/pom.xml
+++ b/client-py/pom.xml
@@ -39,7 +39,31 @@
     </dependencies>
     <build>
         <plugins>
-            <!-- for pypi distribution -->
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-checkstyle-plugin</artifactId>
+                <configuration>
+                    <skip>true</skip>
+                </configuration>
+            </plugin>
+            <!-- clean thrift folder -->
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-clean-plugin</artifactId>
+                <version>2.4.1</version>
+                <configuration>
+                    <filesets>
+                        <fileset>
+                            <directory>iotdb</directory>
+                            <includes>
+                                <include>thrift/</include>
+                            </includes>
+                            <followSymlinks>false</followSymlinks>
+                        </fileset>
+                    </filesets>
+                </configuration>
+            </plugin>
+            <!-- fill thrift folder -->
             <plugin>
                 <groupId>org.apache.maven.plugins</groupId>
                 <artifactId>maven-resources-plugin</artifactId>
@@ -57,64 +81,10 @@
                         </goals>
                         <configuration>
                             <encoding>utf-8</encoding>
-                            <outputDirectory>${project.build.directory}/pypi/</outputDirectory>
-                            <resources>
-                                <resource>
-                                    <directory>${basedir}/../thrift/target/generated-sources-python</directory>
-                                </resource>
-                            </resources>
-                        </configuration>
-                    </execution>
-                    <execution>
-                        <id>copy-license-resources</id>
-                        <!-- here the phase you need -->
-                        <phase>generate-sources</phase>
-                        <goals>
-                            <goal>copy-resources</goal>
-                        </goals>
-                        <configuration>
-                            <encoding>utf-8</encoding>
-                            <outputDirectory>${project.build.directory}/pypi</outputDirectory>
-                            <resources>
-                                <resource>
-                                    <directory>${basedir}/../</directory>
-                                    <includes>
-                                        <include>LICENSE</include>
-                                    </includes>
-                                </resource>
-                            </resources>
-                        </configuration>
-                    </execution>
-                    <execution>
-                        <id>copy-python-file-resources</id>
-                        <!-- here the phase you need -->
-                        <phase>generate-sources</phase>
-                        <goals>
-                            <goal>copy-resources</goal>
-                        </goals>
-                        <configuration>
-                            <encoding>utf-8</encoding>
-                            <outputDirectory>${project.build.directory}/pypi/</outputDirectory>
-                            <resources>
-                                <resource>
-                                    <directory>${basedir}/src/</directory>
-                                </resource>
-                            </resources>
-                        </configuration>
-                    </execution>
-                    <execution>
-                        <id>copy-pypi-file-resources</id>
-                        <!-- here the phase you need -->
-                        <phase>generate-sources</phase>
-                        <goals>
-                            <goal>copy-resources</goal>
-                        </goals>
-                        <configuration>
-                            <encoding>utf-8</encoding>
-                            <outputDirectory>${project.build.directory}/pypi</outputDirectory>
+                            <outputDirectory>${basedir}/iotdb/thrift/</outputDirectory>
                             <resources>
                                 <resource>
-                                    <directory>${basedir}/pypi</directory>
+                                    <directory>${basedir}/../thrift/target/generated-sources-python/iotdb/thrift/</directory>
                                 </resource>
                             </resources>
                         </configuration>
diff --git a/client-py/readme.md b/client-py/readme.md
deleted file mode 100644
index 3e4223b..0000000
--- a/client-py/readme.md
+++ /dev/null
@@ -1,80 +0,0 @@
-<!--
-
-    Licensed to the Apache Software Foundation (ASF) under one
-    or more contributor license agreements.  See the NOTICE file
-    distributed with this work for additional information
-    regarding copyright ownership.  The ASF licenses this file
-    to you under the Apache License, Version 2.0 (the
-    "License"); you may not use this file except in compliance
-    with the License.  You may obtain a copy of the License at
-
-        http://www.apache.org/licenses/LICENSE-2.0
-
-    Unless required by applicable law or agreed to in writing,
-    software distributed under the License is distributed on an
-    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-    KIND, either express or implied.  See the License for the
-    specific language governing permissions and limitations
-    under the License.
-
--->
-
-# Python Client
-
-## Introduction
-
-This is an example of how to connect to IoTDB with python, using the thrift rpc interfaces. Things 
-are almost the same on Windows or Linux, but pay attention to the difference like path separator.
-
-## Prerequisites
-
-python3.7 or later is preferred.
-
-You have to install Thrift (0.11.0 or later) to compile our thrift file into python code. Below is the official
-tutorial of installation, eventually, you should have a thrift executable.
-
-```
-http://thrift.apache.org/docs/install/
-```
-
-## Compile the thrift library and Debug
-
-In the root of IoTDB's source code folder,  run `mvn generate-sources -pl client-py -am`.
-
-Then a complete project will be generated at `client-py/target/pypi` folder. 
-But !BE CAUTIOUS!
-All your modifications in `client-py/target/pypi` must be copied manually to `client-py/src/` folder.
-Otherwise once you run `mvn clean`, you will lose all your effort.
-
-Or, you can also copy `client-py/target/pypi/iotdb/thrift` folder to `client-py/src/thrift`, then the 
-`src` folder will become also a complete python project. 
-But !BE CAUTIOUS!
-Do not upload `client-py/src/thrift` to the git repo.
-
-
-## Session Client & Example
-
-We packed up the Thrift interface in `client-py/src/iotdb/Session.py` (similar with its Java counterpart), also provided 
-an example file `client-py/src/SessionExample.py` of how to use the session module. please read it carefully.
-
-
-Or, another simple example:
-
-```python
-from iotdb.Session import Session
-
-ip = "127.0.0.1"
-port_ = "6667"
-username_ = 'root'
-password_ = 'root'
-session = Session(ip, port_, username_, password_)
-session.open(False)
-zone = session.get_time_zone()
-session.close()
-```
-
-## test file
-
-You can use `client-py/src/SessionTest.py` to test python session, if the test has been passed, it will return 0. Otherwise it will return 1. You can use the printed message to locate failed operations and the reason of them.
-
-Notice: you should start IoTDB server firstly and then run the test.
diff --git a/client-py/requirements.txt b/client-py/requirements.txt
new file mode 100644
index 0000000..bbd21b4
--- /dev/null
+++ b/client-py/requirements.txt
@@ -0,0 +1 @@
+pandas==1.2.3
\ No newline at end of file
diff --git a/client-py/requirements_dev.txt b/client-py/requirements_dev.txt
new file mode 100644
index 0000000..8aa50dd
--- /dev/null
+++ b/client-py/requirements_dev.txt
@@ -0,0 +1 @@
+pytest==6.2.2
\ No newline at end of file
diff --git a/client-py/pypi/setup.py b/client-py/setup.py
similarity index 97%
rename from client-py/pypi/setup.py
rename to client-py/setup.py
index c3ec019..69d5559 100644
--- a/client-py/pypi/setup.py
+++ b/client-py/setup.py
@@ -41,6 +41,7 @@ setuptools.setup(
     packages=setuptools.find_packages(),
     install_requires=[
               'thrift>=0.13.0',
+              'pandas>=1.0.0,<1.99.99'
           ],
     classifiers=[
         "Programming Language :: Python :: 3",
diff --git a/client-py/tests/test_dataframe.py b/client-py/tests/test_dataframe.py
new file mode 100644
index 0000000..fa41274
--- /dev/null
+++ b/client-py/tests/test_dataframe.py
@@ -0,0 +1,5 @@
+def test_something():
+    assert 2 == 1+1
+
+def test_something_2():
+    assert 2 == 1+1+1
\ No newline at end of file

[iotdb] 02/04: Added first test.

Posted by jf...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

jfeinauer pushed a commit to branch feature/restrucutre-python-module
in repository https://gitbox.apache.org/repos/asf/iotdb.git

commit 5c9e579d9f90a0ed84dfa2ec3fb75ddcd30bfdb4
Author: Julian Feinauer <j....@pragmaticminds.de>
AuthorDate: Fri Mar 26 09:44:37 2021 +0100

    Added first test.
---
 client-py/iotdb/Session.py                 |  4 ++++
 client-py/{src => }/iotdb/TestContainer.py |  0
 client-py/requirements.txt                 |  5 ++++-
 client-py/requirements_dev.txt             |  5 ++++-
 client-py/tests/__init__.py                |  0
 client-py/tests/test_dataframe.py          | 23 +++++++++++++++++++----
 6 files changed, 31 insertions(+), 6 deletions(-)

diff --git a/client-py/iotdb/Session.py b/client-py/iotdb/Session.py
index e371423..e55a692 100644
--- a/client-py/iotdb/Session.py
+++ b/client-py/iotdb/Session.py
@@ -225,6 +225,10 @@ class Session(object):
 
     def insert_str_record(self, device_id, timestamp, measurements, string_values):
         """ special case for inserting one row of String (TEXT) value """
+        if type(string_values) == str:
+            string_values = [string_values]
+        if type(measurements) == str:
+            measurements = [measurements]
         data_types = [TSDataType.TEXT.value for _ in string_values]
         request = self.gen_insert_str_record_req(device_id, timestamp, measurements, data_types, string_values)
         status = self.__client.insertStringRecord(request)
diff --git a/client-py/src/iotdb/TestContainer.py b/client-py/iotdb/TestContainer.py
similarity index 100%
rename from client-py/src/iotdb/TestContainer.py
rename to client-py/iotdb/TestContainer.py
diff --git a/client-py/requirements.txt b/client-py/requirements.txt
index bbd21b4..39c5e61 100644
--- a/client-py/requirements.txt
+++ b/client-py/requirements.txt
@@ -1 +1,4 @@
-pandas==1.2.3
\ No newline at end of file
+# Pandas Export
+pandas==1.2.3
+# Testcontainer
+testcontainers==3.3.0
\ No newline at end of file
diff --git a/client-py/requirements_dev.txt b/client-py/requirements_dev.txt
index 8aa50dd..0ee3c7c 100644
--- a/client-py/requirements_dev.txt
+++ b/client-py/requirements_dev.txt
@@ -1 +1,4 @@
-pytest==6.2.2
\ No newline at end of file
+-r requirements.txt
+# Pytest to run tests
+pytest==6.2.2
+thrift==0.13.0
\ No newline at end of file
diff --git a/client-py/tests/__init__.py b/client-py/tests/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/client-py/tests/test_dataframe.py b/client-py/tests/test_dataframe.py
index fa41274..988caaf 100644
--- a/client-py/tests/test_dataframe.py
+++ b/client-py/tests/test_dataframe.py
@@ -1,5 +1,20 @@
-def test_something():
-    assert 2 == 1+1
+from iotdb.Session import Session
+from iotdb.TestContainer import IoTDBContainer
 
-def test_something_2():
-    assert 2 == 1+1+1
\ No newline at end of file
+
+def test_simple_query():
+    with IoTDBContainer("apache/iotdb:0.11.2") as db:
+        db: IoTDBContainer
+        session = Session(db.get_container_host_ip(), db.get_exposed_port(6667))
+        session.open(False)
+
+        # Write data
+        session.insert_str_record("device", 123, "pressure", "15.0")
+
+        # Read
+        session_data_set = session.execute_query_statement("SELECT * FROM root.*")
+        df = session_data_set.todf()
+
+        session.close()
+
+    assert df == []
\ No newline at end of file

[iotdb] 04/04: Added automatic test. Added flake8 and config. Added black and config.

Posted by jf...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

jfeinauer pushed a commit to branch feature/restrucutre-python-module
in repository https://gitbox.apache.org/repos/asf/iotdb.git

commit c35fb875b71d0d0725fc8cacf533b5683a441d90
Author: Julian Feinauer <j....@pragmaticminds.de>
AuthorDate: Fri Mar 26 10:16:10 2021 +0100

    Added automatic test.
    Added flake8 and config.
    Added black and config.
---
 client-py/.flake8                        |  16 ++
 client-py/README.md                      |  37 ++--
 client-py/SessionExample.py              | 134 ++++++++----
 client-py/SessionTest.py                 | 207 +++++++++++++------
 client-py/iotdb/Session.py               | 337 +++++++++++++++++++++++--------
 client-py/iotdb/TestContainer.py         |   4 +-
 client-py/iotdb/__init__.py              |   1 -
 client-py/iotdb/iotdb_utils.py           |  74 -------
 client-py/iotdb/utils/Field.py           |  10 +-
 client-py/iotdb/utils/IoTDBRpcDataSet.py |  90 ++++++---
 client-py/iotdb/utils/RowRecord.py       |   5 +-
 client-py/iotdb/utils/SessionDataSet.py  | 110 ++++++++--
 client-py/iotdb/utils/Tablet.py          |  16 +-
 client-py/iotdb/utils/__init__.py        |   1 -
 client-py/pyproject.toml                 |  27 +++
 client-py/requirements_dev.txt           |   4 +-
 client-py/setup.py                       |  17 +-
 client-py/tests/test_dataframe.py        |   7 +-
 18 files changed, 746 insertions(+), 351 deletions(-)

diff --git a/client-py/.flake8 b/client-py/.flake8
new file mode 100644
index 0000000..d8098a6
--- /dev/null
+++ b/client-py/.flake8
@@ -0,0 +1,16 @@
+[flake8]
+ignore =
+    E203,
+    W503
+max-line-length=200
+exclude =
+    .git,
+    test/*,
+    iotdb/thrift/**/*
+extend-exclude =
+    dist,
+    build,
+    venv
+show-source = True
+statistics = True
+format = %(path)s:%(row)d,%(col)d:%(code)s:%(text)s:https://lintlyci.github.io/Flake8Rules/rules/%(code)s.html
diff --git a/client-py/README.md b/client-py/README.md
index 6ae309b..c2e03cb 100644
--- a/client-py/README.md
+++ b/client-py/README.md
@@ -137,19 +137,19 @@ tutorial of installation, eventually, you should have a thrift executable.
 http://thrift.apache.org/docs/install/
 ```
 
+Before starting you need to install `requirements_dev.txt` in your python environment, e.g. by calling
+```
+pip install -r requirements_dev.txt
+```
+
 ### Compile the thrift library and Debug
 
-In the root of IoTDB's source code folder,  run `mvn generate-sources -pl client-py -am`.
+In the root of IoTDB's source code folder,  run `mvn clean generate-sources -pl client-py -am`.
 
-Then a complete project will be generated at `client-py/target/pypi` folder.
-But !BE CAUTIOUS!
-All your modifications in `client-py/target/pypi` must be copied manually to `client-py/src/` folder.
-Otherwise once you run `mvn clean`, you will lose all your effort.
+This will automatically delete and repopulate the folder `iotdb/thrift` with the generated thrift files.
+This folder is ignored from git and should **never be pushed to git!**
 
-Or, you can also copy `client-py/target/pypi/iotdb/thrift` folder to `client-py/src/thrift`, then the
-`src` folder will become also a complete python project.
-But !BE CAUTIOUS!
-Do not upload `client-py/src/thrift` to the git repo.
+**Notice** Do not upload `iotdb/thrift` to the git repo.
 
 
 ### Session Client & Example
@@ -173,8 +173,21 @@ zone = session.get_time_zone()
 session.close()
 ```
 
-### test file
+### Tests
+
+Please add your custom tests in `tests` folder.
+To run all defined tests just type `pytest .` in the root folder.
+
+**Notice** Some tests need docker to be started on your system as a test instance is started in a docker container using [testcontainers](https://testcontainers-python.readthedocs.io/en/latest/index.html).
+
+### Futher Tools
+
+[black](https://pypi.org/project/black/) and [flake8](https://pypi.org/project/flake8/) are installed for autoformatting and linting.
+Both can be run by `black .` or `flake8 .` respectively.
 
-You can use `client-py/src/SessionTest.py` to test python session, if the test has been passed, it will return 0. Otherwise it will return 1. You can use the printed message to locate failed operations and the reason of them.
+## Releasing
 
-Notice: you should start IoTDB server firstly and then run the test.
+To do a release just ensure that you have the right set of generated thrift files.
+Then run linting and auto-formatting.
+Then, ensure that all tests work (via `pytest .`).
+Then you are good to go to do a release!
\ No newline at end of file
diff --git a/client-py/SessionExample.py b/client-py/SessionExample.py
index 0bc08b5..bf56555 100644
--- a/client-py/SessionExample.py
+++ b/client-py/SessionExample.py
@@ -19,15 +19,15 @@
 # Uncomment the following line to use apache-iotdb module installed by pip3
 
 from iotdb.Session import Session
+from iotdb.utils.IoTDBConstants import TSDataType, TSEncoding, Compressor
 from iotdb.utils.Tablet import Tablet
-from iotdb.utils.IoTDBConstants import *
 
 # creating session connection.
 ip = "127.0.0.1"
 port_ = "6667"
-username_ = 'root'
-password_ = 'root'
-session = Session(ip, port_, username_, password_, fetch_size=1024, zone_id='UTC+8')
+username_ = "root"
+password_ = "root"
+session = Session(ip, port_, username_, password_, fetch_size=1024, zone_id="UTC+8")
 session.open(False)
 
 # set and delete storage groups
@@ -39,74 +39,136 @@ session.delete_storage_group("root.sg_test_02")
 session.delete_storage_groups(["root.sg_test_03", "root.sg_test_04"])
 
 # setting time series.
-session.create_time_series("root.sg_test_01.d_01.s_01", TSDataType.BOOLEAN, TSEncoding.PLAIN, Compressor.SNAPPY)
-session.create_time_series("root.sg_test_01.d_01.s_02", TSDataType.INT32, TSEncoding.PLAIN, Compressor.SNAPPY)
-session.create_time_series("root.sg_test_01.d_01.s_03", TSDataType.INT64, TSEncoding.PLAIN, Compressor.SNAPPY)
+session.create_time_series(
+    "root.sg_test_01.d_01.s_01", TSDataType.BOOLEAN, TSEncoding.PLAIN, Compressor.SNAPPY
+)
+session.create_time_series(
+    "root.sg_test_01.d_01.s_02", TSDataType.INT32, TSEncoding.PLAIN, Compressor.SNAPPY
+)
+session.create_time_series(
+    "root.sg_test_01.d_01.s_03", TSDataType.INT64, TSEncoding.PLAIN, Compressor.SNAPPY
+)
 
 # setting multiple time series once.
-ts_path_lst_ = ["root.sg_test_01.d_01.s_04", "root.sg_test_01.d_01.s_05", "root.sg_test_01.d_01.s_06",
-                "root.sg_test_01.d_01.s_07", "root.sg_test_01.d_01.s_08", "root.sg_test_01.d_01.s_09"]
-data_type_lst_ = [TSDataType.FLOAT, TSDataType.DOUBLE, TSDataType.TEXT,
-                  TSDataType.FLOAT, TSDataType.DOUBLE, TSDataType.TEXT]
+ts_path_lst_ = [
+    "root.sg_test_01.d_01.s_04",
+    "root.sg_test_01.d_01.s_05",
+    "root.sg_test_01.d_01.s_06",
+    "root.sg_test_01.d_01.s_07",
+    "root.sg_test_01.d_01.s_08",
+    "root.sg_test_01.d_01.s_09",
+]
+data_type_lst_ = [
+    TSDataType.FLOAT,
+    TSDataType.DOUBLE,
+    TSDataType.TEXT,
+    TSDataType.FLOAT,
+    TSDataType.DOUBLE,
+    TSDataType.TEXT,
+]
 encoding_lst_ = [TSEncoding.PLAIN for _ in range(len(data_type_lst_))]
 compressor_lst_ = [Compressor.SNAPPY for _ in range(len(data_type_lst_))]
-session.create_multi_time_series(ts_path_lst_, data_type_lst_, encoding_lst_, compressor_lst_)
+session.create_multi_time_series(
+    ts_path_lst_, data_type_lst_, encoding_lst_, compressor_lst_
+)
 
 # delete time series
-session.delete_time_series(["root.sg_test_01.d_01.s_07", "root.sg_test_01.d_01.s_08", "root.sg_test_01.d_01.s_09"])
+session.delete_time_series(
+    [
+        "root.sg_test_01.d_01.s_07",
+        "root.sg_test_01.d_01.s_08",
+        "root.sg_test_01.d_01.s_09",
+    ]
+)
 
 # checking time series
-print("s_07 expecting False, checking result: ", session.check_time_series_exists("root.sg_test_01.d_01.s_07"))
-print("s_03 expecting True, checking result: ", session.check_time_series_exists("root.sg_test_01.d_01.s_03"))
+print(
+    "s_07 expecting False, checking result: ",
+    session.check_time_series_exists("root.sg_test_01.d_01.s_07"),
+)
+print(
+    "s_03 expecting True, checking result: ",
+    session.check_time_series_exists("root.sg_test_01.d_01.s_03"),
+)
 
 # insert one record into the database.
 measurements_ = ["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"]
 values_ = [False, 10, 11, 1.1, 10011.1, "test_record"]
-data_types_ = [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64,
-               TSDataType.FLOAT, TSDataType.DOUBLE, TSDataType.TEXT]
+data_types_ = [
+    TSDataType.BOOLEAN,
+    TSDataType.INT32,
+    TSDataType.INT64,
+    TSDataType.FLOAT,
+    TSDataType.DOUBLE,
+    TSDataType.TEXT,
+]
 session.insert_record("root.sg_test_01.d_01", 1, measurements_, data_types_, values_)
 
 # insert multiple records into database
-measurements_list_ = [["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"],
-                      ["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"]]
-values_list_ = [[False, 22, 33, 4.4, 55.1, "test_records01"],
-                [True, 77, 88, 1.25, 8.125, "test_records02"]]
+measurements_list_ = [
+    ["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"],
+    ["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"],
+]
+values_list_ = [
+    [False, 22, 33, 4.4, 55.1, "test_records01"],
+    [True, 77, 88, 1.25, 8.125, "test_records02"],
+]
 data_type_list_ = [data_types_, data_types_]
 device_ids_ = ["root.sg_test_01.d_01", "root.sg_test_01.d_01"]
-session.insert_records(device_ids_, [2, 3], measurements_list_, data_type_list_, values_list_)
+session.insert_records(
+    device_ids_, [2, 3], measurements_list_, data_type_list_, values_list_
+)
 
 # insert one tablet into the database.
-values_ = [[False, 10, 11, 1.1, 10011.1, "test01"],
-           [True, 100, 11111, 1.25, 101.0, "test02"],
-           [False, 100, 1, 188.1, 688.25, "test03"],
-           [True, 0, 0, 0, 6.25, "test04"]]  # Non-ASCII text will cause error since bytes can only hold 0-128 nums.
+values_ = [
+    [False, 10, 11, 1.1, 10011.1, "test01"],
+    [True, 100, 11111, 1.25, 101.0, "test02"],
+    [False, 100, 1, 188.1, 688.25, "test03"],
+    [True, 0, 0, 0, 6.25, "test04"],
+]  # Non-ASCII text will cause an error since each byte can only hold ASCII values (0-127).
 timestamps_ = [4, 5, 6, 7]
-tablet_ = Tablet("root.sg_test_01.d_01", measurements_, data_types_, values_, timestamps_)
+tablet_ = Tablet(
+    "root.sg_test_01.d_01", measurements_, data_types_, values_, timestamps_
+)
 session.insert_tablet(tablet_)
 
 # insert multiple tablets into database
-tablet_01 = Tablet("root.sg_test_01.d_01", measurements_, data_types_, values_, [8, 9, 10, 11])
-tablet_02 = Tablet("root.sg_test_01.d_01", measurements_, data_types_, values_, [12, 13, 14, 15])
+tablet_01 = Tablet(
+    "root.sg_test_01.d_01", measurements_, data_types_, values_, [8, 9, 10, 11]
+)
+tablet_02 = Tablet(
+    "root.sg_test_01.d_01", measurements_, data_types_, values_, [12, 13, 14, 15]
+)
 session.insert_tablets([tablet_01, tablet_02])
 
 # insert records of one device
 time_list = [1, 2, 3]
-measurements_list = [["s_01", "s_02", "s_03"], ["s_01", "s_02", "s_03"], ["s_01", "s_02", "s_03"]]
-data_types_list = [[TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
-                   [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
-                   [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64]]
+measurements_list = [
+    ["s_01", "s_02", "s_03"],
+    ["s_01", "s_02", "s_03"],
+    ["s_01", "s_02", "s_03"],
+]
+data_types_list = [
+    [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
+    [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
+    [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
+]
 values_list = [[False, 22, 33], [True, 1, 23], [False, 15, 26]]
 
-session.insert_records_of_one_device("root.sg_test_01.d_01", time_list, measurements_list, data_types_list, values_list)
+session.insert_records_of_one_device(
+    "root.sg_test_01.d_01", time_list, measurements_list, data_types_list, values_list
+)
 
 # execute non-query sql statement
-session.execute_non_query_statement("insert into root.sg_test_01.d_01(timestamp, s_02) values(16, 188)")
+session.execute_non_query_statement(
+    "insert into root.sg_test_01.d_01(timestamp, s_02) values(16, 188)"
+)
 
 # execute sql query statement
 session_data_set = session.execute_query_statement("select * from root.sg_test_01.d_01")
 session_data_set.set_fetch_size(1024)
 while session_data_set.has_next():
-  print(session_data_set.next())
+    print(session_data_set.next())
 session_data_set.close_operation_handle()
 
 # close session connection.
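
The example above opens the session at the top and closes it at the very end, so an exception in between can leave the connection open. A minimal try/finally sketch, assuming only the Session API shown in this diff (host, port and the inserted values are placeholders):

    from iotdb.Session import Session

    session = Session("127.0.0.1", "6667", "root", "root", fetch_size=1024, zone_id="UTC+8")
    session.open(False)
    try:
        session.execute_non_query_statement(
            "insert into root.sg_test_01.d_01(timestamp, s_02) values(17, 189)"
        )
    finally:
        # the session is released even if the statement above fails
        session.close()
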
diff --git a/client-py/SessionTest.py b/client-py/SessionTest.py
index 5e9dd7f..96ff00d 100644
--- a/client-py/SessionTest.py
+++ b/client-py/SessionTest.py
@@ -18,33 +18,35 @@
 
 # Uncomment the following line to use apache-iotdb module installed by pip3
 from iotdb.Session import Session
+from iotdb.utils.IoTDBConstants import TSDataType, TSEncoding, Compressor
 from iotdb.utils.Tablet import Tablet
-from iotdb.utils.IoTDBConstants import *
 
 # whether the test has passed
 final_flag = True
 failed_count = 0
 
+
 def test_fail(message):
-  global failed_count
-  global final_flag
-  print("*********")
-  print(message)
-  print("*********")
-  final_flag = False
-  failed_count += 1
+    global failed_count
+    global final_flag
+    print("*********")
+    print(message)
+    print("*********")
+    final_flag = False
+    failed_count += 1
+
 
 # creating session connection.
 ip = "127.0.0.1"
 port_ = "6667"
-username_ = 'root'
-password_ = 'root'
-session = Session(ip, port_, username_, password_, fetch_size=1024, zone_id='UTC+8')
+username_ = "root"
+password_ = "root"
+session = Session(ip, port_, username_, password_, fetch_size=1024, zone_id="UTC+8")
 session.open(False)
 
 if not session.is_open():
-  print("can't open session")
-  exit(1)
+    print("can't open session")
+    exit(1)
 
 # set and delete storage groups
 session.set_storage_group("root.sg_test_01")
@@ -53,86 +55,165 @@ session.set_storage_group("root.sg_test_03")
 session.set_storage_group("root.sg_test_04")
 
 if session.delete_storage_group("root.sg_test_02") < 0:
-  test_fail("delete storage group failed")
+    test_fail("delete storage group failed")
 
 if session.delete_storage_groups(["root.sg_test_03", "root.sg_test_04"]) < 0:
-  test_fail("delete storage groups failed")
+    test_fail("delete storage groups failed")
 
 # setting time series.
-session.create_time_series("root.sg_test_01.d_01.s_01", TSDataType.BOOLEAN, TSEncoding.PLAIN, Compressor.SNAPPY)
-session.create_time_series("root.sg_test_01.d_01.s_02", TSDataType.INT32, TSEncoding.PLAIN, Compressor.SNAPPY)
-session.create_time_series("root.sg_test_01.d_01.s_03", TSDataType.INT64, TSEncoding.PLAIN, Compressor.SNAPPY)
+session.create_time_series(
+    "root.sg_test_01.d_01.s_01", TSDataType.BOOLEAN, TSEncoding.PLAIN, Compressor.SNAPPY
+)
+session.create_time_series(
+    "root.sg_test_01.d_01.s_02", TSDataType.INT32, TSEncoding.PLAIN, Compressor.SNAPPY
+)
+session.create_time_series(
+    "root.sg_test_01.d_01.s_03", TSDataType.INT64, TSEncoding.PLAIN, Compressor.SNAPPY
+)
 
 # setting multiple time series once.
-ts_path_lst_ = ["root.sg_test_01.d_01.s_04", "root.sg_test_01.d_01.s_05", "root.sg_test_01.d_01.s_06",
-                "root.sg_test_01.d_01.s_07", "root.sg_test_01.d_01.s_08", "root.sg_test_01.d_01.s_09"]
-data_type_lst_ = [TSDataType.FLOAT, TSDataType.DOUBLE, TSDataType.TEXT,
-                  TSDataType.FLOAT, TSDataType.DOUBLE, TSDataType.TEXT]
+ts_path_lst_ = [
+    "root.sg_test_01.d_01.s_04",
+    "root.sg_test_01.d_01.s_05",
+    "root.sg_test_01.d_01.s_06",
+    "root.sg_test_01.d_01.s_07",
+    "root.sg_test_01.d_01.s_08",
+    "root.sg_test_01.d_01.s_09",
+]
+data_type_lst_ = [
+    TSDataType.FLOAT,
+    TSDataType.DOUBLE,
+    TSDataType.TEXT,
+    TSDataType.FLOAT,
+    TSDataType.DOUBLE,
+    TSDataType.TEXT,
+]
 encoding_lst_ = [TSEncoding.PLAIN for _ in range(len(data_type_lst_))]
 compressor_lst_ = [Compressor.SNAPPY for _ in range(len(data_type_lst_))]
-session.create_multi_time_series(ts_path_lst_, data_type_lst_, encoding_lst_, compressor_lst_)
+session.create_multi_time_series(
+    ts_path_lst_, data_type_lst_, encoding_lst_, compressor_lst_
+)
 
 # delete time series
-if session.delete_time_series(["root.sg_test_01.d_01.s_07", "root.sg_test_01.d_01.s_08", "root.sg_test_01.d_01.s_09"]) < 0:
-  test_fail("delete time series failed")
+if (
+    session.delete_time_series(
+        [
+            "root.sg_test_01.d_01.s_07",
+            "root.sg_test_01.d_01.s_08",
+            "root.sg_test_01.d_01.s_09",
+        ]
+    )
+    < 0
+):
+    test_fail("delete time series failed")
 
 # checking time series
 # s_07 expecting False
 if session.check_time_series_exists("root.sg_test_01.d_01.s_07"):
-  test_fail("root.sg_test_01.d_01.s_07 shouldn't exist")
+    test_fail("root.sg_test_01.d_01.s_07 shouldn't exist")
 
 # s_03 expecting True
 if not session.check_time_series_exists("root.sg_test_01.d_01.s_03"):
-  test_fail("root.sg_test_01.d_01.s_03 should exist")
+    test_fail("root.sg_test_01.d_01.s_03 should exist")
 
 # insert one record into the database.
 measurements_ = ["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"]
 values_ = [False, 10, 11, 1.1, 10011.1, "test_record"]
-data_types_ = [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64,
-               TSDataType.FLOAT, TSDataType.DOUBLE, TSDataType.TEXT]
-if session.insert_record("root.sg_test_01.d_01", 1, measurements_, data_types_, values_) < 0:
-  test_fail("insert record failed")
+data_types_ = [
+    TSDataType.BOOLEAN,
+    TSDataType.INT32,
+    TSDataType.INT64,
+    TSDataType.FLOAT,
+    TSDataType.DOUBLE,
+    TSDataType.TEXT,
+]
+if (
+    session.insert_record(
+        "root.sg_test_01.d_01", 1, measurements_, data_types_, values_
+    )
+    < 0
+):
+    test_fail("insert record failed")
 
 # insert multiple records into database
-measurements_list_ = [["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"],
-                      ["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"]]
-values_list_ = [[False, 22, 33, 4.4, 55.1, "test_records01"],
-                [True, 77, 88, 1.25, 8.125, "test_records02"]]
+measurements_list_ = [
+    ["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"],
+    ["s_01", "s_02", "s_03", "s_04", "s_05", "s_06"],
+]
+values_list_ = [
+    [False, 22, 33, 4.4, 55.1, "test_records01"],
+    [True, 77, 88, 1.25, 8.125, "test_records02"],
+]
 data_type_list_ = [data_types_, data_types_]
 device_ids_ = ["root.sg_test_01.d_01", "root.sg_test_01.d_01"]
-if session.insert_records(device_ids_, [2, 3], measurements_list_, data_type_list_, values_list_) < 0:
-  test_fail("insert records failed")
+if (
+    session.insert_records(
+        device_ids_, [2, 3], measurements_list_, data_type_list_, values_list_
+    )
+    < 0
+):
+    test_fail("insert records failed")
 
 # insert one tablet into the database.
-values_ = [[False, 10, 11, 1.1, 10011.1, "test01"],
-           [True, 100, 11111, 1.25, 101.0, "test02"],
-           [False, 100, 1, 188.1, 688.25, "test03"],
-           [True, 0, 0, 0, 6.25, "test04"]]  # Non-ASCII text will cause error since bytes can only hold 0-128 nums.
+values_ = [
+    [False, 10, 11, 1.1, 10011.1, "test01"],
+    [True, 100, 11111, 1.25, 101.0, "test02"],
+    [False, 100, 1, 188.1, 688.25, "test03"],
+    [True, 0, 0, 0, 6.25, "test04"],
+]  # Non-ASCII text will cause an error since each byte can only hold ASCII values (0-127).
 timestamps_ = [4, 5, 6, 7]
-tablet_ = Tablet("root.sg_test_01.d_01", measurements_, data_types_, values_, timestamps_)
+tablet_ = Tablet(
+    "root.sg_test_01.d_01", measurements_, data_types_, values_, timestamps_
+)
 if session.insert_tablet(tablet_) < 0:
-  test_fail("insert tablet failed")
+    test_fail("insert tablet failed")
 
 # insert multiple tablets into database
-tablet_01 = Tablet("root.sg_test_01.d_01", measurements_, data_types_, values_, [8, 9, 10, 11])
-tablet_02 = Tablet("root.sg_test_01.d_01", measurements_, data_types_, values_, [12, 13, 14, 15])
+tablet_01 = Tablet(
+    "root.sg_test_01.d_01", measurements_, data_types_, values_, [8, 9, 10, 11]
+)
+tablet_02 = Tablet(
+    "root.sg_test_01.d_01", measurements_, data_types_, values_, [12, 13, 14, 15]
+)
 if session.insert_tablets([tablet_01, tablet_02]) < 0:
-  test_fail("insert tablets failed")
+    test_fail("insert tablets failed")
 
 # insert records of one device
 time_list = [1, 2, 3]
-measurements_list = [["s_01", "s_02", "s_03"], ["s_01", "s_02", "s_03"], ["s_01", "s_02", "s_03"]]
-data_types_list = [[TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
-                   [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
-                   [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64]]
+measurements_list = [
+    ["s_01", "s_02", "s_03"],
+    ["s_01", "s_02", "s_03"],
+    ["s_01", "s_02", "s_03"],
+]
+data_types_list = [
+    [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
+    [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
+    [TSDataType.BOOLEAN, TSDataType.INT32, TSDataType.INT64],
+]
 values_list = [[False, 22, 33], [True, 1, 23], [False, 15, 26]]
 
-if session.insert_records_of_one_device("root.sg_test_01.d_01", time_list, measurements_list, data_types_list, values_list) < 0:
-  test_fail("insert records of one device failed")
+if (
+    session.insert_records_of_one_device(
+        "root.sg_test_01.d_01",
+        time_list,
+        measurements_list,
+        data_types_list,
+        values_list,
+    )
+    < 0
+):
+    test_fail("insert records of one device failed")
 
 # execute non-query sql statement
-if session.execute_non_query_statement("insert into root.sg_test_01.d_01(timestamp, s_02) values(16, 188)") < 0:
-  test_fail("execute 'insert into root.sg_test_01.d_01(timestamp, s_02) values(16, 188)' failed")
+if (
+    session.execute_non_query_statement(
+        "insert into root.sg_test_01.d_01(timestamp, s_02) values(16, 188)"
+    )
+    < 0
+):
+    test_fail(
+        "execute 'insert into root.sg_test_01.d_01(timestamp, s_02) values(16, 188)' failed"
+    )
 
 # execute sql query statement
 session_data_set = session.execute_query_statement("select * from root.sg_test_01.d_01")
@@ -140,19 +221,23 @@ session_data_set.set_fetch_size(1024)
 expect_count = 16
 actual_count = 0
 while session_data_set.has_next():
-  actual_count += 1
+    actual_count += 1
 session_data_set.close_operation_handle()
 
 if actual_count != expect_count:
-  test_fail("query count mismatch: expect count: "
-            + str(expect_count) + " actual count: " + str(actual_count))
+    test_fail(
+        "query count mismatch: expect count: "
+        + str(expect_count)
+        + " actual count: "
+        + str(actual_count)
+    )
 
 # close session connection.
 session.close()
 
 if final_flag:
-  print("All executions done!!")
+    print("All executions done!!")
 else:
-  print("Some test failed, please have a check")
-  print("failed count: ", failed_count)
-  exit(1)
+    print("Some test failed, please have a check")
+    print("failed count: ", failed_count)
+    exit(1)
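
The `< 0` checks in this test rely on the convention, visible at the end of Session.py further down in this diff, that verify_success returns 0 on success and -1 otherwise. A small helper could fold the repeated pattern; this is only a sketch and check_status is a hypothetical name, not part of this change:

    def check_status(status_code, message):
        # status_code is the 0 / -1 value returned by the Session methods
        if status_code < 0:
            test_fail(message)

    check_status(
        session.insert_record("root.sg_test_01.d_01", 1, measurements_, data_types_, values_),
        "insert record failed",
    )
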
diff --git a/client-py/iotdb/Session.py b/client-py/iotdb/Session.py
index a2aec45..e7a3618 100644
--- a/client-py/iotdb/Session.py
+++ b/client-py/iotdb/Session.py
@@ -19,15 +19,25 @@
 import struct
 import time
 
-from .utils.SessionDataSet import SessionDataSet
-from .utils.IoTDBConstants import *
+from iotdb.utils.SessionDataSet import SessionDataSet
 
 from thrift.protocol import TBinaryProtocol, TCompactProtocol
 from thrift.transport import TSocket, TTransport
 
-from .thrift.rpc.TSIService import Client, TSCreateTimeseriesReq, TSInsertRecordReq, TSInsertStringRecordReq, TSInsertTabletReq, \
-    TSExecuteStatementReq, TSOpenSessionReq, TSCreateMultiTimeseriesReq, TSCloseSessionReq, TSInsertTabletsReq, TSInsertRecordsReq, \
-    TSInsertRecordsOfOneDeviceReq
+from .thrift.rpc.TSIService import (
+    Client,
+    TSCreateTimeseriesReq,
+    TSInsertRecordReq,
+    TSInsertStringRecordReq,
+    TSInsertTabletReq,
+    TSExecuteStatementReq,
+    TSOpenSessionReq,
+    TSCreateMultiTimeseriesReq,
+    TSCloseSessionReq,
+    TSInsertTabletsReq,
+    TSInsertRecordsReq,
+    TSInsertRecordsOfOneDeviceReq,
+)
 from .thrift.rpc.ttypes import TSDeleteDataReq, TSProtocolVersion, TSSetTimeZoneReq
 
 # for debug
@@ -41,16 +51,25 @@ from .thrift.rpc.ttypes import TSDeleteDataReq, TSProtocolVersion, TSSetTimeZone
 #      TSExecuteStatementReq, TSOpenSessionReq, TSQueryDataSet, TSFetchResultsReq, TSCloseOperationReq, \
 #      TSCreateMultiTimeseriesReq, TSCloseSessionReq, TSInsertTabletsReq, TSInsertRecordsReq
 # from iotdb.rpc.ttypes import TSDeleteDataReq, TSProtocolVersion, TSSetTimeZoneReq
+from .utils.IoTDBConstants import TSDataType
 
 
 class Session(object):
     SUCCESS_CODE = 200
     DEFAULT_FETCH_SIZE = 10000
-    DEFAULT_USER = 'root'
-    DEFAULT_PASSWORD = 'root'
-    DEFAULT_ZONE_ID = time.strftime('%z')
-
-    def __init__(self, host, port, user=DEFAULT_USER, password=DEFAULT_PASSWORD, fetch_size=DEFAULT_FETCH_SIZE, zone_id=DEFAULT_ZONE_ID):
+    DEFAULT_USER = "root"
+    DEFAULT_PASSWORD = "root"
+    DEFAULT_ZONE_ID = time.strftime("%z")
+
+    def __init__(
+        self,
+        host,
+        port,
+        user=DEFAULT_USER,
+        password=DEFAULT_PASSWORD,
+        fetch_size=DEFAULT_FETCH_SIZE,
+        zone_id=DEFAULT_ZONE_ID,
+    ):
         self.__host = host
         self.__port = port
         self.__user = user
@@ -67,30 +86,37 @@ class Session(object):
     def open(self, enable_rpc_compression):
         if not self.__is_close:
             return
-        self.__transport = TTransport.TFramedTransport(TSocket.TSocket(self.__host, self.__port))
+        self.__transport = TTransport.TFramedTransport(
+            TSocket.TSocket(self.__host, self.__port)
+        )
 
         if not self.__transport.isOpen():
             try:
                 self.__transport.open()
             except TTransport.TTransportException as e:
-                print('TTransportException: ', e)
+                print("TTransportException: ", e)
 
         if enable_rpc_compression:
             self.__client = Client(TCompactProtocol.TCompactProtocol(self.__transport))
         else:
             self.__client = Client(TBinaryProtocol.TBinaryProtocol(self.__transport))
 
-        open_req = TSOpenSessionReq(client_protocol=self.protocol_version,
-                                    username=self.__user,
-                                    password=self.__password,
-                                    zoneId=self.__zone_id)
+        open_req = TSOpenSessionReq(
+            client_protocol=self.protocol_version,
+            username=self.__user,
+            password=self.__password,
+            zoneId=self.__zone_id,
+        )
 
         try:
             open_resp = self.__client.openSession(open_req)
 
             if self.protocol_version != open_resp.serverProtocolVersion:
-                print("Protocol differ, Client version is {}, but Server version is {}".format(
-                    self.protocol_version, open_resp.serverProtocolVersion))
+                print(
+                    "Protocol differ, Client version is {}, but Server version is {}".format(
+                        self.protocol_version, open_resp.serverProtocolVersion
+                    )
+                )
                 # version is less than 0.10
                 if open_resp.serverProtocolVersion == 0:
                     raise TTransport.TException(message="Protocol not supported.")
@@ -119,7 +145,10 @@ class Session(object):
         try:
             self.__client.closeSession(req)
         except TTransport.TException as e:
-            print("Error occurs when closing session at server. Maybe server is down. Error message: ", e)
+            print(
+                "Error occurs when closing session at server. Maybe server is down. Error message: ",
+                e,
+            )
         finally:
             self.__is_close = True
             if self.__transport is not None:
@@ -149,7 +178,11 @@ class Session(object):
         :param storage_group_lst: List, paths of the target storage groups.
         """
         status = self.__client.deleteStorageGroups(self.__session_id, storage_group_lst)
-        print("delete storage group(s) {} message: {}".format(storage_group_lst, status.message))
+        print(
+            "delete storage group(s) {} message: {}".format(
+                storage_group_lst, status.message
+            )
+        )
 
         return Session.verify_success(status)
 
@@ -164,13 +197,17 @@ class Session(object):
         data_type = data_type.value
         encoding = encoding.value
         compressor = compressor.value
-        request = TSCreateTimeseriesReq(self.__session_id, ts_path, data_type, encoding, compressor)
+        request = TSCreateTimeseriesReq(
+            self.__session_id, ts_path, data_type, encoding, compressor
+        )
         status = self.__client.createTimeseries(request)
         print("creating time series {} message: {}".format(ts_path, status.message))
 
         return Session.verify_success(status)
 
-    def create_multi_time_series(self, ts_path_lst, data_type_lst, encoding_lst, compressor_lst):
+    def create_multi_time_series(
+        self, ts_path_lst, data_type_lst, encoding_lst, compressor_lst
+    ):
         """
         create multiple time series
         :param ts_path_lst: List of String, complete time series paths (starts from root)
@@ -182,10 +219,15 @@ class Session(object):
         encoding_lst = [encoding.value for encoding in encoding_lst]
         compressor_lst = [compressor.value for compressor in compressor_lst]
 
-        request = TSCreateMultiTimeseriesReq(self.__session_id, ts_path_lst, data_type_lst,
-                                             encoding_lst, compressor_lst)
+        request = TSCreateMultiTimeseriesReq(
+            self.__session_id, ts_path_lst, data_type_lst, encoding_lst, compressor_lst
+        )
         status = self.__client.createMultiTimeseries(request)
-        print("creating multiple time series {} message: {}".format(ts_path_lst, status.message))
+        print(
+            "creating multiple time series {} message: {}".format(
+                ts_path_lst, status.message
+            )
+        )
 
         return Session.verify_success(status)
 
@@ -195,7 +237,11 @@ class Session(object):
         :param paths_list: List of time series path, which should be complete (starts from root)
         """
         status = self.__client.deleteTimeseries(self.__session_id, paths_list)
-        print("deleting multiple time series {} message: {}".format(paths_list, status.message))
+        print(
+            "deleting multiple time series {} message: {}".format(
+                paths_list, status.message
+            )
+        )
 
         return Session.verify_success(status)
 
@@ -230,9 +276,15 @@ class Session(object):
         if type(measurements) == str:
             measurements = [measurements]
         data_types = [TSDataType.TEXT.value for _ in string_values]
-        request = self.gen_insert_str_record_req(device_id, timestamp, measurements, data_types, string_values)
+        request = self.gen_insert_str_record_req(
+            device_id, timestamp, measurements, data_types, string_values
+        )
         status = self.__client.insertStringRecord(request)
-        print("insert one record to device {} message: {}".format(device_id, status.message))
+        print(
+            "insert one record to device {} message: {}".format(
+                device_id, status.message
+            )
+        )
 
         return Session.verify_success(status)
 
@@ -249,13 +301,21 @@ class Session(object):
         :param values: List, values to be inserted, for each sensor
         """
         data_types = [data_type.value for data_type in data_types]
-        request = self.gen_insert_record_req(device_id, timestamp, measurements, data_types, values)
+        request = self.gen_insert_record_req(
+            device_id, timestamp, measurements, data_types, values
+        )
         status = self.__client.insertRecord(request)
-        print("insert one record to device {} message: {}".format(device_id, status.message))
+        print(
+            "insert one record to device {} message: {}".format(
+                device_id, status.message
+            )
+        )
 
         return Session.verify_success(status)
 
-    def insert_records(self, device_ids, times, measurements_lst, types_lst, values_lst):
+    def insert_records(
+        self, device_ids, times, measurements_lst, types_lst, values_lst
+    ):
         """
         insert multiple rows of data; the records are independent of each other, i.e. there is no relationship
         between those records
@@ -269,13 +329,21 @@ class Session(object):
         for types in types_lst:
             data_types = [data_type.value for data_type in types]
             type_values_lst.append(data_types)
-        request = self.gen_insert_records_req(device_ids, times, measurements_lst, type_values_lst, values_lst)
+        request = self.gen_insert_records_req(
+            device_ids, times, measurements_lst, type_values_lst, values_lst
+        )
         status = self.__client.insertRecords(request)
-        print("insert multiple records to devices {} message: {}".format(device_ids, status.message))
+        print(
+            "insert multiple records to devices {} message: {}".format(
+                device_ids, status.message
+            )
+        )
 
         return Session.verify_success(status)
 
-    def test_insert_record(self, device_id, timestamp, measurements, data_types, values):
+    def test_insert_record(
+        self, device_id, timestamp, measurements, data_types, values
+    ):
         """
         this method does NOT insert data into the database; the server returns right after accepting the request. This method
         should be used to measure other time costs in the client
@@ -286,13 +354,21 @@ class Session(object):
         :param values: List, values to be inserted, for each sensor
         """
         data_types = [data_type.value for data_type in data_types]
-        request = self.gen_insert_record_req(device_id, timestamp, measurements, data_types, values)
+        request = self.gen_insert_record_req(
+            device_id, timestamp, measurements, data_types, values
+        )
         status = self.__client.testInsertRecord(request)
-        print("testing! insert one record to device {} message: {}".format(device_id, status.message))
+        print(
+            "testing! insert one record to device {} message: {}".format(
+                device_id, status.message
+            )
+        )
 
         return Session.verify_success(status)
 
-    def test_insert_records(self, device_ids, times, measurements_lst, types_lst, values_lst):
+    def test_insert_records(
+        self, device_ids, times, measurements_lst, types_lst, values_lst
+    ):
         """
         this method does NOT insert data into the database; the server returns right after accepting the request. This method
         should be used to measure other time costs in the client
@@ -306,44 +382,68 @@ class Session(object):
         for types in types_lst:
             data_types = [data_type.value for data_type in types]
             type_values_lst.append(data_types)
-        request = self.gen_insert_records_req(device_ids, times, measurements_lst, type_values_lst, values_lst)
+        request = self.gen_insert_records_req(
+            device_ids, times, measurements_lst, type_values_lst, values_lst
+        )
         status = self.__client.testInsertRecords(request)
         print("testing! insert multiple records, message: {}".format(status.message))
 
         return Session.verify_success(status)
 
-    def gen_insert_record_req(self, device_id, timestamp, measurements, data_types, values):
+    def gen_insert_record_req(
+        self, device_id, timestamp, measurements, data_types, values
+    ):
         if (len(values) != len(data_types)) or (len(values) != len(measurements)):
             print("length of data types does not equal to length of values!")
             # could raise an error here.
             return
         values_in_bytes = Session.value_to_bytes(data_types, values)
-        return TSInsertRecordReq(self.__session_id, device_id, measurements, values_in_bytes, timestamp)
+        return TSInsertRecordReq(
+            self.__session_id, device_id, measurements, values_in_bytes, timestamp
+        )
 
-    def gen_insert_str_record_req(self, device_id, timestamp, measurements, data_types, values):
+    def gen_insert_str_record_req(
+        self, device_id, timestamp, measurements, data_types, values
+    ):
         if (len(values) != len(data_types)) or (len(values) != len(measurements)):
             print("length of data types does not equal to length of values!")
             # could raise an error here.
             return
-        return TSInsertStringRecordReq(self.__session_id, device_id, measurements, values, timestamp)
-
-    def gen_insert_records_req(self, device_ids, times, measurements_lst, types_lst, values_lst):
-        if (len(device_ids) != len(measurements_lst)) or (len(times) != len(types_lst)) or \
-            (len(device_ids) != len(times)) or (len(times) != len(values_lst)):
-            print("deviceIds, times, measurementsList and valuesList's size should be equal")
+        return TSInsertStringRecordReq(
+            self.__session_id, device_id, measurements, values, timestamp
+        )
+
+    def gen_insert_records_req(
+        self, device_ids, times, measurements_lst, types_lst, values_lst
+    ):
+        if (
+            (len(device_ids) != len(measurements_lst))
+            or (len(times) != len(types_lst))
+            or (len(device_ids) != len(times))
+            or (len(times) != len(values_lst))
+        ):
+            print(
+                "deviceIds, times, measurementsList and valuesList's size should be equal"
+            )
             # could raise an error here.
             return
 
         value_lst = []
-        for values, data_types, measurements in zip(values_lst, types_lst, measurements_lst):
+        for values, data_types, measurements in zip(
+            values_lst, types_lst, measurements_lst
+        ):
             if (len(values) != len(data_types)) or (len(values) != len(measurements)):
-                print("deviceIds, times, measurementsList and valuesList's size should be equal")
+                print(
+                    "deviceIds, times, measurementsList and valuesList's size should be equal"
+                )
                 # could raise an error here.
                 return
             values_in_bytes = Session.value_to_bytes(data_types, values)
             value_lst.append(values_in_bytes)
 
-        return TSInsertRecordsReq(self.__session_id, device_ids, measurements_lst, value_lst, times)
+        return TSInsertRecordsReq(
+            self.__session_id, device_ids, measurements_lst, value_lst, times
+        )
 
     def insert_tablet(self, tablet):
         """
@@ -358,7 +458,11 @@ class Session(object):
         :param tablet: a tablet specified above
         """
         status = self.__client.insertTablet(self.gen_insert_tablet_req(tablet))
-        print("insert one tablet to device {} message: {}".format(tablet.get_device_id(), status.message))
+        print(
+            "insert one tablet to device {} message: {}".format(
+                tablet.get_device_id(), status.message
+            )
+        )
 
         return Session.verify_success(status)
 
@@ -372,16 +476,25 @@ class Session(object):
 
         return Session.verify_success(status)
 
-
-    def insert_records_of_one_device(self, device_id, times_list, measurements_list, types_list, values_list):
+    def insert_records_of_one_device(
+        self, device_id, times_list, measurements_list, types_list, values_list
+    ):
         # sort by timestamp
-        sorted_zipped = sorted(zip(times_list, measurements_list, types_list, values_list))
+        sorted_zipped = sorted(
+            zip(times_list, measurements_list, types_list, values_list)
+        )
         result = zip(*sorted_zipped)
-        times_list, measurements_list, types_list, values_list = [list(x) for x in result]
+        times_list, measurements_list, types_list, values_list = [
+            list(x) for x in result
+        ]
 
-        return self.insert_records_of_one_device_sorted(device_id, times_list, measurements_list, types_list, values_list)
+        return self.insert_records_of_one_device_sorted(
+            device_id, times_list, measurements_list, types_list, values_list
+        )
 
-    def insert_records_of_one_device_sorted(self, device_id, times_list, measurements_list, types_list, values_list):
+    def insert_records_of_one_device_sorted(
+        self, device_id, times_list, measurements_list, types_list, values_list
+    ):
         """
         Insert multiple rows, which can reduce the overhead of network. This method is just like jdbc
         executeBatch, we pack some insert request in batch and send them to server. If you want improve
@@ -396,8 +509,14 @@ class Session(object):
         """
         # check parameter
         size = len(times_list)
-        if (size != len(measurements_list) or size != len(types_list) or size != len(values_list)):
-            print("insert records of one device error: types, times, measurementsList and valuesList's size should be equal")
+        if (
+            size != len(measurements_list)
+            or size != len(types_list)
+            or size != len(values_list)
+        ):
+            print(
+                "insert records of one device error: types, times, measurementsList and valuesList's size should be equal"
+            )
             return
 
         # check sorted
@@ -405,7 +524,9 @@ class Session(object):
             print("insert records of one device error: timestamp not sorted")
             return
 
-        request = self.gen_insert_records_of_one_device_request(device_id, times_list, measurements_list, values_list, types_list)
+        request = self.gen_insert_records_of_one_device_request(
+            device_id, times_list, measurements_list, values_list, types_list
+        )
 
         # send request
         status = self.__client.insertRecordsOfOneDevice(request)
@@ -413,19 +534,30 @@ class Session(object):
 
         return Session.verify_success(status)
 
-    def gen_insert_records_of_one_device_request(self, device_id, times_list, measurements_list, values_list, types_list):
+    def gen_insert_records_of_one_device_request(
+        self, device_id, times_list, measurements_list, values_list, types_list
+    ):
         binary_value_list = []
-        for values, data_types, measurements in zip(values_list, types_list, measurements_list):
+        for values, data_types, measurements in zip(
+            values_list, types_list, measurements_list
+        ):
             data_types = [data_type.value for data_type in data_types]
             if (len(values) != len(data_types)) or (len(values) != len(measurements)):
-                print("insert records of one device error: deviceIds, times, measurementsList and valuesList's size should be equal")
+                print(
+                    "insert records of one device error: deviceIds, times, measurementsList and valuesList's size should be equal"
+                )
                 # could raise an error here.
                 return
             values_in_bytes = Session.value_to_bytes(data_types, values)
             binary_value_list.append(values_in_bytes)
 
-        return TSInsertRecordsOfOneDeviceReq(self.__session_id, device_id, measurements_list, binary_value_list, times_list)
-
+        return TSInsertRecordsOfOneDeviceReq(
+            self.__session_id,
+            device_id,
+            measurements_list,
+            binary_value_list,
+            times_list,
+        )
 
     def test_insert_tablet(self, tablet):
         """
@@ -434,7 +566,11 @@ class Session(object):
         :param tablet: a tablet of data
         """
         status = self.__client.testInsertTablet(self.gen_insert_tablet_req(tablet))
-        print("testing! insert one tablet to device {} message: {}".format(tablet.get_device_id(), status.message))
+        print(
+            "testing! insert one tablet to device {} message: {}".format(
+                tablet.get_device_id(), status.message
+            )
+        )
 
         return Session.verify_success(status)
 
@@ -444,16 +580,24 @@ class Session(object):
          should be used to test other time cost in client
         :param tablet_list: List of tablets
         """
-        status = self.__client.testInsertTablets(self.gen_insert_tablets_req(tablet_list))
+        status = self.__client.testInsertTablets(
+            self.gen_insert_tablets_req(tablet_list)
+        )
         print("testing! insert multiple tablets, message: {}".format(status.message))
 
         return Session.verify_success(status)
 
     def gen_insert_tablet_req(self, tablet):
         data_type_values = [data_type.value for data_type in tablet.get_data_types()]
-        return TSInsertTabletReq(self.__session_id, tablet.get_device_id(), tablet.get_measurements(),
-                                 tablet.get_binary_values(), tablet.get_binary_timestamps(),
-                                 data_type_values, tablet.get_row_number())
+        return TSInsertTabletReq(
+            self.__session_id,
+            tablet.get_device_id(),
+            tablet.get_measurements(),
+            tablet.get_binary_values(),
+            tablet.get_binary_timestamps(),
+            data_type_values,
+            tablet.get_row_number(),
+        )
 
     def gen_insert_tablets_req(self, tablet_lst):
         device_id_lst = []
@@ -463,27 +607,46 @@ class Session(object):
         type_lst = []
         size_lst = []
         for tablet in tablet_lst:
-            data_type_values = [data_type.value for data_type in tablet.get_data_types()]
+            data_type_values = [
+                data_type.value for data_type in tablet.get_data_types()
+            ]
             device_id_lst.append(tablet.get_device_id())
             measurements_lst.append(tablet.get_measurements())
             values_lst.append(tablet.get_binary_values())
             timestamps_lst.append(tablet.get_binary_timestamps())
             type_lst.append(data_type_values)
             size_lst.append(tablet.get_row_number())
-        return TSInsertTabletsReq(self.__session_id, device_id_lst, measurements_lst,
-                                  values_lst, timestamps_lst, type_lst, size_lst)
-
-    def execute_query_statement(self, sql, timeout = 0):
+        return TSInsertTabletsReq(
+            self.__session_id,
+            device_id_lst,
+            measurements_lst,
+            values_lst,
+            timestamps_lst,
+            type_lst,
+            size_lst,
+        )
+
+    def execute_query_statement(self, sql, timeout=0):
         """
         execute query sql statement and returns SessionDataSet
         :param sql: String, query sql statement
         :return: SessionDataSet, contains query results and relevant info (see SessionDataSet.py)
         """
-        request = TSExecuteStatementReq(self.__session_id, sql, self.__statement_id, self.__fetch_size, timeout)
+        request = TSExecuteStatementReq(
+            self.__session_id, sql, self.__statement_id, self.__fetch_size, timeout
+        )
         resp = self.__client.executeQueryStatement(request)
-        return SessionDataSet(sql, resp.columns, resp.dataTypeList, resp.columnNameIndexMap, resp.queryId,
-                              self.__client, self.__session_id, resp.queryDataSet, resp.ignoreTimeStamp)
-
+        return SessionDataSet(
+            sql,
+            resp.columns,
+            resp.dataTypeList,
+            resp.columnNameIndexMap,
+            resp.queryId,
+            self.__client,
+            self.__session_id,
+            resp.queryDataSet,
+            resp.ignoreTimeStamp,
+        )
 
     def execute_non_query_statement(self, sql):
         """
@@ -494,7 +657,9 @@ class Session(object):
         try:
             resp = self.__client.executeUpdateStatement(request)
             status = resp.status
-            print("execute non-query statement {} message: {}".format(sql, status.message))
+            print(
+                "execute non-query statement {} message: {}".format(sql, status.message)
+            )
             return Session.verify_success(status)
         except TTransport.TException as e:
             print("execution of non-query statement fails because: ", e)
@@ -531,7 +696,7 @@ class Session(object):
                 values_tobe_packed.append(bytes([TSDataType.DOUBLE.value]))
                 values_tobe_packed.append(value)
             elif data_type == TSDataType.TEXT.value:
-                value_bytes = bytes(value, 'utf-8')
+                value_bytes = bytes(value, "utf-8")
                 format_str_list.append("c")
                 format_str_list.append("i")
                 format_str_list.append(str(len(value_bytes)))
@@ -543,7 +708,7 @@ class Session(object):
                 print("Unsupported data type:" + str(data_type))
                 # could raise an error here.
                 return
-        format_str = ''.join(format_str_list)
+        format_str = "".join(format_str_list)
         return struct.pack(format_str, *values_tobe_packed)
 
     def get_time_zone(self):
@@ -560,7 +725,11 @@ class Session(object):
         request = TSSetTimeZoneReq(self.__session_id, zone_id)
         try:
             status = self.__client.setTimeZone(request)
-            print("setting time zone_id as {}, message: {}".format(zone_id, status.message))
+            print(
+                "setting time zone_id as {}, message: {}".format(
+                    zone_id, status.message
+                )
+            )
         except TTransport.TException as e:
             print("Could not set time zone because: ", e)
             raise Exception
@@ -583,4 +752,4 @@ class Session(object):
             return 0
 
         print("error status is", status)
-        return -1
\ No newline at end of file
+        return -1
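
One detail in insert_records_of_one_device above is the sorted(zip(...)) / zip(*...) pair, which reorders all the parallel lists by timestamp before delegating to the sorted variant. A standalone sketch of the same idiom with toy data (the values are made up, not from this change):

    times = [3, 1, 2]
    measurements = [["s_03"], ["s_01"], ["s_02"]]
    values = [[33], [11], [22]]

    # sort the parallel lists by timestamp while keeping the rows aligned
    times, measurements, values = (
        list(x) for x in zip(*sorted(zip(times, measurements, values)))
    )
    # times is now [1, 2, 3]; measurements and values follow the same order
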
diff --git a/client-py/iotdb/TestContainer.py b/client-py/iotdb/TestContainer.py
index c34b242..9a01887 100644
--- a/client-py/iotdb/TestContainer.py
+++ b/client-py/iotdb/TestContainer.py
@@ -33,7 +33,9 @@ class IoTDBContainer(DockerContainer):
 
     @wait_container_is_ready()
     def _connect(self):
-        session = Session(self.get_container_host_ip(), self.get_exposed_port(6667), 'root', 'root')
+        session = Session(
+            self.get_container_host_ip(), self.get_exposed_port(6667), "root", "root"
+        )
         session.open(False)
         session.close()
 
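A usage sketch for IoTDBContainer in a test, assuming testcontainers' DockerContainer behaves as a context manager (start on enter, stop on exit) and that IoTDBContainer can be constructed without arguments; neither detail is shown in this hunk:

    from iotdb.Session import Session
    from iotdb.TestContainer import IoTDBContainer

    with IoTDBContainer() as db:
        session = Session(db.get_container_host_ip(), db.get_exposed_port(6667), "root", "root")
        session.open(False)
        # ... run assertions against the containerized IoTDB here ...
        session.close()
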
diff --git a/client-py/iotdb/__init__.py b/client-py/iotdb/__init__.py
index a4797b6..2a1e720 100644
--- a/client-py/iotdb/__init__.py
+++ b/client-py/iotdb/__init__.py
@@ -15,4 +15,3 @@
 # specific language governing permissions and limitations
 # under the License.
 #
-
diff --git a/client-py/iotdb/iotdb_utils.py b/client-py/iotdb/iotdb_utils.py
deleted file mode 100644
index bd4573e..0000000
--- a/client-py/iotdb/iotdb_utils.py
+++ /dev/null
@@ -1,74 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-
-import pandas as pd
-
-from iotdb.utils.Field import Field
-from iotdb.utils.IoTDBConstants import TSDataType
-from iotdb.utils.SessionDataSet import SessionDataSet
-
-
-def resultset_to_pandas(result_set: SessionDataSet) -> pd.DataFrame:
-    """
-    Transforms a SessionDataSet from IoTDB to a Pandas Data Frame
-    Each Field from IoTDB is a column in Pandas
-    :param result_set:
-    :return:
-    """
-    # get column names and fields
-    column_names = result_set.get_column_names()
-
-    value_dict = {}
-
-    for i in range(len(column_names)):
-        value_dict[column_names[i]] = []
-
-    while result_set.has_next():
-        record = result_set.next()
-
-        value_dict["Time"].append(record.get_timestamp())
-
-        for col in range(len(record.get_fields())):
-            field: Field = record.get_fields()[col]
-
-            value_dict[column_names[col + 1]].append(
-                get_typed_point(field)
-            )
-
-    return pd.DataFrame(value_dict)
-
-
-def get_typed_point(field: Field, none_value=None):
-    choices = {
-        # In Case of Boolean, cast to 0 / 1
-        TSDataType.BOOLEAN: lambda field: 1 if field.get_bool_value() else 0,
-        TSDataType.TEXT: lambda field: field.get_string_value(),
-        TSDataType.FLOAT: lambda field: field.get_float_value(),
-        TSDataType.INT32: lambda field: field.get_int_value(),
-        TSDataType.DOUBLE: lambda field: field.get_double_value(),
-        TSDataType.INT64: lambda field: field.get_long_value(),
-    }
-
-    result_next_type: TSDataType = field.get_data_type()
-
-    if result_next_type in choices.keys():
-        return choices.get(result_next_type)(field)
-    elif result_next_type is None:
-        return none_value
-    else:
-        raise Exception(f"Unknown DataType {result_next_type}!")
diff --git a/client-py/iotdb/utils/Field.py b/client-py/iotdb/utils/Field.py
index 55d1e33..0756b1c 100644
--- a/client-py/iotdb/utils/Field.py
+++ b/client-py/iotdb/utils/Field.py
@@ -19,11 +19,8 @@
 # for package
 from .IoTDBConstants import TSDataType
 
-# for debug
-# from IoTDBConstants import TSDataType
 
 class Field(object):
-
     def __init__(self, data_type):
         """
         :param data_type: TSDataType
@@ -53,7 +50,9 @@ class Field(object):
             elif output.get_data_type() == TSDataType.TEXT:
                 output.set_binary_value(field.get_binary_value())
             else:
-                raise Exception("unsupported data type {}".format(output.get_data_type()))
+                raise Exception(
+                    "unsupported data type {}".format(output.get_data_type())
+                )
         return output
 
     def get_data_type(self):
@@ -124,7 +123,7 @@ class Field(object):
         elif self.__data_type == TSDataType.DOUBLE:
             return str(self.__double_value)
         elif self.__data_type == TSDataType.TEXT:
-            return self.__binary_value.decode('utf-8')
+            return self.__binary_value.decode("utf-8")
         else:
             raise Exception("unsupported data type {}".format(self.__data_type))
 
@@ -176,4 +175,3 @@ class Field(object):
         else:
             raise Exception("unsupported data type {}".format(data_type))
         return field
-
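
Field.get_field (used by RowRecord below) builds a typed field from a raw value and a TSDataType, and the typed getters read it back. A small sketch under that assumption (the literal value is arbitrary):

    from iotdb.utils.Field import Field
    from iotdb.utils.IoTDBConstants import TSDataType

    f = Field.get_field(3.25, TSDataType.DOUBLE)
    print(f.get_double_value())  # 3.25
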
diff --git a/client-py/iotdb/utils/IoTDBRpcDataSet.py b/client-py/iotdb/utils/IoTDBRpcDataSet.py
index 6830488..6920245 100644
--- a/client-py/iotdb/utils/IoTDBRpcDataSet.py
+++ b/client-py/iotdb/utils/IoTDBRpcDataSet.py
@@ -17,18 +17,9 @@
 #
 
 # for package
-from .IoTDBConstants import *
-
-# for debug
-# from IoTDBConstants import *
-
-import sys
-from os.path import dirname, abspath
-path = dirname(dirname(abspath(__file__)))
-sys.path.append(path)
-
 from thrift.transport import TTransport
 from iotdb.thrift.rpc.TSIService import TSFetchResultsReq, TSCloseOperationReq
+from iotdb.utils.IoTDBConstants import TSDataType
 
 
 class IoTDBRpcDataSet(object):
@@ -37,8 +28,19 @@ class IoTDBRpcDataSet(object):
     START_INDEX = 2
     FLAG = 0x80
 
-    def __init__(self, sql, column_name_list, column_type_list, column_name_index, ignore_timestamp, query_id,
-        client, session_id, query_data_set, fetch_size):
+    def __init__(
+        self,
+        sql,
+        column_name_list,
+        column_type_list,
+        column_name_index,
+        ignore_timestamp,
+        query_id,
+        client,
+        session_id,
+        query_data_set,
+        fetch_size,
+    ):
         self.__session_id = session_id
         self.__ignore_timestamp = ignore_timestamp
         self.__sql = sql
@@ -57,15 +59,21 @@ class IoTDBRpcDataSet(object):
             self.__column_ordinal_dict[IoTDBRpcDataSet.TIMESTAMP_STR] = 1
 
         if column_name_index is not None:
-            self.__column_type_deduplicated_list = [None for _ in range(len(column_name_index))]
+            self.__column_type_deduplicated_list = [
+                None for _ in range(len(column_name_index))
+            ]
             for i in range(len(column_name_list)):
                 name = column_name_list[i]
                 self.__column_name_list.append(name)
                 self.__column_type_list.append(TSDataType[column_type_list[i]])
                 if name not in self.__column_ordinal_dict:
                     index = column_name_index[name]
-                    self.__column_ordinal_dict[name] = index + IoTDBRpcDataSet.START_INDEX
-                    self.__column_type_deduplicated_list[index] = TSDataType[column_type_list[i]]
+                    self.__column_ordinal_dict[name] = (
+                        index + IoTDBRpcDataSet.START_INDEX
+                    )
+                    self.__column_type_deduplicated_list[index] = TSDataType[
+                        column_type_list[i]
+                    ]
         else:
             index = IoTDBRpcDataSet.START_INDEX
             self.__column_type_deduplicated_list = []
@@ -76,10 +84,14 @@ class IoTDBRpcDataSet(object):
                 if name not in self.__column_ordinal_dict:
                     self.__column_ordinal_dict[name] = index
                     index += 1
-                    self.__column_type_deduplicated_list.append(TSDataType[column_type_list[i]])
+                    self.__column_type_deduplicated_list.append(
+                        TSDataType[column_type_list[i]]
+                    )
 
         self.__time_bytes = bytes(0)
-        self.__current_bitmap = [bytes(0) for _ in range(len(self.__column_type_deduplicated_list))]
+        self.__current_bitmap = [
+            bytes(0) for _ in range(len(self.__column_type_deduplicated_list))
+        ]
         self.__value = [None for _ in range(len(self.__column_type_deduplicated_list))]
         self.__query_data_set = query_data_set
         self.__is_closed = False
@@ -92,8 +104,14 @@ class IoTDBRpcDataSet(object):
             return
         if self.__client is not None:
             try:
-                status = self.__client.closeOperation(TSCloseOperationReq(self.__session_id, self.__query_id))
-                print("close session {}, message: {}".format(self.__session_id, status.message))
+                status = self.__client.closeOperation(
+                    TSCloseOperationReq(self.__session_id, self.__query_id)
+                )
+                print(
+                    "close session {}, message: {}".format(
+                        self.__session_id, status.message
+                    )
+                )
             except TTransport.TException as e:
                 print("close session {} failed because: ".format(self.__session_id), e)
                 raise Exception
@@ -113,7 +131,9 @@ class IoTDBRpcDataSet(object):
         return False
 
     def has_cached_result(self):
-        return (self.__query_data_set is not None) and (len(self.__query_data_set.time) != 0)
+        return (self.__query_data_set is not None) and (
+            len(self.__query_data_set.time) != 0
+        )
 
     def construct_one_row(self):
         # simulating buffer, read 8 bytes from data set and discard first 8 bytes which have been read.
@@ -147,9 +167,11 @@ class IoTDBRpcDataSet(object):
                     self.__value[i] = value_buffer[:8]
                     self.__query_data_set.valueList[i] = value_buffer[8:]
                 elif data_type == TSDataType.TEXT:
-                    length = int.from_bytes(value_buffer[:4], byteorder="big", signed=False)
-                    self.__value[i] = value_buffer[4: 4 + length]
-                    self.__query_data_set.valueList[i] = value_buffer[4 + length:]
+                    length = int.from_bytes(
+                        value_buffer[:4], byteorder="big", signed=False
+                    )
+                    self.__value[i] = value_buffer[4 : 4 + length]
+                    self.__query_data_set.valueList[i] = value_buffer[4 + length :]
                 else:
                     print("unsupported data type {}.".format(data_type))
                     # could raise exception here
@@ -158,7 +180,14 @@ class IoTDBRpcDataSet(object):
 
     def fetch_results(self):
         self.__rows_index = 0
-        request = TSFetchResultsReq(self.__session_id, self.__sql, self.__fetch_size, self.__query_id, True, self.__default_time_out)
+        request = TSFetchResultsReq(
+            self.__session_id,
+            self.__sql,
+            self.__fetch_size,
+            self.__query_id,
+            True,
+            self.__default_time_out,
+        )
         try:
             resp = self.__client.fetchResults(request)
             if not resp.hasResultSet:
@@ -172,10 +201,13 @@ class IoTDBRpcDataSet(object):
     def is_null(self, index, row_num):
         bitmap = self.__current_bitmap[index]
         shift = row_num % 8
-        return ((IoTDBRpcDataSet.FLAG >> shift) & (bitmap & 0xff)) == 0
+        return ((IoTDBRpcDataSet.FLAG >> shift) & (bitmap & 0xFF)) == 0
 
     def is_null_by_index(self, column_index):
-        index = self.__column_ordinal_dict[self.find_column_name_by_index(column_index)] - IoTDBRpcDataSet.START_INDEX
+        index = (
+            self.__column_ordinal_dict[self.find_column_name_by_index(column_index)]
+            - IoTDBRpcDataSet.START_INDEX
+        )
         # time column will never be None
         if index < 0:
             return True
@@ -192,7 +224,11 @@ class IoTDBRpcDataSet(object):
         if column_index <= 0:
             raise Exception("Column index should start from 1")
         if column_index > len(self.__column_name_list):
-            raise Exception("column index {} out of range {}".format(column_index, self.__column_size))
+            raise Exception(
+                "column index {} out of range {}".format(
+                    column_index, self.__column_size
+                )
+            )
         return self.__column_name_list[column_index - 1]
 
     def get_fetch_size(self):
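
The is_null logic above reads one bit per row out of the column bitmap: FLAG is 0x80, so row_num % 8 selects bits starting from the most significant end of the current bitmap byte. A tiny standalone illustration of the same arithmetic (the byte value is made up):

    FLAG = 0x80
    bitmap_byte = 0b10100000  # in this byte, rows 0 and 2 carry values

    for row_num in range(4):
        shift = row_num % 8
        is_null = ((FLAG >> shift) & (bitmap_byte & 0xFF)) == 0
        print(row_num, is_null)  # 0 False, 1 True, 2 False, 3 True
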
diff --git a/client-py/iotdb/utils/RowRecord.py b/client-py/iotdb/utils/RowRecord.py
index ac1b219..16a88f1 100644
--- a/client-py/iotdb/utils/RowRecord.py
+++ b/client-py/iotdb/utils/RowRecord.py
@@ -23,15 +23,12 @@ from .Field import Field
 # from IoTDBConstants import TSDataType
 # from Field import Field
 
-class RowRecord(object):
 
+class RowRecord(object):
     def __init__(self, timestamp, field_list=None):
         self.__timestamp = timestamp
         self.__field_list = field_list
 
-    def add_field(self, field):
-        self.__field_list.append(field)
-
     def add_field(self, value, data_type):
         self.__field_list.append(Field.get_field(value, data_type))
 
diff --git a/client-py/iotdb/utils/SessionDataSet.py b/client-py/iotdb/utils/SessionDataSet.py
index f75a0e2..8437010 100644
--- a/client-py/iotdb/utils/SessionDataSet.py
+++ b/client-py/iotdb/utils/SessionDataSet.py
@@ -16,29 +16,43 @@
 # under the License.
 #
 
-# for package
-from .IoTDBConstants import TSDataType
-from .IoTDBRpcDataSet import IoTDBRpcDataSet
-from .Field import Field
-from .RowRecord import RowRecord
+import struct
 
-# for debug
-# from IoTDBConstants import TSDataType
-# from IoTDBRpcDataSet import IoTDBRpcDataSet
-# from Field import Field
-# from RowRecord import RowRecord
+from iotdb.utils.Field import Field
 
-import struct
+# for package
+from iotdb.utils.IoTDBConstants import TSDataType
+from iotdb.utils.IoTDBRpcDataSet import IoTDBRpcDataSet
+from iotdb.utils.RowRecord import RowRecord
 
-from ..iotdb_utils import resultset_to_pandas
+import pandas as pd
 
 
 class SessionDataSet(object):
-
-    def __init__(self, sql, column_name_list, column_type_list, column_name_index, query_id, client, session_id,
-                 query_data_set, ignore_timestamp):
-        self.iotdb_rpc_data_set = IoTDBRpcDataSet(sql, column_name_list, column_type_list, column_name_index,
-                                                  ignore_timestamp, query_id, client, session_id, query_data_set, 1024)
+    def __init__(
+        self,
+        sql,
+        column_name_list,
+        column_type_list,
+        column_name_index,
+        query_id,
+        client,
+        session_id,
+        query_data_set,
+        ignore_timestamp,
+    ):
+        self.iotdb_rpc_data_set = IoTDBRpcDataSet(
+            sql,
+            column_name_list,
+            column_type_list,
+            column_name_index,
+            ignore_timestamp,
+            query_id,
+            client,
+            session_id,
+            query_data_set,
+            1024,
+        )
 
     def get_fetch_size(self):
         return self.iotdb_rpc_data_set.get_fetch_size()
@@ -71,11 +85,16 @@ class SessionDataSet(object):
                 index -= 1
                 data_set_column_index -= 1
             column_name = self.iotdb_rpc_data_set.get_column_names()[index]
-            location = self.iotdb_rpc_data_set.get_column_ordinal_dict()[column_name] - IoTDBRpcDataSet.START_INDEX
+            location = (
+                self.iotdb_rpc_data_set.get_column_ordinal_dict()[column_name]
+                - IoTDBRpcDataSet.START_INDEX
+            )
 
             if not self.iotdb_rpc_data_set.is_null_by_index(data_set_column_index):
                 value_bytes = self.iotdb_rpc_data_set.get_values()[location]
-                data_type = self.iotdb_rpc_data_set.get_column_type_deduplicated_list()[location]
+                data_type = self.iotdb_rpc_data_set.get_column_type_deduplicated_list()[
+                    location
+                ]
                 field = Field(data_type)
                 if data_type == TSDataType.BOOLEAN:
                     value = struct.unpack(">?", value_bytes)[0]
@@ -101,10 +120,61 @@ class SessionDataSet(object):
                 field = Field(None)
             out_fields.append(field)
 
-        return RowRecord(struct.unpack(">q", self.iotdb_rpc_data_set.get_time_bytes())[0], out_fields)
+        return RowRecord(
+            struct.unpack(">q", self.iotdb_rpc_data_set.get_time_bytes())[0], out_fields
+        )
 
     def close_operation_handle(self):
         self.iotdb_rpc_data_set.close()
 
     def todf(self):
         return resultset_to_pandas(self)
+
+
+def resultset_to_pandas(result_set: SessionDataSet) -> pd.DataFrame:
+    """
+    Transforms a SessionDataSet from IoTDB to a Pandas Data Frame
+    Each Field from IoTDB is a column in Pandas
+    :param result_set:
+    :return:
+    """
+    # get column names and fields
+    column_names = result_set.get_column_names()
+
+    value_dict = {}
+
+    for i in range(len(column_names)):
+        value_dict[column_names[i]] = []
+
+    while result_set.has_next():
+        record = result_set.next()
+
+        value_dict["Time"].append(record.get_timestamp())
+
+        for col in range(len(record.get_fields())):
+            field: Field = record.get_fields()[col]
+
+            value_dict[column_names[col + 1]].append(get_typed_point(field))
+
+    return pd.DataFrame(value_dict)
+
+
+def get_typed_point(field: Field, none_value=None):
+    choices = {
+        # In Case of Boolean, cast to 0 / 1
+        TSDataType.BOOLEAN: lambda field: 1 if field.get_bool_value() else 0,
+        TSDataType.TEXT: lambda field: field.get_string_value(),
+        TSDataType.FLOAT: lambda field: field.get_float_value(),
+        TSDataType.INT32: lambda field: field.get_int_value(),
+        TSDataType.DOUBLE: lambda field: field.get_double_value(),
+        TSDataType.INT64: lambda field: field.get_long_value(),
+    }
+
+    result_next_type: TSDataType = field.get_data_type()
+
+    if result_next_type in choices.keys():
+        return choices.get(result_next_type)(field)
+    elif result_next_type is None:
+        return none_value
+    else:
+        raise Exception(f"Unknown DataType {result_next_type}!")
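
The new resultset_to_pandas helper, together with the todf() convenience method that wraps it, drains a query result into a DataFrame with one column per IoTDB field plus the Time column. A minimal usage sketch, assuming a local server on the default port and an existing timeseries; host, credentials and the query string are illustrative:

    from iotdb.Session import Session

    session = Session("127.0.0.1", 6667, "root", "root")
    session.open(False)

    # todf() iterates the SessionDataSet via resultset_to_pandas and
    # returns a pandas DataFrame keyed by column name.
    session_data_set = session.execute_query_statement("SELECT * FROM root.*")
    df = session_data_set.todf()
    print(df.head())

    session.close()
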
diff --git a/client-py/iotdb/utils/Tablet.py b/client-py/iotdb/utils/Tablet.py
index cdb0c21..444f4e8 100644
--- a/client-py/iotdb/utils/Tablet.py
+++ b/client-py/iotdb/utils/Tablet.py
@@ -16,17 +16,12 @@
 # under the License.
 #
 
-# for package
-from .IoTDBConstants import *
-
-# for debug
-# from IoTDBConstants import *
-
 import struct
 
+from iotdb.utils.IoTDBConstants import TSDataType
 
-class Tablet(object):
 
+class Tablet(object):
     def __init__(self, device_id, measurements, data_types, values, timestamps):
         """
         creating a tablet for insertion
@@ -88,7 +83,7 @@ class Tablet(object):
             format_str_list.append("q")
             values_tobe_packed.append(timestamp)
 
-        format_str = ''.join(format_str_list)
+        format_str = "".join(format_str_list)
         return struct.pack(format_str, *values_tobe_packed)
 
     def get_binary_values(self):
@@ -122,7 +117,7 @@ class Tablet(object):
                     values_tobe_packed.append(self.__values[j][i])
             elif self.__data_types[i] == TSDataType.TEXT:
                 for j in range(self.__row_number):
-                    value_bytes = bytes(self.__values[j][i], 'utf-8')
+                    value_bytes = bytes(self.__values[j][i], "utf-8")
                     format_str_list.append("i")
                     format_str_list.append(str(len(value_bytes)))
                     format_str_list.append("s")
@@ -133,6 +128,5 @@ class Tablet(object):
                 # could raise an error here.
                 return
 
-        format_str = ''.join(format_str_list)
+        format_str = "".join(format_str_list)
         return struct.pack(format_str, *values_tobe_packed)
-
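
Tablet only changes cosmetically here (explicit TSDataType import, black-style quoting), but the constructor shown above expects row-major values aligned with the measurement list and one timestamp per row. A small sketch of building one (device path, measurements and values are illustrative; the commented insert assumes the Session.insert_tablet call from the same client):

    from iotdb.utils.IoTDBConstants import TSDataType
    from iotdb.utils.Tablet import Tablet

    measurements = ["temperature", "status"]
    data_types = [TSDataType.FLOAT, TSDataType.BOOLEAN]
    # One inner list per row, ordered like `measurements`.
    values = [[21.5, True], [22.0, False]]
    timestamps = [1, 2]

    tablet = Tablet("root.sg1.d1", measurements, data_types, values, timestamps)
    # session.insert_tablet(tablet)  # with an open Session as sketched earlier
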
diff --git a/client-py/iotdb/utils/__init__.py b/client-py/iotdb/utils/__init__.py
index a4797b6..2a1e720 100644
--- a/client-py/iotdb/utils/__init__.py
+++ b/client-py/iotdb/utils/__init__.py
@@ -15,4 +15,3 @@
 # specific language governing permissions and limitations
 # under the License.
 #
-
diff --git a/client-py/pyproject.toml b/client-py/pyproject.toml
new file mode 100644
index 0000000..e6f34c9
--- /dev/null
+++ b/client-py/pyproject.toml
@@ -0,0 +1,27 @@
+[tool.black]
+line-length = 88
+target-version = ['py37']
+include = '\.pyi?$'
+exclude = '''
+
+(
+  /(
+      \.eggs         # exclude a few common directories in the
+    | \.git          # root of the project
+    | \.hg
+    | \.mypy_cache
+    | \.tox
+    | \.venv
+    | venv
+    | _build
+    | buck-out
+    | build
+    | dist
+    | migrations
+    | test
+    | iotdb/thrift
+  )/
+  | foo.py           # also separately exclude a file named foo.py in
+                     # the root of the project
+)
+'''
\ No newline at end of file
diff --git a/client-py/requirements_dev.txt b/client-py/requirements_dev.txt
index 0ee3c7c..6fb957f 100644
--- a/client-py/requirements_dev.txt
+++ b/client-py/requirements_dev.txt
@@ -1,4 +1,6 @@
 -r requirements.txt
 # Pytest to run tests
 pytest==6.2.2
-thrift==0.13.0
\ No newline at end of file
+thrift==0.13.0
+flake8==3.9.0
+black==20.8b1
\ No newline at end of file
diff --git a/client-py/setup.py b/client-py/setup.py
index 69d5559..5b2d95c 100644
--- a/client-py/setup.py
+++ b/client-py/setup.py
@@ -21,16 +21,16 @@ import io
 
 
 try:
-    with io.open('README.md', encoding='utf-8') as f:
+    with io.open("README.md", encoding="utf-8") as f:
         long_description = f.read()
 except FileNotFoundError:
-    long_description = ''
+    long_description = ""
 
 
 print(long_description)
 
 setuptools.setup(
-    name="apache-iotdb", # Replace with your own username
+    name="apache-iotdb",  # Replace with your own username
     version="0.12.0",
     author=" Apache Software Foundation",
     author_email="dev@iotdb.apache.org",
@@ -39,10 +39,7 @@ setuptools.setup(
     long_description_content_type="text/markdown",
     url="https://github.com/apache/iotdb",
     packages=setuptools.find_packages(),
-    install_requires=[
-              'thrift>=0.13.0',
-              'pandas>=1.0.0,<1.99.99'
-          ],
+    install_requires=["thrift>=0.13.0", "pandas>=1.0.0,<1.99.99", "testcontainers>=2.0.0"],
     classifiers=[
         "Programming Language :: Python :: 3",
         "License :: OSI Approved :: Apache Software License",
@@ -50,7 +47,7 @@ setuptools.setup(
         "Topic :: Software Development :: Libraries",
         "Topic :: Software Development :: Libraries :: Python Modules",
     ],
-    python_requires='>=3.7',
-    license='Apache License, Version 2.0',
-    website='https://iotdb.apache.org',
+    python_requires=">=3.7",
+    license="Apache License, Version 2.0",
+    website="https://iotdb.apache.org",
 )
diff --git a/client-py/tests/test_dataframe.py b/client-py/tests/test_dataframe.py
index 988caaf..239cc8d 100644
--- a/client-py/tests/test_dataframe.py
+++ b/client-py/tests/test_dataframe.py
@@ -1,6 +1,8 @@
 from iotdb.Session import Session
 from iotdb.TestContainer import IoTDBContainer
 
+from numpy.testing import assert_array_equal
+
 
 def test_simple_query():
     with IoTDBContainer("apache/iotdb:0.11.2") as db:
@@ -9,7 +11,7 @@ def test_simple_query():
         session.open(False)
 
         # Write data
-        session.insert_str_record("device", 123, "pressure", "15.0")
+        session.insert_str_record("root.device", 123, "pressure", "15.0")
 
         # Read
         session_data_set = session.execute_query_statement("SELECT * FROM root.*")
@@ -17,4 +19,5 @@ def test_simple_query():
 
         session.close()
 
-    assert df == []
\ No newline at end of file
+    assert list(df.columns) == ["Time", "root.device.pressure"]
+    assert_array_equal(df.values, [[123.0, 15.0]])

[iotdb] 03/04: Merge branch 'feature/client-py-pandas' into feature/restrucutre-python-module

Posted by jf...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

jfeinauer pushed a commit to branch feature/restrucutre-python-module
in repository https://gitbox.apache.org/repos/asf/iotdb.git

commit a2b104e623ad4cc0ebfbf9de15e4018601acf244
Merge: 5c9e579 8a7aa94
Author: Julian Feinauer <j....@pragmaticminds.de>
AuthorDate: Fri Mar 26 09:45:16 2021 +0100

    Merge branch 'feature/client-py-pandas' into feature/restrucutre-python-module
    
    # Conflicts:
    #	client-py/pypi/README.md

 client-py/iotdb/Session.py              |  1 +
 client-py/iotdb/iotdb_utils.py          | 74 +++++++++++++++++++++++++++++++++
 client-py/iotdb/utils/SessionDataSet.py |  7 ++--
 3 files changed, 79 insertions(+), 3 deletions(-)

diff --cc client-py/iotdb/iotdb_utils.py
index 0000000,bd4573e..bd4573e
mode 000000,100644..100644
--- a/client-py/iotdb/iotdb_utils.py
+++ b/client-py/iotdb/iotdb_utils.py