Posted to dev@couchdb.apache.org by Russell Branca <ch...@apache.org> on 2017/12/15 01:03:32 UTC

[RFC] On the Testing of CouchDB

Howdy folks!

The testing of CouchDB is something that has seen focus and improvements
for the last several years, for instance migrating the etap suite to eunit,
and updating the JS suite to run against clusters in 2.x. There are still
improvements to be made, and that was one of the topics of the CouchDB dev
summit early in the year [1].

Before we go further, I want to clarify some nomenclature. I'm by no means
going to try and define unit testing vs integration testing vs quantum
phase shift testing, but instead I want to focus on the distinction of
where the testing takes place. Fundamentally, we have two places we test
CouchDB: 1) at the Erlang VM level where we conduct assertions against
module functions or process states; 2) at the HTTP level where we test the
behavior of CouchDB at the user level API. This post focuses entirely on
the latter; that's not to say the former doesn't also merit attention, just
that the two are different enough that we can focus on them in isolation.

So with that, let's chat about the current HTTP test suite in CouchDB. This
is the "JS suite" I referred to above, which is a custom built test suite
written in Javascript and executed in the aging SpiderMonkey. The JS suite
has put in work for years, but it's showing it's age, and is a bit awkward
to work with and improve. However, I think the biggest issue with the JS
suite is that it's utilized far less than it should be, and folks seem to
avoid extending it or adding additional tests to it. There's been
discussion for years about replacing said suite, but the discussions
invariably got blocked on the bike shed of whether to rewrite the suite in
Javascript or Python. This thread provides a third option, with code!

I started hacking on a replacement for the JS suite, this time written in
Elixir. Overall I'm quite impressed with how it's come along, and have some
good examples to show. This is basically an Elixir app with an HTTP
client that runs a series of tests against the CouchDB HTTP API and makes
assertions about the responses.

You can find the current code in [2], and a comparison of the changes in
[3]. The core HTTP client is only a handful of lines of code and works
quite well [4]. The utility functions used across all tests are located in
[5], and the tests themselves are in [6]. The existing test modules have a
1:1 correspondence with the associated JS suite test modules, and in
general are as direct a port as possible.
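
To give a flavour of what that client looks like, here's a minimal sketch of
a `Couch` module in the same spirit. This is not the actual couch.ex from
[4]; it assumes HTTPoison and Poison as dependencies, the default dev
cluster node port 15984, and the adm:pass admin credentials from the
`./dev/run` command shown further down:

```elixir
defmodule Couch do
  @moduledoc "Sketch of a tiny JSON-speaking HTTP client for the test suite."

  # Assumed dev cluster node; the real client may target a different port.
  @base_url "http://127.0.0.1:15984"

  def get(path, opts \\ []), do: request(:get, path, opts)
  def post(path, opts \\ []), do: request(:post, path, opts)
  def put(path, opts \\ []), do: request(:put, path, opts)
  def delete(path, opts \\ []), do: request(:delete, path, opts)

  defp request(method, path, opts) do
    url =
      case Keyword.get(opts, :query) do
        nil -> @base_url <> path
        query -> @base_url <> path <> "?" <> URI.encode_query(query)
      end

    body =
      case Keyword.get(opts, :body) do
        nil -> ""
        map -> Poison.encode!(map)
      end

    headers = [
      {"Authorization", "Basic " <> Base.encode64("adm:pass")},
      {"Content-Type", "application/json"}
    ]

    # HTTPoison.request!/4 returns the response struct directly, so tests
    # can write `Couch.get("/").body["couchdb"]` as in the examples below.
    resp = HTTPoison.request!(method, url, body, headers)
    %{resp | body: decode_body(resp.body)}
  end

  # Decode JSON bodies where possible; fall back to the raw body otherwise.
  defp decode_body(raw) do
    case Poison.decode(raw) do
      {:ok, decoded} -> decoded
      {:error, _} -> raw
    end
  end
end
```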

The test modules ported in their entirety or most of the way are:

  * all_docs.js
  * basics.js
  * config.js
  * reduce.js
  * rewrite.js
  * uuids.js
  * view_collation.js

Paul has dived in and is responsible for a few of those test modules, and
he's almost completed porting the replication.js suite as well. We started
with the hard ones first, so for the most part the rest of the ports should
be fairly smooth sailing.

Here's an example of a very basic test:

```elixir
defmodule WelcomeTest do
  use CouchTestCase

  test "Welcome endpoint" do
    assert Couch.get("/").body["couchdb"] == "Welcome", "Should say welcome"
  end

end

```


As you can see, the `Couch` client is a very simple HTTP client with
easy HTTP-verb-based methods. Let's look at a more complicated test
asserting that we can create documents in a database:


```elixir

  @tag :with_db
  test "Create a document and save it to the database", context do
    resp = Couch.post("/#{context[:db_name]}",
      [body: %{:_id => "0", :a => 1, :b => 1}])
    assert resp.status_code == 201, "Should be 201 created"
    assert resp.body["id"], "Id should be present"
    assert resp.body["rev"], "Rev should be present"

    resp2 = Couch.get("/#{context[:db_name]}/#{resp.body["id"]}")
    assert resp2.body["_id"] == resp.body["id"], "Ids should match"
    assert resp2.body["_rev"] == resp.body["rev"], "Revs should match"
  end

```


This is fairly straightforward code to POST a new doc, make assertions
on the response, and then fetch the doc to make sure everything
matches up. What I really wanted to highlight here is the `@tag
:with_db` decorator. We can easily add custom "tags" to tests to
simplify setup and teardown. The `:with_db` tag does two things: it
dynamically generates a random database name, and it handles the
setup/teardown of creating and deleting that database for that
particular test. This is really useful and has been very nice to work
with so far. We also have tag functionality in place for executing a
test with a particular set of config options:


```elixir

  @tag config: [
    {"uuids", "algorithm", "utc_random"}
  ]
  test "utc_random uuids are roughly random" do
    resp = Couch.get("/_uuids", query: %{:count => 1000})
    assert resp.status_code == 200
    uuids = resp.body["uuids"]

    assert String.length(Enum.at(uuids, 1)) == 32

    # Assert no collisions
    assert length(Enum.uniq(uuids)) == length(uuids)

    # Assert rough ordering of UUIDs
    u1 = String.slice(Enum.at(uuids, 1), 0..13)
    u2 = String.slice(Enum.at(uuids, -1), 0..13)
    assert u1 < u2
  end
```


The tag system really simplifies a lot of the standard auxiliary
actions needed to conduct tests.
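
For the curious, here's a rough sketch of how a tag like `:with_db` can be
wired up with plain ExUnit machinery. This is not the actual CouchTestCase
from [5]; it assumes the `Couch` client also exposes put/delete verb helpers
alongside get/post, and it leaves the config tag handling out:

```elixir
defmodule CouchTestCaseSketch do
  # Hypothetical stand-in for the real CouchTestCase case template.
  use ExUnit.CaseTemplate

  using do
    quote do
      # Helpers shared by every test module that does
      # `use CouchTestCaseSketch` would be imported/aliased here.
    end
  end

  # ExUnit hands each test's tags to setup callbacks as part of the context,
  # so `@tag :with_db` shows up here as context[:with_db] == true.
  setup context do
    if context[:with_db] do
      db_name = "test-db-#{System.unique_integer([:positive])}"

      resp = Couch.put("/#{db_name}")
      # DB creation normally answers 201 (or 202 from a cluster).
      true = resp.status_code in [201, 202]

      # Tear the database down again once the test finishes, pass or fail.
      on_exit(fn -> Couch.delete("/#{db_name}") end)

      # Returning a keyword list merges :db_name into the test's context.
      {:ok, db_name: db_name}
    else
      :ok
    end
  end
end
```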


To test out the suite, you'll need to spin up the dev server in one window with:


```

./dev/run --admin=adm:pass

```


and then in another window go into the relevant CouchDB src directory and run:


```

cd ~/src/couchdb/elixir_suite/

mix deps.get

mix test --trace

```


The `--trace` flag produces the nice line-item output per test, which I
greatly prefer over a slew of periods. You can run an individual test
module with `mix test --trace test/basics_test.exs`. I've pasted the output
from running the basics suite at the bottom of this email so you can
see what the real output looks like.
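
As an aside, the "Excluding tags: [pending: true]" line and the skipped
entries in that output come from ExUnit's tag filtering. A minimal
test_helper.exs along these lines (not necessarily the exact one in [5]) is
enough to get that behavior:

```elixir
# test/test_helper.exs (sketch)
# Anything tagged `@tag :pending` is skipped by default, which is what
# produces the "Excluding tags" line and the skipped entries in the output.
# Re-include those tests with: mix test --include pending
ExUnit.configure(exclude: [pending: true])
ExUnit.start()
```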


Overall I'm quite impressed with the toolkit we've been able to put
together in a short amount of time, and I propose we migrate fully to
this test suite by porting all remaining JS suite tests and then
removing the JS suite entirely. Given we've already ported most of the
"hard suites", I think a full port is reasonable to do and just
requires some legwork. Again, I'm impressed with how simple the
tooling here is and how quickly we've been able to run with things;
it turns out the Elixir dev experience is actually quite nice! I hope
others have similar opinions after diving in! Let me know what you
think.



-Russell



[1] https://github.com/janl/couchdb-next/issues/39
[2] https://github.com/apache/couchdb/tree/elixir-suite
[3] https://github.com/apache/couchdb/compare/elixir-suite
[4]
https://github.com/apache/couchdb/blob/elixir-suite/elixir_suite/lib/couch.ex
[5]
https://github.com/apache/couchdb/blob/elixir-suite/elixir_suite/test/test_helper.exs
[6] https://github.com/apache/couchdb/tree/elixir-suite/elixir_suite/test


vagrant@contrib-jessie:~/src/couchdb/elixir_suite$ mix test --trace
test/basics_test.exs
Excluding tags: [pending: true]

BasicsTest
  * test Session contains adm context (66.8ms)
  * test Creating a new DB with slashes should return Location header
(COUCHDB-411) (85.8ms)
  * test oops, the doc id got lost in code nirwana (82.1ms)
  * test Welcome endpoint (7.6ms)
  * test POST doc with an _id field isn't overwritten by uuid (102.7ms)
  * test On restart, a request for creating an already existing db can
not override (skipped)
  * test Creating a new DB should return location header (118.7ms)
  * test _bulk_docs POST error when body not an object (95.0ms)
  * test Empty database should have zero docs (161.0ms)
  * test _all_docs POST error when multi-get is not a {'key': [...]}
structure (104.3ms)
  * test Regression test for COUCHDB-954 (skipped)
  * test DELETE'ing a non-existent doc should 404 (100.0ms)
  * test Revs info status is good (127.3ms)
  * test PUT on existing DB should return 412 instead of 500 (97.6ms)
  * test Database should be in _all_dbs (117.7ms)
  * test Check for invalid document members (122.4ms)
  * test Can create several documents (213.0ms)
  * test Make sure you can do a seq=true option (99.1ms)
  * test PUT doc has a Location header (skipped)
  * test Create a document and save it to the database (116.3ms)
  * test Created database has appropriate db info name (99.7ms)
  * test PUT error when body not an object (89.5ms)
  * test Simple map functions (473.0ms)
  * test POST doc response has a Location header (117.1ms)

CouchTestCase


Finished in 3.3 seconds
24 tests, 0 failures, 3 skipped

Randomized with seed 936284

Re: [RFC] On the Testing of CouchDB

Posted by Mike Rhodes <mr...@linux.vnet.ibm.com>.
Russ,

I think Elixir is a nice language, easier to use than Erlang to my fairly simple mind anyway.

One thing that seems to be happening here is an effort to port _every_ JS test before merging -- is this really required, or given the broad support of the port in other bits of this thread, can we merge in a piecemeal fashion rather than one huge merge of all tests?

I guess for a while that'd mean running both suites as tests get ported over, but it would also mean that we're running these tests in anger earlier rather than later in production builds. So if they don't work for some reason, effort hasn't been wasted on a huge porting effort?

It would also make it significantly simpler for code reviewers to be confident that a ported set of tests actually do replicate the original JS tests if there are ten or twenty tests per PR rather than several hundred or more.

Mike.

> On 15 Dec 2017, at 01:03, Russell Branca <ch...@apache.org> wrote:
> [...]


Re: [RFC] On the Testing of CouchDB

Posted by Garren Smith <ga...@apache.org>.
I just want to follow up on this work. Could we consider moving the current
work into master? It seems having this on a side branch means we have all
forgotten about it. I think it's worth having the currently ported tests
running off of master, and then for any new work we do, we could also
write the tests in Elixir.

Cheers
Garren

On Mon, Jan 22, 2018 at 3:46 AM, Peng Hui Jiang <ji...@cn.ibm.com> wrote:

> [...]

Re: [RFC] On the Testing of CouchDB

Posted by Peng Hui Jiang <ji...@cn.ibm.com>.
Hi Russ,

I also see that Elixir is a good way to go, compared to the legacy eunit or
javascript tests.

For now, I introduced one JS test case [1] for the _design_docs endpoint.
Later on, I would like to convert it to an Elixir version.

Peng Hui (jiangphcn)

[1]:
https://github.com/apache/couchdb/pull/1110/files#diff-1e07aee7a9178c48feebcd86c5251dbc





Re: [RFC] On the Testing of CouchDB

Posted by Jay Doane <ja...@almery.com>.
Elixir's focus on positive developer experience seems like a big selling
point (among many others), so count me in.

I'd like to work on stats.js, provided nobody else has started it yet.

Jay

On Thu, Dec 14, 2017 at 5:03 PM, Russell Branca <ch...@apache.org>
wrote:

> [...]

Re: [RFC] On the Testing of CouchDB

Posted by Russell Branca <ru...@chewbranca.com>.
I just updated the README in the Elixir suite to have a todo list of tests
to port, in case folks are looking for tests:
https://github.com/apache/couchdb/blob/elixir-suite/test/elixir/README.md


-Russell
On Fri, Dec 15, 2017 at 9:46 AM Paul Davis <pa...@gmail.com>
wrote:

> For `make check` it should be fairly straightforward to map the
> current approach to it. I could probably knock that out fairly quickly
> if you want me to give it a whirl.
>
> On Fri, Dec 15, 2017 at 11:42 AM, Russell Branca <ch...@apache.org>
> wrote:
> > Yeah just to reiterate what Paul said, the Elixir dev experience is
> > really nice and easy to get rolling with. I had no prior actual
> > experience with Elixir and I was able to get things rolling in a few
> > hours.
> >
> > RE Ben's question about diving in: please do! Just grab one of the
> > unported js suites and go to town. I've just been cherry-picking things
> > out of Paul's branch and we can continue to do the same until we get
> > this more locked down. My goal with the porting is to keep chugging
> > along and just get it knocked out, as I really don't think it will be
> > overly onerous to do so. And if anyone else wants to jump in, there's
> > still a fair number of tests to port, just take your pick.
> >
> > One other thing that needs work is figuring out how to hook all this
> > into "make check" and whatnot. I've mostly ignored that as this just
> > points at a CouchDB instance and can be run directly, but we'll need to
> > sort that out at some point.
> >
> >
> > -Russell
> >
> > On Fri, Dec 15, 2017 at 9:03 AM Paul Davis <pa...@gmail.com>
> > wrote:
> >
> >> Hello everybody!
> >>
> >> I figured I should probably go ahead and chime in seeing as I've also
> >> been playing around porting some of the tests in my free time between
> >> ops shifts the last couple weeks.
> >>
> >> My first impression was that it was ridiculously easy to get involved.
> >> On OS X at least, `brew install elixir` was enough to get a working
> >> elixir installed (however, if you use kerl or erln8 you'll have
> >> to build an Erlang 20.x VM to use the brew package). I went from not
> >> having Elixir installed to a full port of uuids.js with the config tag
> >> logic written in about two hours one night. So far the Elixir docs
> >> seem very well written and put together. I'd say the worst part of
> >> Elixir so far is that knowing Erlang I find myself searching for "How
> >> do I do this Erlang thing in Elixir?" Which isn't as bad as it sounds.
> >> The Elixir libraries have certainly had a considerable amount of
> >> thought put into them to make them easy to use and remember. I find it
> >> to be a lot like my experience when learning Python in that I may have
> >> to Google once and then it's muscle memory. As opposed to Erlang's
> >> library where I'm constantly reading the lists manpage to remember
> >> argument orderings and whether I want search or find versions etc.
> >>
> >> Which I guess is a long way of saying I'm rather liking the Elixir
> >> development experience so far.
> >>
> >> That said, I'm currently about half way through porting replication.js
> >> tests to Elixir. For the most part its fairly straightforward. My
> >> current approach as we've done for the other modules is to do a direct
> >> port. Once that's finished we'll want to break up that huge module
> >> into a series of modules that share a lot of the utility functions.
> >> One of the nice things about moving to Elixir is that it's got a
> >> full-on development story rather than our current couchjs approach that
> >> prevents sharing code easily between subsets of tests.
> >>
> >> For Ben's question on diving in, I'd do just that. I'd say leave a
> >> note here about which module(s) you're going to port so that we're
> >> not duplicating efforts, and then it's basically just a matter of
> >> getting Elixir installed. For that, here's a quick rundown on how I
> >> got that working:
> >>
> >> $ brew update
> >> $ brew install elixir
> >> $ # wait for all the things...
> >> $ iex # which fails cause I have an Erlang VM older than 20.0 as a default
> >> $ erln8 --fetch
> >> $ erln8 --build --tag=OTP-20.1.6 --id=20.1.6
> >> $ # wait while erln8 does its thing
> >> $ git clone https://github.com/apache/couchdb
> >> $ cd couchdb
> >> $ ./configure --disable-docs --disable-fauxton --with-curl
> >> $ make
> >> $ git checkout -b elixir-suite-davisp origin/elixir-suite # but use
> >> your own name
> >> $ cd elixir_suite # Russell promises to move this to test/elixir
> >> eventually... :)
> >> $ mix deps.get
> >> $ # For the moment, in another terminal, run ./dev/run -a adm:pass
> >> $ mix test --trace
> >> $ # For development you can also do this:
> >> $ mix test --trace test/module_i_am_working_on.exs
> >>
> >> For the time being, anyone who does any porting work, I'd just let
> >> Russell know and he can pull the changes into the main elixir-suite
> >> branch. For the initial work it might get a bit messy but we can
> >> always clean up after the fact if we decide this is a direction we'd
> >> like to go for real. To that end, I'd also make sure that we do a
> >> single .js -> .exs port per commit to try and make any future cleanup
> >> work easier.
> >>
> >> Also, even if people don't feel like doing any actual porting work I'd
> >> still be interested in hearing what its like for people to just run
> >> through their platform equivalent of the above steps. And even just
> >> initial impressions on toying around with Elixir. My only experience
> >> with Elixir prior to this was reading through their quick
> >> start/tutorial pages a couple of times to get a feeling for the syntax
> >> but hadn't actually even typed it into an editor till last week.
> >>
> >> And that's all I've got for now.
> >>
> >> On Thu, Dec 14, 2017 at 11:57 PM, Benjamin Anderson
> >> <ba...@apache.org> wrote:
> >> > Slick! This seems like it's coming together really nicely. Can't argue
> >> > with commits like "Prefer ?w=3 over hacky sleeps"[1] in any case.
> >> >
> >> >> I hope others have similar opinions after diving in!
> >> >
> >> > How should one dive in? Are you looking for others to help out with
> >> > the ports, or just thinking aspirationally about future regular
> >> > contributions to the test suite?
> >> >
> >> > --
> >> > b
> >> >
> >> > [1]:
> >>
> https://github.com/apache/couchdb/commit/5bce2d98a298c25b77d8dcda19deeedb494cc289
> >> >
> >> > On Thu, Dec 14, 2017 at 5:03 PM, Russell Branca <
> chewbranca@apache.org>
> >> wrote:
> >> >> Howdy folks!
> >> >>
> >> >> The testing of CouchDB is something that has seen focus and
> improvements
> >> >> for the last several years, for instance migrating the etap suite to
> >> eunit,
> >> >> and updating the JS suite to run against clusters in 2.x. There's
> still
> >> >> improvements to be made, and that was one of the topics of the
> CouchDB
> >> dev
> >> >> summit early in the year [1].
> >> >>
> >> >> Before we go further, I want to clarify some nomenclature. I'm by no
> >> means
> >> >> going to try and define unit testing vs integration testing vs
> quantum
> >> >> phase shift testing, but instead I want to focus on the distinction
> of
> >> >> where the testing takes place. Fundamentally, we have two places we
> test
> >> >> CouchDB: 1) at the Erlang VM level where we conduct assertions
> against
> >> >> module functions or process states; 2) at the HTTP level where we
> test
> >> the
> >> >> behavior of CouchDB at the user level API. This post focuses
> entirely on
> >> >> the latter; that's not to say the former doesn't also merit
> attention,
> >> just
> >> >> that the two are different enough that we can focus on them in
> >> isolation.
> >> >>
> >> >> So with that, let's chat about the current HTTP test suite in
> CouchDB.
> >> This
> >> >> is the "JS suite" I referred to above, which is a custom built test
> >> suite
> >> >> written in Javascript and executed in the aging SpiderMonkey. The JS
> >> suite
> >> >> has put in work for years, but it's showing it's age, and is a bit
> >> awkward
> >> >> to work with and improve. However, I think the biggest issue with
> the JS
> >> >> suite is that it's utilized far less than it should be, and folks
> seem
> >> to
> >> >> avoid extending it or adding additional tests to it. There's been
> >> >> discussion for years about replacing said suite, but the discussions
> >> >> invariably got blocked on the bike shed of whether to rewrite the
> suite
> >> in
> >> >> Javascript or Python. This thread provides a third option, with code!
> >> >>
> >> >> I started hacking on a replacement for the JS suite, this time
> written
> >> in
> >> >> Elixir. Overall I'm quite impressed with how it's come along, and
> have
> >> some
> >> >> good examples to show. This is basically an Elixir app that has an
> HTTP
> >> >> client and then runs a series of tests that conduct tests against the
> >> >> CouchDB HTTP API and make assertions therein.
> >> >>
> >> >> You can find the current code in [2], and a comparison of the
> changes in
> >> >> [3]. The core HTTP client is only a handful of lines of codes and
> works
> >> >> quite well [4]. The utility functions used across all tests are
> located
> >> in
> >> >> [5], and the tests themselves are in [6]. The existing test modules
> >> have a
> >> >> 1:1 correspondence with the associated JS suite test modules, and in
> >> >> general are as direct of a port as possible.
> >> >>
> >> >> The test modules ported in their entirety or most of the way are:
> >> >>
> >> >>   * all_docs.js
> >> >>   * basics.js
> >> >>   * config.js
> >> >>   * reduce.js
> >> >>   * rewrite.js
> >> >>   * uuids.js
> >> >>   * view_collation.js
> >> >>
> >> >> Paul has dove in and is responsible for a few of those test modules
> and
> >> >> he's almost completed porting the replication.js suite as well. We
> >> started
> >> >> with the hard ones first, so for the most part the rest of the ports
> >> should
> >> >> be fairly smooth sailing.
> >> >>
> >> >> Here's an example of a very basic test:
> >> >>
> >> >> ```erlang
> >> >> defmodule WelcomeTest do
> >> >>   use CouchTestCase
> >> >>
> >> >>   test "Welcome endpoint" do
> >> >>     assert Couch.get("/").body["couchdb"] == "Welcome", "Should say
> >> welcome"
> >> >>   end
> >> >>
> >> >> end
> >> >>
> >> >> ```
> >> >>
> >> >>
> >> >> As you can see, the `Couch` client is very simple HTTP client with
> >> >> easy HTTP verb based methods. Let's look at a more complicated test
> >> >> for asserting we can create documents in a database:
> >> >>
> >> >>
> >> >> ```erlang
> >> >>
> >> >>   @tag :with_db
> >> >>   test "Create a document and save it to the database", context do
> >> >>     resp = Couch.post("/#{context[:db_name]}", [body: %{:_id => "0",
> >> >> :a => 1, :b => 1}])
> >> >>     assert resp.status_code == 201, "Should be 201 created"
> >> >>     assert resp.body["id"], "Id should be present"
> >> >>     assert resp.body["rev"], "Rev should be present"
> >> >>
> >> >>     resp2 = Couch.get("/#{context[:db_name]}/#{resp.body["id"]}")
> >> >>     assert resp2.body["_id"] == resp.body["id"], "Ids should match"
> >> >>     assert resp2.body["_rev"] == resp.body["rev"], "Revs should
> match"
> >> >>   end
> >> >>
> >> >> ```
> >> >>
> >> >>
> >> >> This is fairly straightforward code to POST a new doc, make
> assertions
> >> >> on the response, and then fetch the doc to make sure everything
> >> >> matches up. What I really wanted to highlight here is the `@tag
> >> >> :with_db` decorator. We can easily add custom "tags" to the tests to
> >> >> simplify setup and teardown. That `:with_db` tag does two things, it
> >> >> dynamically generates a random database name, and then takes care of
> >> >> setup/teardown for creating and deleting said database for that
> >> >> particular test. This is really useful and has been very nice to work
> >> >> with so far. We also have tag functionality in place for executing a
> >> >> test with a particular set of config options:
> >> >>
> >> >>
> >> >> ```erlang
> >> >>
> >> >>   @tag config: [
> >> >>     {"uuids", "algorithm", "utc_random"}
> >> >>   ]
> >> >>   test "utc_random uuids are roughly random" do
> >> >>     resp = Couch.get("/_uuids", query: %{:count => 1000})
> >> >>     assert resp.status_code == 200
> >> >>     uuids = resp.body["uuids"]
> >> >>
> >> >>     assert String.length(Enum.at(uuids, 1)) == 32
> >> >>
> >> >>     # Assert no collisions
> >> >>     assert length(Enum.uniq(uuids)) == length(uuids)
> >> >>
> >> >>     # Assert rough ordering of UUIDs
> >> >>     u1 = String.slice(Enum.at(uuids, 1), 0..13)
> >> >>     u2 = String.slice(Enum.at(uuids, -1), 0..13)
> >> >>     assert u1 < u2
> >> >>   end
> >> >> ```
> >> >>
> >> >>
> >> >> The tag system really simplifies a lot of the standard auxiliary
> >> >> actions needed to conduct tests.
> >> >>
> >> >>
> >> >> To test out the suite, you'll need to spin up the dev server in one
> >> window with:
> >> >>
> >> >>
> >> >> ```
> >> >>
> >> >> ./dev/run --admin=adm:pass
> >> >>
> >> >> ```
> >> >>
> >> >>
> >> >> and then in another window go into the relevant CouchDB src directory
> >> and run:
> >> >>
> >> >>
> >> >> ```
> >> >>
> >> >> cd ~/src/couchdb/elixir_suite/
> >> >>
> >> >> mix deps.get
> >> >>
> >> >> mix test --trace
> >> >>
> >> >> ```
> >> >>
> >> >>
> >> >> The `--trace` flag makes the nice line item output per test, which I
> >> >> greatly prefer over a slew of periods. You can run an individual test
> >> >> with `mix test --trace tests/basics_test.exs`. I've pasted the output
> >> >> from running the basics suite at the bottom of this email so you can
> >> >> see what the real output looks like.
> >> >>
> >> >>
> >> >> Overall I'm quite impressed with the toolkit we've been able to put
> >> >> together in a short amount of time, and I propose we migrate fully to
> >> >> this test suite by porting all remaining JS suite tests and then
> >> >> removing the JS suite entirely. Given we've already ported most of
> the
> >> >> "hard suites", I think a full port is reasonable to do and just
> >> >> requires some leg work. Again, I'm impressed with how simple the
> >> >> tooling here is and how quickly we've been able to run with things,
> >> >> turns out the Elixir dev experience is actually quite nice! I hope
> >> >> others have similar opinions after diving in! Let me know what you
> >> >> think.
> >> >>
> >> >>
> >> >>
> >> >> -Russell
> >> >>
> >> >>
> >> >>
> >> >> [1] https://github.com/janl/couchdb-next/issues/39
> >> >> [2] https://github.com/apache/couchdb/tree/elixir-suite
> >> >> [3] https://github.com/apache/couchdb/compare/elixir-suite
> >> >> [4]
> >> >>
> >>
> https://github.com/apache/couchdb/blob/elixir-suite/elixir_suite/lib/couch.ex
> >> >> [5]
> >> >>
> >>
> https://github.com/apache/couchdb/blob/elixir-suite/elixir_suite/test/test_helper.exs
> >> >> [6]
> >> https://github.com/apache/couchdb/tree/elixir-suite/elixir_suite/test
> >> >>
> >> >>
> >> >> vagrant@contrib-jessie:~/src/couchdb/elixir_suite$ mix test --trace
> >> >> test/basics_test.exs
> >> >> Excluding tags: [pending: true]
> >> >>
> >> >> BasicsTest
> >> >>   * test Session contains adm context (66.8ms)
> >> >>   * test Creating a new DB with slashes should return Location header
> >> >> (COUCHDB-411) (85.8ms)
> >> >>   * test oops, the doc id got lost in code nirwana (82.1ms)
> >> >>   * test Welcome endpoint (7.6ms)
> >> >>   * test POST doc with an _id field isn't overwritten by uuid
> (102.7ms)
> >> >>   * test On restart, a request for creating an already existing db
> can
> >> >> not override (skipped)
> >> >>   * test Creating a new DB should return location header (118.7ms)
> >> >>   * test _bulk_docs POST error when body not an object (95.0ms)
> >> >>   * test Empty database should have zero docs (161.0ms)
> >> >>   * test _all_docs POST error when multi-get is not a {'key': [...]}
> >> >> structure (104.3ms)
> >> >>   * test Regression test for COUCHDB-954 (skipped)
> >> >>   * test DELETE'ing a non-existent doc should 404 (100.0ms)
> >> >>   * test Revs info status is good (127.3ms)
> >> >>   * test PUT on existing DB should return 412 instead of 500 (97.6ms)
> >> >>   * test Database should be in _all_dbs (117.7ms)
> >> >>   * test Check for invalid document members (122.4ms)
> >> >>   * test Can create several documents (213.0ms)
> >> >>   * test Make sure you can do a seq=true option (99.1ms)
> >> >>   * test PUT doc has a Location header (skipped)
> >> >>   * test Create a document and save it to the database (116.3ms)
> >> >>   * test Created database has appropriate db info name (99.7ms)
> >> >>   * test PUT error when body not an object (89.5ms)
> >> >>   * test Simple map functions (473.0ms)
> >> >>   * test POST doc response has a Location header (117.1ms)
> >> >>
> >> >> CouchTestCase
> >> >>
> >> >>
> >> >> Finished in 3.3 seconds
> >> >> 24 tests, 0 failures, 3 skipped
> >> >>
> >> >> Randomized with seed 936284
> >>
>

Re: [RFC] On the Testing of CouchDB

Posted by Paul Davis <pa...@gmail.com>.
I went ahead and added a `make elixir` command to the elixir-suite branch.

Of note, my earlier instructions that referenced the elixir_suite
directory are now slightly out of date, as I moved things to test/elixir
because Russell was being slow.

Current rundown is now:

$ # Get Elixir installed as per previous
$ # Build CouchDB as per previous
$ make elixir

The `make elixir` target is currently not integrated with the build
dependencies (i.e., you need to run `make` on your own first), doesn't
yet support running a single test file, and isn't wired into `make
check` itself.
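
For anyone curious, a minimal version of such a target could look
something like the sketch below. This is only an illustration, assuming
the suite now lives in test/elixir and that a dev cluster is already
running in another terminal; it is not necessarily what's on the branch.

```

# Sketch only: assumes test/elixir is the suite location and that a dev
# cluster is already running via ./dev/run -a adm:pass in another terminal.
.PHONY: elixir
elixir:
	@cd test/elixir && mix deps.get && mix test --trace

```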


Re: [RFC] On the Testing of CouchDB

Posted by Paul Davis <pa...@gmail.com>.
For `make check`, it should be fairly straightforward to map the
current approach onto it. I could probably knock that out fairly quickly
if you want me to give it a whirl.
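
Roughly speaking, and assuming a standalone `elixir` target exists that
runs the suite, the mapping could be as small as adding that target to
the existing `check` prerequisites. The snippet below is just a sketch
of that idea, not actual Makefile contents:

```

# Sketch only: append the (assumed) elixir target as another prerequisite
# of the existing check target; GNU make merges prerequisite-only rules.
check: elixir

```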


Re: [RFC] On the Testing of CouchDB

Posted by Russell Branca <ch...@apache.org>.
Yeah, just to reiterate what Paul said, the Elixir dev experience is really
nice and easy to get rolling with. I had no prior hands-on experience with
Elixir and was still able to get things rolling in a few hours.

RE Ben's question about diving in: please do! Just grab one of the unported
JS suites and go to town. I've just been cherry-picking things out of
Paul's branch and we can continue to do the same until we get this more
locked down. My goal with the porting is to keep chugging along and just
get it knocked out, as I really don't think it will be overly onerous to do
so. And if anyone else wants to jump in, there's still a fair number of
tests to port, so just take your pick.

One other thing that needs work is figuring out how to hook all this into
"make check" and so on. I've mostly ignored that so far, since the suite
just points at a running CouchDB instance and can be run directly, but
we'll need to sort that out at some point.


-Russell


Re: [RFC] On the Testing of CouchDB

Posted by Paul Davis <pa...@gmail.com>.
Hello everybody!

I figured I should probably go ahead and chime in, seeing as I've also
been playing around with porting some of the tests in my free time
between ops shifts over the last couple of weeks.

My first impression was that it was ridiculously easy to get involved.
On OS X at least, `brew install elixir` was enough to get a working
Elixir installed (however, if you use kerl or erln8 you'll have to
build an Erlang 20.x VM to use the brew package). I went from not
having Elixir installed to a full port of uuids.js with the config tag
logic written in about two hours one night. So far the Elixir docs
seem very well written and put together. I'd say the worst part of
Elixir so far is that, knowing Erlang, I find myself searching for "How
do I do this Erlang thing in Elixir?", which isn't as bad as it sounds.
The Elixir libraries have certainly had a considerable amount of
thought put into them to make them easy to use and remember. I find it
to be a lot like my experience when learning Python, in that I may have
to Google once and then it's muscle memory, as opposed to Erlang's
standard library, where I'm constantly reading the lists manpage to
remember argument orderings and whether I want the search or find
versions, etc.

Which I guess is a long way of saying I'm rather liking the Elixir
development experience so far.

That said, I'm currently about halfway through porting the
replication.js tests to Elixir. For the most part it's fairly
straightforward. My current approach, as we've done for the other
modules, is to do a direct port. Once that's finished we'll want to
break up that huge module into a series of modules that share a lot of
the utility functions. One of the nice things about moving to Elixir is
that it's got a full-on development story, unlike our current couchjs
approach, which prevents sharing code easily between subsets of tests.
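
To make that a bit more concrete, here's a rough, untested sketch of the
kind of sharing I mean. The real CouchTestCase in the branch may well
look different, and save_doc/2 is just an invented example helper that
leans on the Couch client:

```elixir
defmodule CouchTestCase do
  use ExUnit.CaseTemplate
  import ExUnit.Assertions

  using do
    # Everything quoted here is injected into any module that does
    # `use CouchTestCase`, which is how shared setup and helpers can be
    # pulled into each ported test file.
    quote do
      import CouchTestCase
    end
  end

  # Invented example helper: create a doc via the Couch client and
  # return the parsed response body.
  def save_doc(db_name, doc) do
    resp = Couch.post("/#{db_name}", body: doc)
    assert resp.status_code in [201, 202]
    resp.body
  end
end
```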

For Ben's question on diving in, I'd do just that. I'd say leave a
note here about which module(s) you're going to port so that we're
not duplicating efforts, and then it's basically just a matter of
getting Elixir installed. For that, here's a quick rundown on how I
got it working:

$ brew update
$ brew install elixir
$ # wait for all the things...
$ iex # which fails cause I have an Erlang VM older than 20.0 as a default
$ erln8 --fetch
$ erln8 --build --tag=OTP-20.1.6 --id=20.1.6
$ # wait while erln8 does its thing
$ git clone https://github.com/apache/couchdb
$ cd couchdb
$ ./configure --disable-docs --disable-fauxton --with-curl
$ make
$ git checkout -b elixir-suite-davisp origin/elixir-suite # but use your own name
$ cd elixir_suite # Russell promises to move this to test/elixir eventually... :)
$ mix deps.get
$ # For the moment, in another terminal, run ./dev/run -a adm:pass
$ mix test --trace
$ # For development you can also do this:
$ mix test --trace test/module_i_am_working_on.exs

For the time being, anyone who does any porting work should just let
Russell know and he can pull the changes into the main elixir-suite
branch. For the initial work it might get a bit messy, but we can
always clean up after the fact if we decide this is a direction we'd
like to go for real. To that end, I'd also make sure that we do a
single .js -> .exs port per commit to try and make any future cleanup
work easier.

Also, even if people don't feel like doing any actual porting work, I'd
still be interested in hearing what it's like to just run through their
platform's equivalent of the above steps, and even just initial
impressions from toying around with Elixir. My only experience with
Elixir prior to this was reading through the quick start/tutorial pages
a couple of times to get a feel for the syntax; I hadn't actually even
typed any into an editor till last week.

And that's all I've got for now.

On Thu, Dec 14, 2017 at 11:57 PM, Benjamin Anderson
<ba...@apache.org> wrote:
> Slick! This seems like it's coming together really nicely. Can't argue
> with commits like "Prefer ?w=3 over hacky sleeps"[1] in any case.
>
>> I hope others have similar opinions after diving in!
>
> How should one dive in? Are you looking for others to help out with
> the ports, or just thinking aspirationally about future regular
> contributions to the test suite?
>
> --
> b
>
> [1]: https://github.com/apache/couchdb/commit/5bce2d98a298c25b77d8dcda19deeedb494cc289

Re: [RFC] On the Testing of CouchDB

Posted by Benjamin Anderson <ba...@apache.org>.
Slick! This seems like it's coming together really nicely. Can't argue
with commits like "Prefer ?w=3 over hacky sleeps"[1] in any case.

> I hope others have similar opinions after diving in!

How should one dive in? Are you looking for others to help out with
the ports, or just thinking aspirationally about future regular
contributions to the test suite?

--
b

[1]: https://github.com/apache/couchdb/commit/5bce2d98a298c25b77d8dcda19deeedb494cc289


Re: [RFC] On the Testing of CouchDB

Posted by Russell Branca <ru...@chewbranca.com>.
Hey Jan, thanks for the feedback. Handful of replies inline.

On Sat, Dec 16, 2017 at 3:18 AM Jan Lehnardt <ja...@apache.org> wrote:

> Heyall,
>
> this is really cool. Im very happy to see progress on our test suites
> and this looks like a great way forward.
>
Yeah, me too!


> A little bit more context on the JS suite and its runtime environment
> couchjs: couchjs was never meant to be a general purpose JS execution
> environment. We started using couchjs for the JS test suite because it
> was already available in CouchDB and we were using JS tests as opposed
> to Erlang tests because they were meant as initial documentation of
> CouchDBs API. That was before the wiki was useful and dedicated docs
> were nowhere in sight.
>
> We are even maintaining a custom, C-based HTTP client so we can use
> couchjs for HTTP tests. I’d be very happy to get rid of that code.
>
> Today, there are better JS runtimes that are easy to use, and the JS
> tests changed purpose from being documentation to being the source of
> validation for a lot of API quirks in CouchDB that are not encoded
> anywhere else outside of CouchDB. That’s why it is useful to keep the
> knowledge inside the JS test suite around, but bring it forward to a
> more usable execution environment.
>
> * * *
>
> When we’ve discussed the JS test suite in the past, we talked about
> joining forces with the PouchDB team, since they have a version of our
> JS tests running, with many good additions. Going with Elixir, we’re
> closing the door on that particular collaboration, but given that this
> has been discussed for 2+ years and not much happened, I’m okay with the
> Elixir suite. The pouchdb-server project will be able to use the Elixir
> suite just as we do.
>
> The one thing that would be nice here if it were easy to disable certain
> tests or suites that make no sense in the pouchdb-server environment, so
> they can easily integrate it in their CI.
>

This is actually well supported by Elixir's ExUnit engine, and in fact
we're already basically doing this. The tag system we use for creating
dbs and setting configs is also utilized as a filtering system for
tests. You can see a handful of examples and options by running "mix
help test". In the link below you can see how we're currently skipping
tests tagged with "pending", and similarly, you can set module-level
tags in addition to test-specific tags. As long as we're diligent about
setting test tags, it will be trivial to run a specific subset of tests
with something like "mix test --exclude breaks-pouchdb" or some such.

https://github.com/apache/couchdb/blob/elixir-suite/test/elixir/test/test_helper.exs#L1
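
As a quick sketch (the :pouchdb_incompatible tag name and the module are
made up purely for illustration), the helper config plus a tagged module
could look roughly like this:

```elixir
# test/test_helper.exs: skip :pending plus an illustrative
# :pouchdb_incompatible tag by default
ExUnit.configure(exclude: [pending: true, pouchdb_incompatible: true])
ExUnit.start()

# in some test file: tag the whole module, or individual tests
defmodule SomeCouchOnlyTest do
  use CouchTestCase

  # whole module is skipped unless the tag is explicitly included
  @moduletag :pouchdb_incompatible

  test "a CouchDB-only quirk" do
    assert true
  end
end
```

Then something like "mix test --include pouchdb_incompatible" (or
--only / --exclude) flips that from the command line.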



> * * *
>
> One of the problems with the JS tests is that they predate testing best
> practices. The majority of tests are single functions with sometimes
> hundreds of assertions and changes in configuration and mock data. This
> is not very maintainable. The stats.js tests are split up more
> traditionally, but that effort was never extended to the other test
> files.
>
> I was happy to see that the Elixir port of the basics.js tests took a
> sensible approach breaking things up in the way a modern test suite
> would be done. Advantages include better isolation of specific behaviour
> tests, ease of understanding tests, simpler running of individual tests
> during development, more chances to structure and re-use mock data and
> other boilerplate code, all making any work with the tests more pleasant.
>
> It would be great if we could use this opportunity to apply this across
> all JS test files when we port them to Elixir. It means a little bit
> more work per test file, but I hope with a few more contributors and
> guidelines, this is an easily paralleliseable task, so individual burden
> can be minimised.
>
Yeah, I agree completely. The basics suite was fairly amenable to being
broken up in that manner; however, there are other suites that were not
as easy to break apart like that, for instance the all_docs suite linked
below. My general approach so far has been to break things apart into
isolated tests whenever there's a simple and clear path to do so, and
when there isn't, to do as direct a port of the tests as possible. IMO
we should do a first round of porting the tests over directly, and then
once we've got them all in Elixir we should do some refactoring to
simplify and restructure things as appropriate.

https://github.com/apache/couchdb/blob/elixir-suite/test/elixir/test/all_docs_test.exs



> * * *
>
> I noticed that one of the reduce tests took 30+ seconds to run on my
> machine and I experimented with different cluster configuration values
> and to nobodys surprise, the default of q=8 is the main factor in view
> test execution speed. q=4 takes ~20s, q=2 ~10s and q=1 ~5s. I’m not
> suggesting we set q=1 for all tests since q>1 is a behaviour we would
> want to test as well, but maybe we can set q=2 when running the test
> suite(s) for the time being. Shaving 25s off of a single test will get
> us a long way with all tests ported. What do others think?
>

I'm not sure what's going on here, although Paul ran into similar
situations where tests would run on my machine in under a second but
take significantly longer on his machine. I think we should investigate
this a bit more before taking any major action, but for a lot of tests,
running with a lower Q value would probably be fine. I think it's
important we run with Q >= 2 so we at least exercise the standard code
paths of having to merge over multiple shard ranges for aggregate
operations. That said, it could also be worthwhile to run tests with
Q=1 to exercise the non-merge code paths as well.
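
Purely for illustration (nothing like this is in the branch yet), a
one-off test could also pin its own Q at database creation time via the
query string, assuming the client's put/delete take a query option the
same way get does:

```elixir
test "reduce views against an explicit q=2 database" do
  db_name = "elixir-suite-q2-#{:rand.uniform(1_000_000)}"

  # q (and n) can be passed as query parameters when the database is
  # created, so a single test can pin its shard count without touching
  # cluster-wide config.
  assert Couch.put("/#{db_name}", query: %{q: 2}).status_code in [201, 202]

  # ... load docs and exercise the view code here ...

  assert Couch.delete("/#{db_name}").status_code in [200, 202]
end
```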


>
> @Nick: you introduced the -C / --config-overrides option to dev/run,
> but I could never figure out how to apply it. That would be the easiest
> place to make the cluster config change for the Elixir tests.
>
> `-c q=2` makes the nodes fail to start and I’m not sure how in this case,
> the `[cluster]` section is meant.
>
> * * *
>
> Russell, Paul: do you think it is worth reaching out to the Elixir
> community and ask if they are interested in helping out a little? If
> you think its too early, we can wait with this.
>

Yeah, I think that's a great idea, and it could be an excellent
opportunity to get more folks involved in things. I do think we should
probably figure out our branch/dev strategy prior to doing that, as me
cherry-picking commits will eventually become onerous given enough
branches.


> * * *
>
> Thanks again!
>
> Best
> Jan
> --
>
>
No problem, really happy to see folks excited about this new engine!


-Russell



Re: [RFC] On the Testing of CouchDB

Posted by Garren Smith <ga...@apache.org>.
This is really great. Count me in to help. I'll start with the bulk_docs.js
file.
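
Roughly what I have in mind for a first test there, as an untested
sketch in the same style as the existing ports (the assertions are
illustrative rather than the full bulk_docs.js coverage):

```elixir
  @tag :with_db
  test "_bulk_docs can create several documents in one request", context do
    docs = for i <- 1..5, do: %{_id: "doc-#{i}", value: i}
    resp = Couch.post("/#{context[:db_name]}/_bulk_docs", body: %{docs: docs})
    assert resp.status_code == 201

    # One result row per posted doc, each carrying an id and a rev
    assert length(resp.body) == 5
    assert Enum.all?(resp.body, &(&1["id"] && &1["rev"]))
  end
```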

On Sat, Dec 16, 2017 at 11:29 PM, Jan Lehnardt <ma...@jan.io> wrote:

>
> On 16. Dec 2017, at 18:56, Paul Davis <pa...@gmail.com> wrote:
>
> >> The one thing that would be nice here if it were easy to disable certain
> >> tests or suites that make no sense in the pouchdb-server environment, so
> >> they can easily integrate it in their CI.
> >
> > The cool thing is that Elixir supports this natively in that you can
> > add tags to test to selectively enable/disable test classes so this
> > will just be a matter of letting the pouchdb team disable anything
> > that doesn't make sense for their implementaiton.
> >
> >> It would be great if we could use this opportunity to apply this across
> >> all JS test files when we port them to Elixir. It means a little bit
> >> more work per test file, but I hope with a few more contributors and
> >> guidelines, this is an easily paralleliseable task, so individual burden
> >> can be minimised.
> >
> > My current approach so far is to try and first port the test directly
> > and then afterwards go back through and refactor things to be more
> > traditional. My thinking here was that the initial port could be
> > reviewed alongside the existing JS test to double check that we're not
> > dropping any important tests or assertions along the way before we
> > start moving a lot of code around. That said, Elixir still allows us
> > to break things. So for instance the replication.js tests I've broken
> > up into a number of functions that still follow the same order as the
> > original suite but once the initial port is done it'll be trivial to
> > split that out into a base class and then have each test extend from
> > there. Its also possible to generate tests too so the replication
> > tests that check for all combinations of local/remote source/target
> > pairs end up as separate tests.
> >
> >> I noticed that one of the reduce tests took 30+ seconds to run on my
> >> machine and I experimented with different cluster configuration values
> >> and to nobodys surprise, the default of q=8 is the main factor in view
> >> test execution speed. q=4 takes ~20s, q=2 ~10s and q=1 ~5s. I’m not
> >> suggesting we set q=1 for all tests since q>1 is a behaviour we would
> >> want to test as well, but maybe we can set q=2 when running the test
> >> suite(s) for the time being. Shaving 25s off of a single test will get
> >> us a long way with all tests ported. What do others think?
> >
> > I've noticed some pretty terrible slowness on OS X (which I'm assuming
> > you're running on) and chatting with Russel it appears that on Linux
> > there's a massive speed difference when running tests. I'd very much
> > prefer to keep our tests against a Q=3 cluster.
>
> Thanks for the clarification, you too Russell! Just a nit, our default is
> 8, not 3, but that's still a lot faster than 8 ;)
>
> As for the performance difference: on Darwin, Erlang does a F_FULLFSYNC
> which does some magic to coerce otherwise lying hard drives to flush their
> caches, something that Linux fsync() doesn't do. On spinning disk, this
> meant ~3 file:fsync()/s on Darwin vs. a lot more on Linux. Multiplied by
> n=3 x q=8 them's a lot of F_FULLFSYNC to go around. I don't know about
> SSD's though, so this is somewhat speculative :)
>
> > I'd like to try and
> > dig in a bit to see if we can't figure out where we're having such a
> > dramatic time difference between the two. Hopefully some quick
> > measuring will point us to a knob to adjust to speed things up without
> > sacrificing cluster nodes during the tests.
>
>

Re: [RFC] On the Testing of CouchDB

Posted by Jan Lehnardt <ma...@jan.io>.
On 16. Dec 2017, at 18:56, Paul Davis <pa...@gmail.com> wrote:

>> The one thing that would be nice here if it were easy to disable certain
>> tests or suites that make no sense in the pouchdb-server environment, so
>> they can easily integrate it in their CI.
> 
> The cool thing is that Elixir supports this natively in that you can
> add tags to test to selectively enable/disable test classes so this
> will just be a matter of letting the pouchdb team disable anything
> that doesn't make sense for their implementaiton.
> 
>> It would be great if we could use this opportunity to apply this across
>> all JS test files when we port them to Elixir. It means a little bit
>> more work per test file, but I hope with a few more contributors and
>> guidelines, this is an easily paralleliseable task, so individual burden
>> can be minimised.
> 
> My current approach so far is to try and first port the test directly
> and then afterwards go back through and refactor things to be more
> traditional. My thinking here was that the initial port could be
> reviewed alongside the existing JS test to double check that we're not
> dropping any important tests or assertions along the way before we
> start moving a lot of code around. That said, Elixir still allows us
> to break things. So for instance the replication.js tests I've broken
> up into a number of functions that still follow the same order as the
> original suite but once the initial port is done it'll be trivial to
> split that out into a base class and then have each test extend from
> there. Its also possible to generate tests too so the replication
> tests that check for all combinations of local/remote source/target
> pairs end up as separate tests.
> 
>> I noticed that one of the reduce tests took 30+ seconds to run on my
>> machine and I experimented with different cluster configuration values
>> and to nobodys surprise, the default of q=8 is the main factor in view
>> test execution speed. q=4 takes ~20s, q=2 ~10s and q=1 ~5s. I’m not
>> suggesting we set q=1 for all tests since q>1 is a behaviour we would
>> want to test as well, but maybe we can set q=2 when running the test
>> suite(s) for the time being. Shaving 25s off of a single test will get
>> us a long way with all tests ported. What do others think?
> 
> I've noticed some pretty terrible slowness on OS X (which I'm assuming
> you're running on) and chatting with Russel it appears that on Linux
> there's a massive speed difference when running tests. I'd very much
> prefer to keep our tests against a Q=3 cluster.

Thanks for the clarification, you too Russell! Just a nit: our default is 8, not 3, but 3 is still a lot faster than 8 ;)

As for the performance difference: on Darwin, Erlang does an F_FULLFSYNC, which does some magic to coerce otherwise lying hard drives to flush their caches, something that Linux fsync() doesn't do. On spinning disk, this meant ~3 file:fsync()/s on Darwin vs. a lot more on Linux. Multiplied by n=3 x q=8, them's a lot of F_FULLFSYNC to go around. I don't know about SSDs though, so this is somewhat speculative :)

> I'd like to try and
> dig in a bit to see if we can't figure out where we're having such a
> dramatic time difference between the two. Hopefully some quick
> measuring will point us to a knob to adjust to speed things up without
> sacrificing cluster nodes during the tests.


Re: [RFC] On the Testing of CouchDB

Posted by Paul Davis <pa...@gmail.com>.
> The one thing that would be nice here if it were easy to disable certain
> tests or suites that make no sense in the pouchdb-server environment, so
> they can easily integrate it in their CI.

The cool thing is that Elixir supports this natively: you can add tags
to tests to selectively enable/disable test classes, so this will just
be a matter of letting the pouchdb team disable anything that doesn't
make sense for their implementation.

> It would be great if we could use this opportunity to apply this across
> all JS test files when we port them to Elixir. It means a little bit
> more work per test file, but I hope with a few more contributors and
> guidelines, this is an easily paralleliseable task, so individual burden
> can be minimised.

My current approach so far is to first port the test directly and then
afterwards go back through and refactor things to be more traditional.
My thinking here was that the initial port could be reviewed alongside
the existing JS test to double-check that we're not dropping any
important tests or assertions along the way before we start moving a
lot of code around. That said, Elixir still allows us to break things
up. For instance, I've broken the replication.js tests up into a number
of functions that still follow the same order as the original suite,
but once the initial port is done it'll be trivial to split that out
into a base class and then have each test extend from there. It's also
possible to generate tests, so the replication tests that check all
combinations of local/remote source/target pairs can end up as separate
tests.
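
As a rough sketch of what I mean by generating tests, where
run_replication/2 is just a stand-in for whatever helper the port ends
up with:

```elixir
defmodule ReplicationCombosTest do
  use CouchTestCase

  # Stand-in for the real helper; it would set up the databases, trigger
  # the replication and return true once the target matches the source.
  defp run_replication(_src, _tgt), do: true

  # One generated test per source/target combination.
  for src <- [:local, :remote], tgt <- [:local, :remote] do
    test "replication #{src} -> #{tgt}" do
      assert run_replication(unquote(src), unquote(tgt))
    end
  end
end
```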

> I noticed that one of the reduce tests took 30+ seconds to run on my
> machine and I experimented with different cluster configuration values
> and to nobodys surprise, the default of q=8 is the main factor in view
> test execution speed. q=4 takes ~20s, q=2 ~10s and q=1 ~5s. I’m not
> suggesting we set q=1 for all tests since q>1 is a behaviour we would
> want to test as well, but maybe we can set q=2 when running the test
> suite(s) for the time being. Shaving 25s off of a single test will get
> us a long way with all tests ported. What do others think?

I've noticed some pretty terrible slowness on OS X (which I'm assuming
you're running on), and chatting with Russell it appears that on Linux
there's a massive speed difference when running tests. I'd very much
prefer to keep our tests against a Q=3 cluster. I'd like to try and
dig in a bit to see if we can't figure out where we're having such a
dramatic time difference between the two. Hopefully some quick
measuring will point us to a knob to adjust to speed things up without
sacrificing cluster nodes during the tests.

Re: [RFC] On the Testing of CouchDB

Posted by Jan Lehnardt <ja...@apache.org>.
Heyall,

this is really cool. I'm very happy to see progress on our test suites,
and this looks like a great way forward.

A little bit more context on the JS suite and its runtime environment,
couchjs: couchjs was never meant to be a general-purpose JS execution
environment. We started using couchjs for the JS test suite because it
was already available in CouchDB, and we were using JS tests as opposed
to Erlang tests because they were meant as initial documentation of
CouchDB's API. That was before the wiki was useful and dedicated docs
were nowhere in sight.

We are even maintaining a custom, C-based HTTP client so we can use
couchjs for HTTP tests. I’d be very happy to get rid of that code.

Today, there are better JS runtimes that are easy to use, and the JS
tests changed purpose from being documentation to being the source of
validation for a lot of API quirks in CouchDB that are not encoded
anywhere else outside of CouchDB. That’s why it is useful to keep the
knowledge inside the JS test suite around, but bring it forward to a
more usable execution environment.

* * *

When we’ve discussed the JS test suite in the past, we talked about
joining forces with the PouchDB team, since they have a version of our
JS tests running, with many good additions. Going with Elixir, we’re
closing the door on that particular collaboration, but given that this
has been discussed for 2+ years and not much happened, I’m okay with the
Elixir suite. The pouchdb-server project will be able to use the Elixir
suite just as we do.

The one thing that would be nice here is if it were easy to disable
certain tests or suites that make no sense in the pouchdb-server
environment, so they can easily integrate it into their CI.

* * *

One of the problems with the JS tests is that they predate testing best
practices. The majority of tests are single functions with sometimes
hundreds of assertions and changes in configuration and mock data. This
is not very maintainable. The stats.js tests are split up more
traditionally, but that effort was never extended to the other test
files.

I was happy to see that the Elixir port of the basics.js tests took a
sensible approach breaking things up in the way a modern test suite
would be done. Advantages include better isolation of specific behaviour
tests, ease of understanding tests, simpler running of individual tests
during development, more chances to structure and re-use mock data and
other boilerplate code, all making any work with the tests more pleasant.

It would be great if we could use this opportunity to apply this across
all JS test files when we port them to Elixir. It means a little bit
more work per test file, but I hope with a few more contributors and
guidelines, this is an easily paralleliseable task, so individual burden
can be minimised.


* * *

I noticed that one of the reduce tests took 30+ seconds to run on my
machine, and I experimented with different cluster configuration values;
to nobody's surprise, the default of q=8 is the main factor in view
test execution speed. q=4 takes ~20s, q=2 ~10s and q=1 ~5s. I’m not
suggesting we set q=1 for all tests since q>1 is a behaviour we would
want to test as well, but maybe we can set q=2 when running the test
suite(s) for the time being. Shaving 25s off of a single test will get
us a long way with all tests ported. What do others think?

@Nick: you introduced the -C / --config-overrides option to dev/run,
but I could never figure out how to apply it. That would be the easiest
place to make the cluster config change for the Elixir tests.

`-c q=2` makes the nodes fail to start, and I'm not sure how the
`[cluster]` section is meant to be specified in this case.

* * *

Russell, Paul: do you think it is worth reaching out to the Elixir
community and asking if they are interested in helping out a little? If
you think it's too early, we can wait on this.

* * *

Thanks again!

Best
Jan
-- 
Professional Support for Apache CouchDB:
https://neighbourhood.ie/couchdb-support/